Smart Sensor Technologies

The Texas A&M team has engaged in significant sensor and sensor fusion research over the past few years [see patents and publications below]. One focus of this work has been to address the demands of real-time, high-resolution, high-reliability systems: we seek optimal integrated designs of advanced sensors, fusion algorithms, and computing architectures to solve a particular set of problems. The integration of sensor design, sensor fusion algorithms, and advanced computing results in a class of embedded systems we refer to as smart sensors.

As a specific example, consider the VisNav sensor, which we developed to enable autonomous aerial refueling (see Figure D on the VisNav page). In recent years we have carried this concept from a purely paper design through analysis, computation, and experiment, followed by a sequence of successful prototypes that are presently being integrated for flight experiments [the first time two fully autonomous UAVs will hook up in an aerial refueling flight mode]. This technology has been licensed to three industrial partners who are now leading this effort:

  1. Sargent Fletcher
  2. Cobham, PLC
  3. StarVision Technologies

These three industrial partners have teamed with our faculty and with each other to advance the technology readiness of VisNav, integrate it into an advanced adaptive flight control system, and bring this autonomous aerial refueling technology to maturity for the flight tests scheduled in the second quarter of 2007. We briefly review the macroscopic features of the sensor concept and the sensor fusion algorithm architecture, which hold the promise of enabling a disruptive technology: true 24-7 persistence for UAV missions.

Further, we envision UAVs with aerial refueling capability as one set of elements in a system-of-systems, net-centric approach to future combat systems. Among the high-end elements could be the F-35, airborne radar and other high-end air vehicles, space assets, and a constellation of communication assets.

For details of the VisNav Sensor System, click here >>

Sensor Fusion Algorithms: Architectures

Our VisNav-based sensor fusion scheme is briefly described here. A four-element cascade of fusion filters operates on the measurements in real time, as described below. These filters are all implemented in the same processor that is embedded with the VisNav sensor itself.

Filter 1.1 IR Energy Discrimination (output @ 1000 Hz)
The VisNav analog position-sensitive photodetector signals are passed through a matched analog filter to extract only the incident energy having the prescribed waveform of the active beacon; this allows near-certain discrimination in a noisy energy environment. The beacons are commanded individually by the VisNav sensor’s dedicated processor (on the receiver aircraft) to adjust the transmitted energy and optimize the received signal-to-noise ratio. The dwell time required to reliably measure the line-of-sight direction to each beacon is 0.001 sec.
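To make the matched-filter discrimination concrete, the following is a minimal sketch in Python; the sample rate, beacon modulation frequency, and noise levels are assumptions made for illustration and do not describe the actual analog hardware.

```python
# Illustrative sketch of the Filter 1.1 idea (not the flight implementation):
# correlate one detector channel against a unit-energy template of the beacon's
# prescribed waveform over a single 0.001 sec dwell. The 100 kHz sample rate and
# 10 kHz modulation frequency below are assumptions for this example.
import numpy as np

FS = 100_000          # assumed sample rate, Hz
DWELL = 0.001         # dwell time per beacon, sec (from the text)

def beacon_template(fs=FS, dwell=DWELL, f_mod=10_000.0):
    """Assumed beacon waveform: a modulated tone over one dwell period."""
    t = np.arange(int(fs * dwell)) / fs
    w = np.sin(2.0 * np.pi * f_mod * t)
    return w / np.linalg.norm(w)          # unit-energy template

def beacon_energy(channel_samples, template):
    """Correlate a detector channel with the template; the peak correlation is
    the incident energy attributable to the commanded beacon."""
    corr = np.correlate(channel_samples, template, mode="valid")
    return float(np.max(np.abs(corr)))

# Example: a channel containing the beacon signature scores far higher than
# one containing only broadband ambient noise.
rng = np.random.default_rng(0)
t = np.arange(int(FS * DWELL)) / FS
tmpl = beacon_template()
with_beacon = 0.5 * np.sin(2.0 * np.pi * 10_000.0 * t) + 0.2 * rng.standard_normal(t.size)
noise_only = 0.2 * rng.standard_normal(t.size)
print(beacon_energy(with_beacon, tmpl), ">", beacon_energy(noise_only, tmpl))
```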

Filter 1.2 Energy Centroid (Line of Sight) Estimation (output @ 100 Hz)
The VisNav photodetector analog signals from Filter 1.1 are over-sampled at 1000 Hz and filtered both to minimize estimation error and to estimate the covariance of the noise on the centroid measurements that define the line-of-sight measurements to each beacon.
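As an illustration of how such over-sampling can be reduced to a 100 Hz output with an accompanying noise covariance, the short sketch below averages a ten-sample window of centroid readings; the window length and the within-window independence assumption are ours, not a statement of the actual filter design.

```python
# Hedged sketch: reduce a 0.01 s window of 1000 Hz centroid samples to one
# 100 Hz centroid estimate plus the covariance of its noise.
import numpy as np

def centroid_estimate(raw_xy):
    """raw_xy: (10, 2) array of detector-plane centroid samples from one
    0.01 s window. Returns the averaged centroid and its 2x2 covariance."""
    mean_xy = raw_xy.mean(axis=0)
    # Covariance of the mean, assuming independent samples within the window.
    cov_xy = np.cov(raw_xy, rowvar=False) / raw_xy.shape[0]
    return mean_xy, cov_xy
```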

Filter 1.3 Geometric 6 DOF Relative Pose Estimation (output @ 100 Hz)
Given a redundant set of four or more line-of-sight measurements from Filter 1.2 to known beacons, the nonlinear mapping from object to image space for the VisNav optics can be inverted uniquely by least-squares differential correction [2] (shown in [5,6,7] to converge robustly), yielding the instantaneous geometric best estimate of the six-degree-of-freedom relative position and orientation of the two aircraft (consistent with the measured vectors from the VisNav sensor on the receiver aircraft to the beacons on the target aircraft). The least-squares estimate of the six relative position and orientation coordinates and the associated 6x6 covariance matrix are output 100 times/sec, with a computational latency of less than 0.01 sec on a Pentium III class processor.
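A sketch of the least-squares differential correction (Gauss-Newton) step for a generic measurement model h(x) is shown below; the actual VisNav object-to-image mapping would replace h, and the finite-difference Jacobian is used purely for illustration.

```python
# Hedged sketch of Filter 1.3's least-squares differential correction for a
# generic measurement model. x holds the 6 DOF relative pose, h(x) maps it to
# the stacked image-space measurements of the known beacons, and R is the
# measurement covariance supplied by Filter 1.2.
import numpy as np

def differential_correction(h, x0, y_meas, R, iters=10, tol=1e-10, eps=1e-6):
    x = np.asarray(x0, dtype=float)
    W = np.linalg.inv(R)                          # measurement weights
    for _ in range(iters):
        r = y_meas - h(x)                         # measurement residual
        # Finite-difference Jacobian of h about the current estimate.
        H = np.empty((r.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            H[:, j] = (h(x + dx) - h(x - dx)) / (2.0 * eps)
        A = H.T @ W @ H                           # normal-equation matrix
        dxhat = np.linalg.solve(A, H.T @ W @ r)   # differential correction
        x = x + dxhat
        if np.linalg.norm(dxhat) < tol:
            break
    P = np.linalg.inv(A)                          # 6x6 covariance of the estimate
    return x, P
```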

Figure E: VisNav Sensor Model

Figure F: VisNav Sensor and Kalman Filter Performance

Filter 1.4 Dynamic Extended Kalman Filter (output @ 100 Hz)

Given additional kinematic measurements of linear and angular motion and/or a dynamic model, the nonlinear EKF algorithm filters the geometric 6 DOF position and orientation estimates output from Filter 1.3 to obtain the best estimates of the relative 6 DOF coordinates and their rates, for use in control algorithms. Convergence requires only a few seconds; typical position errors are 0.01 m and typical angular errors are 0.05 degrees at a separation range of 10 m or less.
This sensor system, unlike a differential GPS system (which is subject to shadowing and multi-path issues), has its best performance in the end game, where accuracy, bandwidth, and reliability are most needed. Since the bandwidth of the controller seldom needs to exceed 25 Hz for aerial refueling, we have found that 100 Hz updates with a latency of 0.01 sec are more than sufficient for VisNav to serve as the primary relative navigation sensor. The actual bandwidth required depends on the vehicles and on the disturbance spectrum adopted as the nominal operational constraint.
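As an illustration only, the sketch below performs one predict/update cycle of a 12-state filter that treats the Filter 1.3 pose estimate and its 6x6 covariance as the measurement; the constant-rate kinematic model and the process-noise matrix are placeholders for the dynamic models actually used.

```python
# Hedged sketch of the Filter 1.4 idea: a 100 Hz Kalman update over a 12-state
# vector [6 pose coordinates, 6 rates]. The constant-rate kinematics below is an
# assumed stand-in; a full EKF would linearize the nonlinear dynamic model here.
import numpy as np

DT = 0.01                                         # 100 Hz update interval, sec

def pose_rate_filter_step(x, P, z_pose, R_pose, Q):
    """x: (12,) state, P: 12x12 covariance, z_pose: (6,) pose from Filter 1.3,
    R_pose: its 6x6 covariance, Q: assumed 12x12 process noise."""
    # Predict: pose advances by rate * dt, rates held constant.
    F = np.eye(12)
    F[:6, 6:] = DT * np.eye(6)
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: the measurement observes the pose block of the state directly.
    H = np.hstack([np.eye(6), np.zeros((6, 6))])
    S = H @ P_pred @ H.T + R_pose
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z_pose - H @ x_pred)
    P_new = (np.eye(12) - K @ H) @ P_pred
    return x_new, P_new
```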
Sensor Fusion Architecture
Figure G: Cascade/Parallel Sensor Fusion Architecture
With reference to Figure G, we discuss a Cascade/Parallel Fusion Architecture (CPFA). The VisNav sensor fusion algorithm can be represented by the leftmost set of cascaded filters, and each cascade can operate in parallel with other cascades of filters dedicated to other sensed information. The details can be pursued in the preprint of our preliminary research [18].
The filter cascade may, in some cases, be implemented in an embedded processor for particular sensors. This is the approach adopted for VisNav, and in fact this architecture captures many of the salient features of the sensor fusion approach implemented on the F-35.

While the architecture is heuristic and “makes sense,” we have initiated a study of the optimality and computational efficiency of this class of data fusion architectures. In this study [18], we first consider the case in which all filters and models are linear, and we also study the most general case in which all of the filters are nonlinear and asynchronous. Under ideal, linear circumstances, we have proven that the optimal result is realized if all of the filters in the cascades and the main fusion filter are variants of the Kalman filter. Perhaps not surprisingly, we have also found that observability and convergence are complicated functions of the particular system and the sensors utilized. The observability of the cascaded filters is guaranteed under more restrictive conditions than if all measurements are processed in a single EKF algorithm; thus each cascaded estimation process must be evaluated to ensure that the decomposition is robust for the particular dynamical system and sensor sub-systems. Of primary concern is the effect of latency on the stability of the controlled system. We have initiated research [6, 3, 12] on the effects of latency on closed-loop stability, and we anticipate coupling these studies with the latencies and state estimation errors produced by the various estimation algorithms and fusion architectures.
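For illustration, the fragment below shows the simplest possible form of the main fusion step: information-weighted (inverse-covariance) combination of the local estimates delivered by the parallel cascades, under the strong assumption that their errors are independent and expressed in a common state space; the correlated and asynchronous cases analyzed in the study cited above require considerably more care.

```python
# Hedged sketch of a main fusion filter combining local cascade outputs by
# information weighting. Independence of the local errors is assumed here.
import numpy as np

def fuse(estimates):
    """estimates: list of (x_i, P_i) pairs from the parallel cascades, all in
    the same state space. Returns the fused estimate and its covariance."""
    info = sum(np.linalg.inv(P) for _, P in estimates)          # total information
    info_state = sum(np.linalg.inv(P) @ x for x, P in estimates)
    P_fused = np.linalg.inv(info)
    return P_fused @ info_state, P_fused
```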

Patents:

  1. J. L. Junkins, H. Schaub, D. Hughes, “Noncontact Position and Orientation Measurement System and Method,” U.S. Patent No. 6,266,142 B1, July 24, 2001.
  2. Garcia, F. L., Jr., Junkins, J. L., and Valasek, J., "Method and Apparatus for Hookup of Unmanned/Manned Multi-purpose (HUM) Air Vehicles," U.S. Patent No. 7,152,828.

Publications :

  1. Singla, P. and Junkins, J. L., Adaptive Multiresolution Modeling, Estimation and Control of High Dimensioned Nonlinear Systems, CRC Press, to appear 2007.
  2. Crassidis, J. L. and Junkins, J. L., Optimal Estimation of Dynamic Systems, 591 pp., CRC Press, Boca Raton, FL, 2004.
  3. Doebbler, James, Monda, Mark, Valasek, John, and Schaub, Hanspeter, "Boom and Receptacle Autonomous Air Refueling Using a Visual Pressure Snake Optical Sensor," Journal of Guidance, Control, and Dynamics, to appear.
  4. S. G. Kim, J. L. Crassidis, Y. Cheng, A. M. Fosbury, and John L. Junkins, "Kalman filtering for relative spacecraft attitude and position estimation," Journal of Guidance Control and Dynamics, 30:133–143, January 2007.
  5. Tandale, Monish D., Bowers, Roshawn, and Valasek, John, “Robust Trajectory Tracking Controller for Vision Based Autonomous Aerial Refueling of Unmanned Aircraft,” Journal of Guidance, Control, and Dynamics, Volume 29, Number 4, July-August 2006, pp. 846-857.
  6. Valasek, J., Gunnam, K., Kimmett, J., Tandale, M. D., Junkins, J. L., and Hughes, D., “Vision-Based Sensor and Navigation System for Autonomous Air Refueling,” The Journal of Guidance Control and Dynamics, Vol. 28, April 2005, pp. 979–989.
  7. Gunnam, K., Hughes, D., Junkins, J. L., and Khetarnaraz, N., “A Vision Based DSP Embedded Navigation Sensor,” IEEE Journal of Sensors, Vol. 2, October 2002, pp. 428–442.
  8. Chakravorty, Suman and Junkins, John L., "Intelligent Path Planning in Unknown Environments with Vision Like Sensors," Automatica, under review.
  9. Chakravorty, S. and Junkins, J. L., “Intelligent Exploration of Unknown Environments with Vision Like Sensors,” International Conference on Advanced Intelligent Mechatronics, Monterey, California, 24-28 July 2005.
  10. Singla, P., Subbarao, K., Hughes, D., and Junkins, J. L., “Structured Model Reference Adaptive Control For Vision Based Spacecraft Rendezvous And Docking,” 13th Annual AAS/AIAA Space Flight Mechanics Meeting, Ponce, Puerto Rico, 9-13 February 2003.
  11. Gunnam, K., Hughes, D., Junkins, J. L., and Khetarnaraz, N., “A Vision Based DSP Embedded Navigation Sensor,” International Conference on Signal Processing, ICSP, 2002, Beijing, China, Vol. 2, August, 26-30, 2002.
  12. Kimmett, Jennifer, Valasek, John, and Junkins, John L., “Vision Based Controller for Autonomous Aerial Refueling,” CCA02-CCAREG-1126, Proceedings of the IEEE Control Systems Society Conference on Control Applications, Glasgow, Scotland, 18-20 September 2002.
  13. Kimmett, Jennifer, Valasek, John, and Junkins, John L., “Autonomous Aerial Refueling Utilizing a Vision Based Navigation System ,” AIAA-2002-4469, Proceedings of the AIAA Guidance Navigation and Control Conference, Monterey, California, 5-8 August 2002.
  14. Valasek, John, Kimmett, Jennifer, Hughes, Declan, Gunnam, Kiran, and Junkins, John L., "Vision Based Sensor and Navigation System for Autonomous Aerial Refueling," AIAA-2002-3441, 1st AIAA Unmanned Aerospace Vehicles, Systems, Technologies, and Operations Conference, Portsmouth, VA, 20-22 May 2002.
  15. Valasek, John and Junkins, John L., “Intelligent Control Systems and Vision Based Navigation to Enable Autonomous Aerial Refueling of UAVs,” 04-012, Proceedings of the 27th Annual American Astronautical Society Guidance and Control Conference, Breckenridge, CO, 4-8 February 2004
  16. Valasek, John, "High Fidelity Flight Simulation of Autonomous Air Refueling Using a Vision Based Sensor," Final Technical Report, StarVision Technologies Incorporated, December 2004.
  17. Valasek, John, Junkins, John, Lund, David W., and Ward, Donald T., "Autonomous Aerial Refueling Demonstration: Phase 0," Final Technical Report, 32525-20270 AE, 30 June 2004.
  18. Junkins, J. L., Majji, M., and Davis, J., "Hierarchical Multi-rate Measurement Fusion for Estimation of Dynamical Systems," preprint, under review.
© Aerospace Engineering, Texas A&M University