Smart Sensor Technologies

The Texas A&M team has engaged in significant sensor and sensor fusion research over the past few years [patents and publications]. One focus of this work has been to address the demands of real-time, high-resolution, and high-reliability systems; we seek optimal integrated designs of advanced sensors, fusion algorithms, and computing architectures to solve a particular set of problems. The integration of sensor design, sensor fusion algorithms, and advanced computing yields a class of embedded systems we refer to as smart sensors.

As a specific example, we consider the VisNav sensor, which we developed to enable autonomous aerial refueling; see Figure D (on the VisNav page). Over recent years, we have developed this concept from a purely paper design through analysis, computation, and experiment, followed by a sequence of successful prototypes that are presently being integrated for flight experiments [the first time two fully autonomous UAVs will hook up in an aerial refueling flight mode]. This technology has been licensed to three industrial partners who are now leading this effort. These partners have teamed with our faculty and with each other to advance the technology readiness of VisNav, integrate it into an advanced adaptive flight control system, and bring this autonomous aerial refueling technology to maturity for the flight tests scheduled in the second quarter of 2007.

We briefly review the macroscopic features of the sensor concept and the sensor fusion algorithm architecture, which holds the promise of enabling a disruptive technology: true 24/7 persistence for UAV missions. Further, we envision UAVs with aerial refueling capability as one set of elements in a system-of-systems, net-centric approach to future combat systems. Among the high-end elements could be the F-35, airborne radar and other high-end air vehicles, space assets, and a constellation of communication assets. For details of the VisNav sensor system, see the VisNav page.

Sensor Fusion Algorithms: Architectures

Our VisNav-based sensor fusion scheme is briefly described here. A four-element cascade of fusion filters operates on the measurements in real time, as described below. These filters are all implemented in the same processor that is embedded with the VisNav sensor itself.

Filter 1.1 IR Energy Discrimination (output @ 1000 Hz) 

Filter 1.2 Energy Centroid (line of sight) Estimation (output @ 100 Hz)

Filter 1.3 Geometric 6 DOF Relative Pose Estimation (output @ 100 Hz)

Figure E: VisNav Sensor Model

Figure F: VisNav Sensor and Kalman Filter Performance 
Filter 1.4 Dynamic Extended Kalman Filter (output @ 100 Hz) Given additional kinematic measurements of linear and angular motion and/or a dynamic model, the nonlinear EKF algorithm filters the geometric 6 DOF position and orientation estimates output from Filter 1.3 to obtain the best estimates of the relative 6 DOF coordinates and their rates for use in control algorithms. Convergence requires only a few seconds; typical position errors are 0.01 m and typical angular errors are 0.05 degrees for a separation range of 10 m or less.
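The role of Filter 1.4 can be illustrated with a minimal sketch. The block below is a one-axis linear Kalman filter with assumed noise levels, standing in for the full nonlinear 6 DOF EKF; the state, model, and tuning values are illustrative and are not those of the flight code.

```python
import numpy as np

# Minimal one-axis sketch of the Filter 1.4 idea: fuse noisy geometric
# position estimates (from Filter 1.3) into smoothed position + rate.
# The real VisNav EKF is nonlinear and 6 DOF; all values here are
# illustrative assumptions.

dt = 0.01                      # 100 Hz update rate
F = np.array([[1.0, dt],       # constant-velocity state transition
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])     # position is measured, rate is inferred
Q = np.diag([1e-6, 1e-4])      # process noise (assumed)
R = np.array([[1e-2]])         # measurement noise (assumed)

def kf_step(x, P, z):
    """One predict/update cycle of the (here linear) filter."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate convergence against noisy measurements of a target at 10 m
rng = np.random.default_rng(0)
x = np.array([[0.0], [0.0]])
P = np.eye(2)
for _ in range(300):                     # ~3 s of data at 100 Hz
    z = np.array([[10.0 + 0.1 * rng.standard_normal()]])
    x, P = kf_step(x, P, z)

print(f"position estimate after 3 s: {x[0, 0]:.3f} m")
```

Even this toy version shows the behavior described above: convergence within a few seconds, with the rate state estimated despite only position being measured.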
This sensor system, unlike a differential GPS system (which is subject to shadowing and multipath issues), has its best performance in the end game, where accuracy, bandwidth, and reliability are most needed. Since the bandwidth of the controller seldom needs to exceed 25 Hz for aerial refueling, we have found that 100 Hz updates with a latency of 0.01 s are more than sufficient for VisNav to serve as the primary relative navigation sensor. The actual bandwidth required depends upon the vehicles and the disturbance spectrum adopted as the nominal operational constraint.
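The latency figure can be put in context with a short calculation: a pure transport delay T contributes 360·f·T degrees of phase lag at frequency f without attenuating magnitude. The sketch below tabulates that lag for the quoted 0.01 s latency; the crossover frequencies shown are illustrative assumptions, not values from the VisNav flight-control design.

```python
# Phase-lag budget of a pure sensor delay.  The delay value matches the
# 0.01 s latency quoted above; the frequency list is illustrative.

T = 0.01  # sensor latency, seconds

def delay_phase_lag_deg(f_hz: float, delay_s: float) -> float:
    """Phase lag (degrees) of the pure delay e^{-sT} at frequency f."""
    return 360.0 * f_hz * delay_s

for f in (1.0, 2.5, 5.0, 10.0, 25.0):
    print(f"{f:5.1f} Hz crossover: {delay_phase_lag_deg(f, T):5.1f} deg of lag")
```

This is why the disturbance spectrum matters: the phase cost of a fixed latency grows linearly with the loop bandwidth actually required.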

Figure G: Cascade/Parallel Sensor Fusion Architecture

With reference to Figure G, we discuss a Cascade/Parallel Fusion Architecture (CPFA). Note that the VisNav sensor fusion algorithm can be represented by the leftmost set of cascaded filters, and that each cascade can operate in parallel with other cascades of filters dedicated to other sensed information. The details can be pursued in the preprint of our preliminary research [18].

The filter cascade may, in some cases, be implemented in an embedded processor for particular sensors. This is the approach adopted for VisNav, and in fact this architecture captures many of the salient features of the sensor fusion approach implemented on the F-35. While this architecture is heuristic and “makes sense,” we have initiated a study of the optimality and computational efficiency of this class of data fusion architectures. In this study [68], we first consider the case in which all filters and models are linear, and we also study the most general case in which all of the filters are nonlinear and asynchronous. Under ideal, linear circumstances, we have proven that the optimal result is realized if all of the filters in the cascades and the main fusion filter are variants of the Kalman filter. Perhaps not surprisingly, we have also found that the observability and convergence issues are complicated functions of the particular system and the sensors utilized. The observability of the cascaded filters is guaranteed under more restrictive conditions than if all measurements are processed in a single EKF algorithm. Thus each cascaded estimation process must be evaluated to ensure this decomposition is robust for particular dynamical systems and sensor subsystems. Of primary concern is the effect of latency on the stability of the controlled system. We have initiated research [6, 3, 12] on the effects of latency on closed-loop stability, and anticipate coupling these studies with the latencies and state estimation errors produced by various estimation algorithms and fusion architectures.
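The main fusion step of a CPFA can be sketched, under simplifying assumptions, as inverse-covariance (information) weighting of independent local estimates produced by the parallel cascades. The cascade labels and noise levels below are hypothetical and are not taken from the cited study.

```python
import numpy as np

# Toy sketch of the CPFA fusion step: two independent filter cascades
# each produce a local estimate (x, P) of the same state, and a main
# fusion filter combines them by information weighting.  This assumes
# independent, unbiased local estimates; all numbers are illustrative.

def fuse(estimates):
    """Information-weighted fusion of independent (x, P) pairs."""
    info = sum(np.linalg.inv(P) for _, P in estimates)   # total information
    vec = sum(np.linalg.inv(P) @ x for x, P in estimates)
    P_fused = np.linalg.inv(info)
    return P_fused @ vec, P_fused

# Cascade 1 (e.g. a VisNav-like cascade): accurate relative estimate
x1, P1 = np.array([10.02, 0.01]), np.diag([1e-4, 1e-4])
# Cascade 2 (e.g. an inertial cascade): noisier estimate of the same state
x2, P2 = np.array([9.80, 0.05]), np.diag([1e-2, 1e-2])

x_fused, P_fused = fuse([(x1, P1), (x2, P2)])
print(x_fused)   # weighted toward the lower-covariance cascade
```

The fused covariance is smaller than that of either cascade alone, which is the payoff of the parallel structure; the observability caveats discussed above determine whether each cascade's local (x, P) is actually well-defined in the first place.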

Patents:


Publications:

