Sensor Fusion: A Review of Methods and Applications

Man Lok Fung, Michael Z. Q. Chen, Yong Hua Chen


1. Department of Mechanical Engineering, University of Hong Kong
E-mail: [email protected]

Abstract: This paper presents a brief overview of recent developments in sensor fusion across various applications, and examines the challenges and capabilities of sensor fusion. The algorithms typically employed are covered, to convey the complexity of their use in different scenarios.
Key Words: Multi-sensor fusion; fusion algorithm; fusion applications

1. INTRODUCTION

Sensor fusion has been a fast-developing area of research in recent years. With the increasing availability of sensors, in both number and type, the need to manage the growing quantity of information has created a need to fuse such data into a form humans can perceive. The ability to combine and integrate information enables new capabilities in myriad areas. Some examples where sensor fusion is now widely engaged, in different forms, are automotive automation, mobile robot navigation, and target tracking.

Through the integration of multiple sensors, certain advantages can be achieved compared with a single input. Enhanced reliability, extended parameter coverage, and improved resolution are all desirable in any system. While sensor fusion research has improved by leaps and bounds in recent years, we are certainly still far from achieving the competence of the human mind in analyzing different data simultaneously. Because multiple sources and types of information are fed in continuously, various problems arise, such as data association, sensor uncertainty, and management of data. In most cases, these are associated with the inherent ambiguity of each sensor, with device noise, and also with ambiguities in the environment being measured. A robust sensor fusion system should be able to handle such uncertainties and, in the end, provide consistent results about the environment.

2. SENSORS, ADVANTAGES/PROBLEMS

Sensors are used to detect certain attributes or changes of the environment, and provide feedback to the system based on their detection. Existing sensors include cameras, rangefinders, sonar, and ultrasonic sensors. In many cases, such as mobile devices, they may include accelerometers, magnetometers, ambient air temperature sensors, pressure sensors, gyroscopes, and proximity sensors. The classification of sensors usually depends on the purpose, and different criteria can be designated. Some distinctions typically employed to define sensors are active versus passive, absolute versus relative, as well as the stimulus each sensor responds to.

2.1 Characteristics of Sensors

Most sensors do not generate a signal directly from an external phenomenon, but via several conversion steps. Thus, the output read by the user may deviate from the actual input, and these performance-related parameters, or specifications, provide information about the deviation from ideal behavior. There are static characteristics such as accuracy, precision, resolution, and sensitivity; typically, these can be easily managed before the fusion process. Dynamic characteristics, however, vary with changes of input. The speed and frequency of response, settling time, and lag of sensors are all inevitable, and these lead to several of the inherent errors faced by sensor fusion.

Most sensors are not ideal, and deviations may enter the required information. Some can be assumed to be caused by random noise, which requires signal processing to reduce the error. The other case is a systematic error correlated with time; this can be improved through a defined filter if the error is known.

2.2 Advantages of Multi-sensor Fusion

In general, multi-sensor fusion provides significant advantages compared to using only a single data source. The improvement in performance can be summarized in four general areas [1]:

Representation. Information obtained through the fusion process has an abstract level, or granularity, higher than that of each original input data set. This allows for richer semantics and higher resolution in the data compared to each initial source of information.

Certainty. We expect the probability of the data to increase after the fusion process, raising the confidence in the data in use. The improved signal-to-noise ratio is also part of the reason for better confidence in the fused data. These gains are associated with redundant information from a group of sensors surveying the same environment. The reliability of the system is thus also improved in cases of sensor error or failure.

Accuracy. If the data is at first noisy or erroneous, the fusion process should try to reduce or eliminate the noise and errors. Usually, the gain in certainty and the gain in accuracy are correlated. The accuracy gain can also be in timing, from the parallel processing of different information from multiple sensors.
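As a minimal illustration of the certainty and accuracy gains from redundant sensors (a sketch, not taken from the paper), two noisy measurements of the same quantity can be fused by inverse-variance weighting, which always yields an estimate with lower variance than either input:

```python
# Inverse-variance weighted fusion of two noisy measurements of the
# same quantity. The fused variance 1/(1/v1 + 1/v2) is smaller than
# either input variance, illustrating the "certainty" gain of fusion.
def fuse(z1, var1, z2, var2):
    w1, w2 = 1.0 / var1, 1.0 / var2
    z = (w1 * z1 + w2 * z2) / (w1 + w2)   # fused estimate
    var = 1.0 / (w1 + w2)                 # fused variance
    return z, var

# The fused estimate is pulled toward the more certain sensor,
# and the fused variance falls below both input variances.
z, var = fuse(10.2, 4.0, 9.8, 1.0)
```

The same weighting reappears inside the Kalman filter gain discussed in Section 3.1, where it balances prediction against observation.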

978-1-5090-4657-7/17/$31.00 ©2017 IEEE    3853
Completeness. Bringing new information into the current knowledge of the environment allows for a more thorough view. Even if individual sensors only provide information that is independent of the other sensors, bringing them into a coherent space gives an overarching view of the whole. Usually, if the information is redundant and concordant, the accuracy will improve. The discriminating power of the information is also increased with more comprehensive coverage from multiple sensors.

The number of sensors employed is also a factor in the cost analysis of whether a multi-sensor system is better than a single-sensor system [2]. A criterion has to be set up to assess the reliability of the whole system. However, as different applications require different numbers and types of sensors, it is difficult to define an overarching optimal number of sensors for any given system.

2.3 Possible Problems and Issues

Certainly, sensor fusion comes with its own inherent problems. Several key issues have to be considered for sensor fusion techniques [3, 4]:

Registration. Each individual sensor has its own local reference frame from which it provides data. For fusion to occur, the different data sets have to be converted into a common reference frame and aligned together. Calibration error of individual sensors should be addressed during this stage. This problem is critical in determining whether the fusion process succeeds.

Uncertainty in sensor data. Diverse formats of data may create noise and ambiguity in the fusion process. Competitive or conflicting data may thus result from such errors. The redundancy of the data from multiple sensors has to be exploited to reduce uncertainty, learning to reject outliers when conflicting data is encountered.

Incomplete, inconsistent, spurious data. Data is considered incomplete if several interpretations remain consistent with the same observed data. Some methods to make data complete are either collecting more data features or using more sensors. Inconsistent sensors are defined as two complete data sets having different interpretations; this is the consequence of bad sensor registration or of sensors observing different things. If data contains features not related to the observed environment, it is defined as spurious. Just as with uncertainty, redundant data has to be exploited to help in fusing incomplete, inconsistent, and spurious data [5].

Correspondence / data association [6, 7]. One aspect of sensor fusion is establishing whether two tracks from different sensors represent the same object (track-to-track association). This is required to know how data features from different sensors match each other, and whether there are data features that are outliers. The other form of the data association problem is measurement-to-track association, which refers to the problem of recognizing from which target each measurement originates [8].

Granularity. The level of detail from different sensors is rarely similar. The data may be sparse or dense relative to other sensors, and the level of the data may differ; this has to be addressed in the process of fusion.

Time scales. Sensors may be measuring the same environment at different rates. Another case is two identical sensors measuring at different frequencies due to manufacturing defects. The arrival times at the fusion node may also not coincide, due to propagation delays in the system. Especially for spatially distributed sensors with varying data rates, real-time sensor fusion has to be based on a precise time-scale setting to ensure all data are synchronized properly. In cases where the fusion algorithm requires a history of data, how fast the sensor can provide data is directly related to the validity of the results.

3. ALGORITHMS FOR SENSOR FUSION

Due to the various natures of the fusion process, different algorithms are engaged for different levels of fusion. These are usually based on probability theory, classification methods, and artificial intelligence [9].

3.1 Kalman Filtering

The Kalman filter is an ideal statistical recursive data processing algorithm which continuously calculates an estimate of a continuous-valued state based on periodic observations of the state. It uses an explicit statistical model of how the state x(t) changes over time, and an explicit statistical model of how the observations z(t) are related to it [10, 11].

The explicit description of process and observations lets many sensor models be easily incorporated in the algorithm. Moreover, we can constantly assess the role of each sensor in the system. As every iteration requires almost the same effort, the Kalman filter is well adapted for real-time usage.

We first define a model for the states to be estimated in the standard state-space form:

    ẋ(t) = A(t)x(t) + B(t)u(t) + n(t)

where x(t) is the state vector of interest, A(t) is the transition matrix, B(t) is the control matrix, and u(t) is a known control input. The observation equation in the standard state-space model is

    z(t) = H(t)x(t) + v(t)

where z(t) is the observation vector and H(t) is the measurement matrix. n(t) and v(t) are zero-mean Gaussian random variables describing the uncertainty as the state evolves, with covariance matrices Q(t) and R(t) respectively.

From this, the Kalman filter proceeds recursively in two stages, prediction and update [12]. A prediction x̂(k|k-1) of the state at time k is given by

    x̂(k|k-1) = A(k)x̂(k-1|k-1) + B(k)u(k)

The covariance P(k|k-1) is also computed, as

    P(k|k-1) = A(k)P(k-1|k-1)Aᵀ(k) + Q(k)

At time k, an observation z(k) is made, and the estimate x̂(k|k) is the update of the state x(k). It is computed, together with the updated state estimate covariance matrix P(k|k), from the state prediction and the observation by

    x̂(k|k) = x̂(k|k-1) + K(k)[z(k) - H(k)x̂(k|k-1)]
    P(k|k) = [I - K(k)H(k)]P(k|k-1)

K(k) is the Kalman gain matrix, a measure of the relative confidence in the past estimates and the latest observation.
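The predict-update recursion above can be sketched in a few lines. The model below is an illustrative assumption, not from the paper: a 1-D constant-velocity target observed by a noisy position sensor.

```python
import numpy as np

# One predict-update cycle of the Kalman filter for a 1-D
# constant-velocity target observed by a noisy position sensor
# (matrices A, H, Q, R are illustrative assumptions).
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])   # transition matrix A(k)
B = np.zeros((2, 1))                    # no control input in this sketch
H = np.array([[1.0, 0.0]])              # we observe position only
Q = 0.01 * np.eye(2)                    # process noise covariance
R = np.array([[0.5]])                   # measurement noise covariance

def kalman_step(x, P, z, u=np.zeros((1, 1))):
    # Prediction: x(k|k-1) = A x(k-1|k-1) + B u(k),  P(k|k-1) = A P A^T + Q
    x_pred = A @ x + B @ u
    P_pred = A @ P @ A.T + Q
    # Update: K(k) = P(k|k-1) H^T [H P(k|k-1) H^T + R]^-1
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

x = np.zeros((2, 1))            # initial estimate [position, velocity]
P = np.eye(2)                   # initial covariance
for z in [1.02, 1.11, 1.19]:    # simulated position measurements
    x, P = kalman_step(x, P, np.array([[z]]))
```

With each observation the position estimate moves toward the measurements while the covariance P shrinks, which is the behavior the gain discussion below refers to.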



The Kalman gain is chosen to minimize the a posteriori state estimate covariance.

The main advantage of the Kalman filter is its high computational efficiency, since the entire sequence of old observations is not reprocessed with every new observation; it is all condensed into the current state and error covariance matrix. However, it is restricted to linear system dynamics and to an initial uncertainty that is Gaussian [13, 14]. In such cases the extended Kalman filter (EKF) is employed, in which the system is linearized using a first-order Taylor series expansion [15]. The EKF is a frequently employed method for data fusion in robotic applications today. However, the linearization can be unstable if the time intervals are not small enough, with the tradeoff of higher computational requirements for fine time intervals [16, 17]. In sensor networks where each sensor is able to process information and move toward the target, a distributed algorithm [18] can be used to estimate the average consensus; this is a form of distributed Kalman filtering (DKF) for scalable sensor fusion. Mobile sensor networks have better mobility and performance compared to static networks, making them of interest to exploit [19, 20]. Another algorithm that is gaining popularity is the unscented Kalman filter (UKF), which approximates a known statistical distribution with relatively low complexity [9]. It determines a minimal set of points around the mean which is enough to describe the true mean and covariance completely, and from these it calculates the estimates without the need to linearize. This makes the UKF simpler to apply to a complex process than the EKF; the UKF can also be employed in parallel implementations. Recent research has applied the extended Kalman filter [21] and the unscented Kalman filter [22, 23] to robot navigation.

3.2 Support Vector Machine (SVM)

The Support Vector Machine was proposed in 1963, and the current standard in 1993 [24]. It is a learning model that analyses data and extracts patterns for classification and regression analysis. Taking a set of training examples, each belonging to one of two classes, an SVM assigns new examples to either category; it is thus a non-probabilistic binary linear classifier. The optimized hyperplane should minimize structural errors and maximize the margin between the hyperplane and the closest points, the support vectors [25, 26]. A summary of the SVM method is given here.

Taking a binary classification problem in an m-dimensional feature space R^m, consider a set of points {x_i, y_i}, i = 1, 2, ..., N, which are the training data, where x_i ∈ R^m and y_i ∈ {-1, +1} are the corresponding class labels. The objective of the SVM is to construct a hyperplane that allocates any new point to either class through the linear classifier function

    f(x_i; w, b) = ⟨w · x_i⟩ + b

where w and b are the normal vector and the bias respectively. The labeled points are then separated, with the support vectors lying on the two hyperplanes

    y_i(⟨w · x_i⟩ + b) = ±1

which are parallel to the optimal linear separating hyperplane. The aim now is to minimize ‖w‖²/2 to obtain the maximum margin. However, if the data is non-linear, it has to be mapped into a higher-dimensional feature space using a function Φ: R^m → H, and then linearly separated in this space using kernel functions. Frequently used kernels include the polynomial and Gaussian radial basis functions.

Since the SVM seeks the maximum margin, it is very suitable in cases such as path planning, where a safety margin is vital. Collision-avoidance path planning and navigation, and simultaneous localization and mapping (SLAM), frequently apply SVMs [27, 28]. SVMs are also used to compress information in sensor fusion systems that may have limited bandwidth, where large sets of data samples are not feasible for real-time processing [29]. A two-layer SVM scheme has also been proposed, and significantly improves on the results of a single SVM [30].

3.3 Bayesian Inference Technique

Bayes' rule provides a means of combining observed data with past beliefs about the state of the environment. It requires that the state of an object or environment, described as x, and an observation z, be expressed as a joint probability or joint probability distribution P(x, z), for discrete and continuous variables respectively. Expanding the joint probability by the chain rule of conditional probabilities:

    P(x, z) = P(x|z)P(z) = P(z|x)P(x)

Rearranging in terms of P(x|z), we obtain Bayes' rule:

    P(x|z) = P(z|x)P(x) / P(z)

    posterior = (likelihood × prior) / evidence

Bayesian inference is a statistical data fusion algorithm based on Bayes' theorem, with a recursive predict-update process [31]. In sensor fusion, however, the system state is usually time-dependent, which means the state changes over time even when no new observation has been taken. Moreover, the prior changes with every new observation, so Bayes' rule has to be applied recursively. We can conclude that the prior depends on time and on the history of observations taken beforehand.

The multi-sensor form of Bayes' rule requires conditional independence,

    P(z_1, ..., z_n | x) = P(z_1|x) ... P(z_n|x) = ∏_{i=1..n} P(z_i|x)

so that

    P(x | Z^n) ∝ P(x) ∏_{i=1..n} P(z_i|x)

This states that the posterior probability of x, given all observations Z^n, is proportional to the product of the prior and the individual likelihoods from each sensor source. The recursive form of Bayes' rule is then given by

    P(x | Z^k) = P(z_k|x) P(x | Z^{k-1}) / P(z_k | Z^{k-1})

P(x | Z^{k-1}) contains a complete summary of all past information up to instant k-1. At the next instant k, with the next piece of information P(z_k|x), the previous posterior acts as the current prior and provides the new posterior density. Thus, the computation is much less demanding.
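The recursive multi-sensor update can be sketched for a discrete state; the two-sensor detection scenario and its likelihood tables below are illustrative assumptions, not values from the paper.

```python
# Recursive Bayesian fusion for a discrete state ("target present" vs
# "absent") observed by two conditionally independent sensors. The
# detection probabilities are assumed for illustration.
states = ["present", "absent"]
prior = {"present": 0.5, "absent": 0.5}

# P(z = detect | x) per sensor: high hit rate, modest false-alarm rate
likelihood = {
    "sensor1": {"present": 0.9, "absent": 0.2},
    "sensor2": {"present": 0.8, "absent": 0.1},
}

def update(prior, sensor, detected):
    """One application of Bayes' rule; the old posterior is the new prior."""
    post = {}
    for x in states:
        p_detect = likelihood[sensor][x]
        post[x] = (p_detect if detected else 1.0 - p_detect) * prior[x]
    evidence = sum(post.values())          # normalizer P(z_k | Z^{k-1})
    return {x: p / evidence for x, p in post.items()}

belief = update(prior, "sensor1", detected=True)   # posterior becomes prior
belief = update(belief, "sensor2", detected=True)  # second sensor sharpens it
```

Two agreeing detections raise the belief in "present" well above either sensor alone, which is exactly the certainty gain described in Section 2.2.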



3.4 Sequential Monte Carlo Methods (Particle Filter)

Particle filters are a class of modern sequential Monte Carlo methods [32]. They are based on building a posterior density function from a set of random samples called particles. The advantage of particle filtering is its ability to represent arbitrary probability densities when systems are non-Gaussian or nonlinear. Also, the error in calculation is usually unknown or non-Gaussian, and a probability density function is then mandatory.

The particle filter works by approximating the probability of the state as a weighted sum of random samples, which are predicted forward, with their weights updated from the likelihood of the measurement. This is called sequential importance sampling (SIS). A resampling step is introduced in newer variants to prevent filter divergence; this is done by removing the particles with the lowest weights and creating new particles at the points with the highest weights [33], and is called sequential importance resampling. Particle filters have been proven effective in distributed sensing environments [34]. A number of different types of particle filter exist, and their performance varies across applications; the choice of importance density is the most important factor determining performance [35].
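The predict, weight, and resample steps can be sketched for a simple case; the 1-D random-walk motion model and noise levels below are illustrative assumptions.

```python
import numpy as np

# Minimal sequential importance resampling (SIR) particle filter for a
# 1-D random-walk state observed with Gaussian noise (parameters assumed).
rng = np.random.default_rng(0)
N = 1000
particles = rng.normal(0.0, 1.0, N)     # initial samples of the state
weights = np.full(N, 1.0 / N)

def step(particles, weights, z, motion_std=0.1, meas_std=0.5):
    # Predict: propagate each particle through the random-walk motion model
    particles = particles + rng.normal(0.0, motion_std, particles.size)
    # Update: reweight by the Gaussian measurement likelihood p(z | particle)
    weights = weights * np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
    weights = weights / weights.sum()
    # Resample: draw particles in proportion to their weight (SIR step)
    idx = rng.choice(particles.size, particles.size, p=weights)
    particles = particles[idx]
    weights = np.full(particles.size, 1.0 / particles.size)
    return particles, weights

for z in [0.9, 1.0, 1.1]:               # simulated measurements near x = 1
    particles, weights = step(particles, weights, z)
estimate = particles.mean()
```

The particle cloud, initialized around 0, migrates toward the measured region near 1; no Gaussian or linearity assumption is needed, only the ability to sample the motion model and evaluate the likelihood.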
3.5 Dempster-Shafer Theory of Evidence

The Dempster-Shafer (D-S) evidence theory was proposed by Dempster [36] and later extended mathematically by Shafer [37]. D-S theory is based on two ideas: obtaining degrees of belief for one question from subjective probabilities for a related question, and using Dempster's rule to combine the degrees of belief when they are based on independent items of evidence [38].

The evidence theory operates on a frame of discernment, Θ. Let 2^Θ denote the power set of all subsets of Θ, including the null set ∅ and Θ itself. For example, if Θ = {a, b}, then

    2^Θ = {∅, {a}, {b}, Θ}

In D-S inference, each element is assigned a basic probability assignment,

    m: 2^Θ → [0, 1]

which has two properties. First, the mass of the empty set is 0:

    m(∅) = 0

Second, the remaining masses over the power set add up to 1:

    Σ_{A⊆Θ} m(A) = 1

For any proposition A, m(A) expresses the proportion of available evidence that supports the claim that the actual system state belongs to A. The belief and plausibility functions are then derived from the value of m(A), where the belief is the lower bound on the probability P(A) and the plausibility is the upper bound:

    bel(A) ≤ P(A) ≤ pl(A)

The belief of A is defined as the sum of the masses of all subsets of the set of interest A:

    bel(A) = Σ_{B | B⊆A} m(B)

The plausibility of A is defined as the sum of the masses of all sets B that intersect the set of interest A:

    pl(A) = Σ_{B | B∩A≠∅} m(B)

Evidence from sensors is fused with Dempster's rule of combination. Taking two sources of information with belief mass functions m1 and m2, the joint belief mass function m1,2 is calculated as

    m1,2(A) = (m1 ⊕ m2)(A) = (1 / (1 - K)) Σ_{B∩C=A≠∅} m1(B) m2(C)

    m1,2(∅) = 0

where K represents the amount of conflict between the two sources, given by

    K = Σ_{B∩C=∅} m1(B) m2(C)

Unlike Bayesian inference, D-S theory allows each source to contribute information at its own level of detail. For example, one sensor can give information that separates individual entities, while another provides information that separates classes of entities. We are able to represent partial knowledge and updated beliefs, together with a combination of evidence, and to model ambiguity explicitly [39].

To determine whether to use the Bayesian or the Dempster-Shafer method, one has to decide whether the higher level of accuracy of the former is required, or the flexibility of the latter is preferred [40]. Studies have been done comparing the Bayesian inference method and the Dempster-Shafer method [41, 42]. A recent application presenting human-autonomy sensor fusion in object detection compares the performance of Bayesian, Dempster-Shafer, and dynamic Dempster-Shafer fusion methods [43]. Certainly, there are issues with D-S theory, such as the complexity of the computations [44] and counterintuitive results from conflicting data [45]. A common approach is to use D-S theory together with other algorithms to enhance accuracy and speed [46, 47].

3.6 Artificial Neural Networks (ANN)

ANNs are mathematical models composed of nonlinear computational elements (neurons), operating in parallel and connected in a graph topology characterized by different weighted links. ANNs have proven to be a more powerful and more adaptable method compared to traditional linear or non-linear analyses [48, 49]. The layers of processing neurons can be connected in different ways. The neurons can be trained to learn the behavior of any system, using sets of training data and learning algorithms to tune the individual weights of the links; the weights are altered to improve the robustness of the system. Once the errors on the training data have been minimized, the ANN can remember the functions and be engaged in further estimations. The data is closely linked with the processing. One major problem currently is determining the best topology for any given problem. Some factors which determine this are the problem itself, the prospective approach to the problem, and the neural network characteristics. Recent research in robot navigation has successfully used neural networks in sensor fusion [50].

3.7 Fuzzy Logic

Fuzzy logic is finding widespread popularity as a method to represent uncertainty in high-level fusion. Essentially, it is a type of multi-valued logic that allows the uncertainty in multi-sensor fusion to be handled in the inference process by assigning each proposition a degree of membership from 0 to 1 [51]. Fuzzy sensor fusion approaches have shown a high degree of certainty and accuracy, although the tradeoff is the complex computations required [52].
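Looking back at Section 3.5, Dempster's rule of combination is compact enough to sketch directly; the frame and mass assignments below are assumed for illustration, not taken from the paper.

```python
from itertools import product

# Dempster's rule of combination for two sources over the frame
# Theta = {"a", "b"}. Focal sets are frozensets; masses are assumed.
theta = frozenset({"a", "b"})
m1 = {frozenset({"a"}): 0.6, theta: 0.4}   # source 1: mostly supports {a}
m2 = {frozenset({"b"}): 0.3, theta: 0.7}   # source 2: weakly supports {b}

def combine(m1, m2):
    K = 0.0        # total conflict: mass falling on empty intersections
    joint = {}
    for (B, mb), (C, mc) in product(m1.items(), m2.items()):
        A = B & C
        if A:
            joint[A] = joint.get(A, 0.0) + mb * mc
        else:
            K += mb * mc
    # Normalize by 1 - K so the combined masses again sum to 1
    return {A: v / (1.0 - K) for A, v in joint.items()}, K

m12, K = combine(m1, m2)
```

Here the conflict is K = 0.6 × 0.3 = 0.18, and after normalization the combined mass still leaves some weight on the whole frame Θ, which is the explicit modeling of ambiguity that distinguishes D-S from a Bayesian posterior.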



In most applications of sensor fusion, a combination of methods is used to exploit the advantages of both artificial intelligence methods and traditional methods. [53] merges a neural network with a linearly constrained least squares method, and is shown to be stable and fast. [54] is able to take different information sources with different noise characteristics and achieve optimized results through the use of fuzzy logic. [55] integrates the Kalman filter with fuzzy logic techniques, achieving the optimality of the Kalman filter together with the competence of fuzzy systems in handling inconsistent information.

4. APPLICATIONS / IMPLEMENTATION

Multi-sensor fusion systems have already been applied to different problems, but there are areas in which research is still being carried out and developed. Overlap may occur between the following cases, but this is a general attempt to cover the broad aspects.

4.1 Internet of Things

In the last decade, the Internet of Things (IoT) has attracted attention from academia [56, 57] and industry, due to its potential to create a smart world where all objects are connected to the Internet and communicate with each other with minimum human intervention [58]. The IoT requires large amounts of real-time data to offer material for analysis and action, and sensors are available everywhere, from smart devices (smartphones, tablets) and wearables (smartwatches, camera glasses) to healthcare (RFIDs). Approaches to improve this allow the IoT to work efficiently [59, 60]. Sensor fusion helps to enable context awareness, a cornerstone of the IoT. By knowing the circumstances or facts that form an event, we can use this information to understand why a situation is as it is, and to form suitable actions. With about 50 to 100 billion devices projected to be connected to the Internet by 2020 [61], constantly generating data, this presents an enormous amount of data to process. Some areas expected to see applications are building automation, such as smart energy consumption control [62], the power grid [63], environmental and industrial monitoring [64], and consumer home automation [65]. With cars being equipped with sensors, as well as camera feeds on roads, information can be generated to keep track of traffic anywhere in the city [66], and this can be provided back to the users [67].

4.2 Automotive and Navigation

With cars becoming more sophisticated, developments are focused on improving performance, safety, comfort, environmental friendliness, and driver assistance. In the area of autonomous driving, various sensors are featured, such as GPS (Global Positioning System), LiDAR, and ultrasound. Many of these are used to create representations of both the car and its surroundings [68], and these data can be fused to provide a complete view of the driving conditions. With the reliance on so many types of sensors, a multi-level fusion process is required, where low-level sensor fusion processes the massive amount of input data and high-level fusion provides the real-time decision. With the recent development of RADAR and LiDAR, there is now a smaller demand for on-board cameras, as these two produce a richer and more accurate 3D representation, which helps to detect and classify objects better [69]. However, while LiDAR sensors provide a better field of coverage, they do not provide speed information; RADAR gives accurate speed data but is not effective in lanes with curves. Many of these developments are related to mobile robotics, with path planning and obstacle avoidance [70] being scaled up to facilitate real-world application. The design of complementary sensors is essential to provide better 3D maps [71] or to allow the system to recognize different bodies in the environment [72]; also essential is the scalability of the electronic systems, to ensure that no bottlenecking occurs during real-time processing [73] as the information feed increases as expected.

For safety and collision avoidance, sensor fusion research is being done to improve the quality of detection, especially in preventing false positives [74].

4.3 Quadrotors and Drones

Drones and quadrotors are also an emerging field for developing new technologies and methods to ensure safe operation as well as reliable maneuvering [75]. The navigation system of such a quadrotor usually consists of a three-axis gyroscope, three-axis accelerometer, and magnetometer, with a complementary sensor group of a pressure altimeter, ultrasonic sensors, and GPS [76]. Autonomous flight [77] is one aspect in which new progress has allowed quadrotors to work independently in places that humans may be unable to reach. The cost of implementing fusion is relatively low while still maintaining satisfactory performance [78], which allows for the wide availability of consumer-level drones. Moreover, due to the robustness of sensor fusion, drones can hover in one fixed position without the need for GPS, through the use of other sensors [79, 80]. It is important to show the reliability of the fusion method, so that operation is not compromised even with one sensor input missing.

4.4 Computer Vision [81]

Computer vision started off as an attempt to mimic human vision, though using competing sensors [82]. As the understanding of the complexity of perception developed, new sensors, such as 3D cameras, have helped to augment the ability of computer vision. It has become an essential part of many applications, for example medical imaging, the vision of intelligent robots [83], and nondestructive testing.

In recent years, the need to improve the security of the general public, as well as of public assets, has grown. One big hurdle is the detection of concealed weapons underneath a person's clothing. Several fusion methods have been worked on, for example multiple images with different exposures together with infrared images [84], and combined object detection for automatic bag screening at venues such as stadiums or museums [85].

4.5 Virtual Reality / Augmented Reality

A recent development, virtual reality (VR) is an emerging technology attracting attention from consumers as an entertainment or educational tool. Some of the current models available are the HTC Vive and the Oculus Rift. One of the key challenges of virtual environments is the tracking of head movement.



head movement. As a user change his viewpoints, the carried out. New application areas like Internet of Things,
virtual elements must keep their alignment with the automotive and healthcare applications show benefits
observed 3D position and the orientation to real world when sensor fusion is applied, and there are still a wide
objects. In addition to the accuracy, the ability to provide range of potential applications that is unable to be covered
stable motions is vital as well. The last challenge is to fully. Certainly there are still areas of development and
reduce the latency, defined here as the time between head research that can help to further advance current level of
movement and producing corresponding images to the knowledge. Algorithm fusion is still being debated, to try
user’s retina. This had being an early problem which causes to focus on the advantages each method have, and using
VR simulator sickness. A single gyroscope does not give new methods to cover up the weakness of other. New
information about the user’s location, while approaches to combine the different level of sensor fusion
accelerometers’ reading tends to be noisy, and yaw reading and different approaches have to be developed, and a
cannot be read. A magnetometer can act like a compass, general framework to assess different sensor fusion
allowing an orientation estimate, however, this is easily technique will be essential to benchmark clearly the
affected by any ferromagnetic metals [86]. The current different techniques, and to allow us to determine
method of sensor fusion uses a weighted filter to determine precisely the constraints required for certain system. The
the information to take from the different sensors, taking the accuracy, computational speed and cost of sensor fusion
long term accuracy of the accelerometer, while using the are the three basic requirements, but in most cases today,
gyroscope to do reduction of the noise signal in the short only two of them are usually fulfilled for every method.
term [87]. Predictive tracking methods have also been
implemented from the angular speed of the gyroscope, to REFERENCES
reduce latency to a rate of 30ms.
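The weighted-filter scheme described above is commonly realized as a complementary filter. The sketch below is a minimal single-axis illustration, assuming a pitch angle estimated from a gyroscope rate and a gravity vector; the weight `alpha`, the function names, and all sample values are illustrative assumptions rather than details from [87].

```python
import math

def complementary_filter(angle, gyro_rate, ax, az, dt, alpha=0.98):
    """One update of a single-axis complementary filter.

    The gyroscope integral is trusted in the short term (weight alpha),
    while the accelerometer's gravity-derived tilt corrects long-term
    drift (weight 1 - alpha).
    """
    accel_angle = math.atan2(ax, az)      # tilt angle from gravity, in rad
    gyro_angle = angle + gyro_rate * dt   # short-term integration of the rate
    return alpha * gyro_angle + (1 - alpha) * accel_angle

def predict(angle, gyro_rate, latency):
    """Predictive tracking: extrapolate the pose by the angular rate
    to compensate for rendering latency."""
    return angle + gyro_rate * latency

# Stationary sensor with a constant 0.01 rad/s gyro bias: the
# accelerometer term bounds the drift instead of letting it grow.
angle = 0.0
for _ in range(1000):
    angle = complementary_filter(angle, gyro_rate=0.01, ax=0.0, az=9.81, dt=0.01)
```

With `alpha = 0.98`, biased integration alone would drift by 0.1 rad over these 10 s of samples, while the filtered estimate settles near the fixed point of the update, about 0.0049 rad.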
4.6 Healthcare

With aging populations becoming a common trend in developed countries, it is important to be able to monitor or track people's health without having someone watch them around the clock [88]. Fall detection is one area under active research, and it is especially helpful for elderly people who live alone without supervision [89]. Sensor fusion does not only benefit the elderly: it also enables monitoring of the development of infant motor functions, with new methods for assessing infant body posture [90]. General research using body sensors and wireless sensor networks is common, covering human tracking [91] and identification [92], as well as monitoring patients' mental state and classifying an individual's state of mind by fusing data from various physiological sensors, such as heart rate, respiration rate, and carbon dioxide and oxygen levels [93]. Some of the main remaining problems are on the software side, such as measurement reliability and network status, which must be handled to avoid sending false positives to the healthcare unit [94].
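Fall-detection prototypes of the kind surveyed in [89] often start from a simple rule on accelerometer data: a large impact followed by a period of stillness. The following toy sketch illustrates the idea; the thresholds, window length, and function name are illustrative assumptions, not taken from the cited works.

```python
import math

def fall_detected(samples, impact_g=2.5, still_g=0.15, still_window=20):
    """Toy fall detector over a stream of 3-axis accelerometer samples
    (in units of g). Flags a fall when a large impact is followed by a
    period of inactivity. Thresholds are illustrative, not clinically
    validated.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for (x, y, z) in samples]
    for i, m in enumerate(mags):
        if m >= impact_g:
            window = mags[i + 1:i + 1 + still_window]
            # Inactivity: magnitude stays near 1 g (gravity only).
            if len(window) == still_window and all(abs(w - 1.0) < still_g for w in window):
                return True
    return False

# Rest, a 3 g impact, then 20 motionless samples vs. continuous walking.
fall = [(0.0, 0.0, 1.0)] * 5 + [(0.0, 0.0, 3.0)] + [(0.0, 0.0, 1.0)] * 20
walking = [(0.1, 0.0, 1.0), (0.0, 0.1, 1.0)] * 15
print(fall_detected(fall), fall_detected(walking))  # prints: True False
```

In practice such a rule would be one input fused with other sensors (e.g. gyroscopes or ambient sensors), precisely to address the false-positive problem noted above.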
4.7 Micro-scale sensor fusion

Wearable electronics are a major opportunity for sensors, as they can track user activity for healthcare and sports applications. Microelectromechanical systems (MEMS) are the technology that enables them; they are now everywhere, from tablets to smartwatches to smartphones. Besides MEMS, System-on-Chip (SoC) solutions are becoming more common with the need to incorporate multiple sensors on a single hardware platform. To achieve this, sensor miniaturization is an active research area, as the number of sensors in a system will keep increasing.
5. CONCLUSION

This paper has provided a review of sensor fusion, from models for its different applications, to some of the common algorithms used to enable it, to the recent research being carried out. New application areas such as the Internet of Things, automotive systems, and healthcare show clear benefits when sensor fusion is applied, and there remains a wide range of potential applications that could not be covered fully here. Certainly there are still areas of development and research that can further advance the current state of knowledge. The choice of fusion algorithm is still debated, with ongoing efforts to build on the advantages of each method and to use new methods to compensate for the weaknesses of others. New approaches that combine the different levels of sensor fusion still have to be developed, and a general framework for assessing sensor fusion techniques will be essential for benchmarking them clearly and for determining precisely the constraints a given system must satisfy. Accuracy, computational speed, and cost are the three basic requirements of sensor fusion, but in most cases today only two of the three are fulfilled by any one method.

REFERENCES

[1] H. B. Mitchell, Multi-sensor data fusion: An introduction. Springer, 2007.
[2] J. K. Hackett and M. Shah, "Multi-sensor fusion: A perspective," Proceedings, International Conference on Robotics and Automation, pp. 1324-1330, 1990.
[3] B. Khaleghi, A. Khamis, F. O. Karray, and S. N. Razavi, "Multisensor data fusion: A review of the state-of-the-art," Information Fusion, vol. 14, no. 1, pp. 28-44, 2013.
[4] R. Joshi and A. C. Sanderson, Multisensor fusion: A minimal representation framework. World Scientific, 1999.
[5] R. C. Luo and M. G. Kay, "A tutorial on multisensor integration and fusion," in 16th Annual Conference of IEEE Industrial Electronics Society (IECON'90), 1990, pp. 707-722.
[6] F. Castanedo, "A review of data fusion techniques," Scientific World Journal, vol. 2013, p. 704504, 2013.
[7] S. S. Blackman, "Theoretical approaches to data association and fusion," Orlando Technical Symposium, pp. 50-55, 1988.
[8] D. Smith and S. Singh, "Approaches to multisensor data fusion in target tracking: A survey," IEEE Transactions on Knowledge and Data Engineering, vol. 18, no. 12, pp. 1696-1710, Dec 2006.
[9] R. C. Luo, C. C. Chang, and C. C. Lai, "Multisensor fusion and integration: Theories, applications, and its perspectives," IEEE Sensors Journal, vol. 11, no. 12, pp. 3122-3138, Dec 2011.
[10] R. E. Kalman, "A new approach to linear filtering and prediction problems," Transactions of the ASME - Journal of Basic Engineering, vol. 82, no. 1, pp. 35-45, 1960.
[11] J. Z. Sasiadek, "Sensor fusion," Annual Reviews in Control, vol. 26, pp. 203-228, 2002.
[12] H. Durrant-Whyte and T. C. Henderson, "Multisensor data fusion," in Springer Handbook of Robotics, B. Siciliano and O. Khatib, Eds. Springer International Publishing, 2016, pp. 867-896.
[13] S. Julier and J. Uhlmann, "A non-divergent estimation algorithm in the presence of unknown correlations," Proceedings of the 1997 American Control Conference, vol. 4, pp. 2369-2373, Jun 1997.
[14] J. Uhlmann, "Covariance consistency methods for fault-tolerant distributed data fusion," Information Fusion, vol. 4, no. 3, pp. 201-215, Sep 2003.
[15] D. Fox, J. Hightower, L. Liao, D. Schulz, and G. Borriello, "Bayesian filtering for location estimation," IEEE Pervasive Computing, vol. 2, no. 3, pp. 24-33, Jul-Sep 2003.
[16] S. Julier, J. Uhlmann, and H. F. Durrant-Whyte, "A new method for the nonlinear transformation of means and covariances in filters and estimators," IEEE Transactions on Automatic Control, vol. 45, no. 3, pp. 477-482, Mar 2000.
[17] S. Julier and J. Uhlmann, "A new extension of the Kalman filter to nonlinear systems," Proceedings of the Adaptive Systems for Signal Processing, Communications, and Control Symposium, pp. 153-158, 2000.
[18] D. P. Spanos, R. Olfati-Saber, and R. M. Murray, "Approximate distributed Kalman filtering in sensor networks with quantifiable performance," in Fourth International Symposium on Information Processing in Sensor Networks (IPSN 2005), 2005, pp. 133-139.
[19] H. Su, X. Chen, M. Z. Q. Chen, and L. Wang, "Distributed estimation and control for mobile sensor networks with coupling delays," ISA Transactions, vol. 64, pp. 141-150, 2016.
[20] H. Su, Z. Li, and M. Z. Q. Chen, "Distributed estimation and control for two-target tracking mobile sensor networks," Journal of the Franklin Institute, 2017.
[21] S. Lynen, M. W. Achtelik, S. Weiss, M. Chli, and R. Siegwart, "A robust and modular multi-sensor fusion approach applied to MAV navigation," International Conference on Intelligent Robots and Systems (IROS), pp. 3923-3929, Nov 2013.
[22] I. Ashokaraj, A. Tsourdos, P. Silson, and B. A. White, "Sensor based robot localisation and navigation: Using interval analysis and unscented Kalman filter," Proceedings of International Conference on Intelligent Robots and Systems, vol. 1, pp. 7-12, 2004.
[23] M. L. Anjum, J. Park, W. Hwang, H. I. Kwon, J. H. Kim, C. Lee, K. S. Kim, and D. Cho, "Sensor data fusion using unscented Kalman filter for accurate localization of mobile robots," International Conference on Control Automation and Systems, pp. 947-952, Oct 2010.
[24] C. Cortes and V. Vapnik, "Support-vector networks," Machine Learning, vol. 20, no. 3, pp. 273-297, Sep 1995.
[25] C. J. C. Burges, "A tutorial on support vector machines for pattern recognition," Data Mining and Knowledge Discovery, vol. 2, no. 2, pp. 121-167, Jun 1998.
[26] V. Vapnik, Statistical learning theory. Wiley, 1998.
[27] K. Tanaka, K. Yamano, E. Kondo, and Y. Kimuro, "A vision system for detecting mobile robots in office environments," IEEE International Conference on Robotics and Automation, vol. 3, pp. 2279-2284, 2004.
[28] J. Shen and H. Hu, "SVM based SLAM algorithm for autonomous mobile robots," International Conference on Mechatronics and Automation, pp. 337-342, Aug 2007.
[29] S. Challa, M. Palaniswami, and A. Shilton, "Distributed data fusion using support vector machines," Proceedings of the Fifth International Conference on Information Fusion, vol. 2, pp. 881-885, Jul 2002.
[30] B. Waske and J. A. Benediktsson, "Fusion of support vector machines for classification of multisensor data," IEEE Transactions on Geoscience and Remote Sensing, vol. 45, no. 12, pp. 3858-3866, Dec 2007.
[31] W. A. Abdulhafiz and A. Khamis, "Bayesian approach to multisensor data fusion with pre- and post-filtering," 10th IEEE International Conference on Networking, Sensing and Control, pp. 373-378, 2013.
[32] D. Crisan and A. Doucet, "A survey of convergence results on particle filtering methods for practitioners," IEEE Transactions on Signal Processing, vol. 50, no. 3, pp. 736-746, Mar 2002.
[33] N. J. Gordon, D. J. Salmond, and A. F. Smith, "Novel approach to nonlinear/non-Gaussian Bayesian state estimation," IEE Proceedings F - Radar and Signal Processing, vol. 140, no. 2, pp. 107-113, 1993.
[34] J. Liu, M. Chu, J. Liu, J. Reich, and F. Zhao, "Distributed state representation for tracking problems in sensor networks," Third International Symposium on Information Processing in Sensor Networks, pp. 234-242.
[35] M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174-188, Feb 2002.
[36] A. P. Dempster, "A generalization of Bayesian inference," Journal of the Royal Statistical Society, Series B (Methodological), pp. 205-247, 1968.
[37] G. Shafer, A Mathematical Theory of Evidence. Princeton University Press, 1976.
[38] R. C. Luo and C. C. Chang, "Multisensor fusion and integration: A review on approaches and its applications in mechatronics," IEEE Transactions on Industrial Informatics, vol. 8, no. 1, pp. 49-60, Feb 2012.
[39] G. Provan, "The validity of Dempster-Shafer belief functions," International Journal of Approximate Reasoning, vol. 6, no. 3, pp. 389-399, May 1992.
[40] B. R. Bracio, W. Horn, and D. P. F. Moller, "Sensor fusion in biomedical systems," Proceedings of the 19th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 3, pp. 1387-1390, Nov 1997.
[41] Y. Cheng and R. L. Kashyap, "Comparison of Bayesian and Dempster's rules in evidence combination," in Maximum-Entropy and Bayesian Methods in Science and Engineering. Springer Netherlands, 1988, pp. 427-433.
[42] B. R. Cobb and P. P. Shenoy, "A comparison of Bayesian and belief function reasoning," Information Systems Frontiers, vol. 5, no. 4, pp. 345-358, Dec 2003.
[43] R. M. Robinson, H. Lee, M. J. McCourt, A. R. Marathe, H. Kwon, C. Ton, and W. D. Nothwang, "Human-autonomy sensor fusion for rapid object detection," 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 305-312, 2015.
[44] J. A. Barnett, "Computational methods for a mathematical theory of evidence," Proceedings of the 7th International Joint Conference on Artificial Intelligence, vol. 2, pp. 868-875, 1981.
[45] L. A. Zadeh, "Review of A Mathematical Theory of Evidence," AI Magazine, vol. 5, no. 3, pp. 81-83, 1984.
[46] T. Denoeux, "A neural network classifier based on Dempster-Shafer theory," IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 30, no. 2, pp. 131-150, 2000.
[47] Y. Zhan, H. Leung, K. C. Kwak, and H. Yoon, "Automated speaker recognition for home service robots using genetic algorithm and Dempster-Shafer fusion technique," IEEE Transactions on Instrumentation and Measurement, vol. 58, no. 9, pp. 3058-3068, 2009.
[48] D. Jiang, X. Yang, N. Clinton, and N. Wang, "An artificial neural network model for estimating crop yields using remotely sensed information," International Journal of Remote Sensing, vol. 25, no. 9, pp. 1723-1732, May 2004.
[49] J. Dong, D. Zhuang, Y. Huang, and J. Fu, "Advances in multi-sensor data fusion: Algorithms and applications," Sensors (Basel), vol. 9, no. 10, pp. 7771-7784, 2009.
[50] H. Guanshan, "Neural network applications in sensor fusion for a mobile robot motion," WASE International Conference on Information Engineering (ICIE), vol. 1, pp. 46-49, Aug 2010.
[51] R. E. Gibson, D. L. Hall, and J. A. Stover, "An autonomous fuzzy logic architecture for multisensor data fusion," International Conference on Multisensor Fusion and Integration for Intelligent Systems, pp. 143-150, 1994.
[52] M. A. A. Akhoundi and E. Valavi, "Multi-sensor fuzzy data fusion using sensors with different characteristics," arXiv preprint arXiv:1010.6096, 2010.
[53] Y. Xia, H. Leung, and E. Bosse, "Neural data fusion algorithms based on a linearly constrained least square method," IEEE Transactions on Neural Networks, vol. 13, no. 2, pp. 320-329, 2002.
[54] K. Goebel and W. Yan, "Hybrid data fusion for correction of sensor drift faults," IMACS Multiconference on Computational Engineering in Systems Applications, vol. 1, pp. 456-462, Oct 2006.
[55] P. J. Escamilla-Ambrosio and N. Mort, "Hybrid Kalman filter-fuzzy logic adaptive multisensor data fusion architectures," Proceedings of the 42nd IEEE Conference on Decision and Control, pp. 5215-5220, Dec 2003.
[56] K. S. Yeo, M. C. Chian, T. C. W. Ng, and D. A. Tuan, "Internet of things: Trends, challenges and applications," 2014 14th International Symposium on Integrated Circuits (ISIC), pp. 568-571, 2014.
[57] D. Singh, G. Tripathi, and A. J. Jara, "A survey of Internet-of-Things: Future vision, architecture, challenges and services," 2014 IEEE World Forum on Internet of Things (WF-IoT), pp. 287-292, 2014.
[58] H. Sundmaeker, P. Guillemin, P. Friess, and S. Woelfflé, "Vision and challenges for realizing the internet of things," European Commission Information Society and Media, 2010.
[59] F. H. Bijarbooneh, W. Du, E. C. H. Ngai, X. M. Fu, and J. C. Liu, "Cloud-assisted data fusion and sensor selection for internet of things," IEEE Internet of Things Journal, vol. 3, no. 3, pp. 257-268, Jun 2016.
[60] S. Din, H. Ghayvat, A. Paul, A. Ahmad, M. M. Rathore, and I. Shafi, "An architecture to analyze big data in the internet of things," in 9th International Conference on Sensing Technology (ICST), 2015, pp. 677-682.
[61] A. Zaslavsky, C. Perera, and D. Georgakopoulos, "Sensing as a service and big data," arXiv preprint arXiv:1301.0159, 2013.
[62] K. Akkaya, I. Guvenc, R. Aygun, N. Pala, and A. Kadri, "IoT-based occupancy monitoring techniques for energy-efficient smart buildings," 2015 IEEE Wireless Communications and Networking Conference Workshops (WCNCW), pp. 58-63, 2015.
[63] N. Ouerhani, N. Pazos, M. Aeberli, and M. Muller, "IoT-based dynamic street light control for smart cities use cases," 2016 International Symposium on Networks, Computers and Communications (ISNCC), pp. 1-5, 2016.
[64] F. Kirsch, R. Miesen, and M. Vossiek, "Precise local-positioning for autonomous situation awareness in the internet of things," 2014 IEEE MTT-S International Microwave Symposium (IMS), pp. 1-4, 2014.
[65] C.-L. Wu, Y. Xie, S. K. Pradhan, L.-C. Fu, and Y.-C. Zeng, "Unsupervised context discovery based on hierarchical fusion of heterogeneous features in real smart living environments," 2016 IEEE International Conference on Automation Science and Engineering (CASE), pp. 1106-1111, 2016.
[66] S. K. Datta, R. P. F. Da Costa, J. Hyrri, and C. Bonnet, "Integrating connected vehicles in internet of things ecosystems: Challenges and solutions," 2016 IEEE 17th International Symposium on a World of Wireless, Mobile and Multimedia Networks (WoWMoM), pp. 1-6, 2016.
[67] S. Wildstrom, "Better living through big data," 2012. Available: http://newsroom.cisco.com/feature/778800/Better
[68] P. Bonnifait, P. Bouron, P. Crubillé, and D. Meizel, "Data fusion of four ABS sensors and GPS for an enhanced localization of car-like vehicles," Proceedings 2001 IEEE International Conference on Robotics and Automation (ICRA), pp. 1597-1602, 2001.
[69] F. Mujica, "Scalable electronics driving autonomous vehicle technologies," Texas Instruments, 2014.
[70] I. Ulrich and J. Borenstein, "VFH*: Local obstacle avoidance with look-ahead verification," Proceedings 2000 IEEE International Conference on Robotics and Automation (ICRA '00), pp. 2505-2511, 2000.
[71] M. Renato, E. Fernandez-Moral, and P. Rives, "Dense accurate urban mapping from spherical RGB-D images," 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 6259-6264, 2015.
[72] M. Caterina, H. H. Bülthoff, and P. Stegagno, "Autonomous vegetation identification for outdoor aerial navigation," 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3105-3110, 2015.
[73] E. Cardarelli, L. Sabattini, C. Secchi, and C. Fantuzzi, "Cloud robotics paradigm for enhanced navigation of autonomous vehicles in real world industrial applications," 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4518-4523, 2015.
[74] A. Westenberger, M. Muntzinger, M. Gabb, M. Fritzsche, and K. Dietmayer, "Time-to-collision estimation in automotive multi-sensor fusion with delayed measurements," Advanced Microsystems for Automotive Applications, pp. 13-20, 2013.
[75] S. Roelofsen, D. Gillet, and A. Martinoli, "Reciprocal collision avoidance for quadrotors using on-board visual detection," 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4810-4817, 2015.
[76] X. J. Wei, "Autonomous control system for the quadrotor unmanned aerial vehicle," 2016 13th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), pp. 796-799, 2016.
[77] M. Tailanian, S. Paternain, R. Rosa, and R. Canetti, "Design and implementation of sensor data fusion for an autonomous quadrotor," 2014 IEEE International Instrumentation and Measurement Technology Conference (I2MTC) Proceedings, pp. 1431-1436, 2014.
[78] A.-L. Chan, S.-L. Tan, and C.-L. Kwek, "Sensor data fusion for attitude stabilization in a low cost quadrotor system," 2011 IEEE 15th International Symposium on Consumer Electronics (ISCE), pp. 34-39, 2011.
[79] Y. Ling, T. Liu, and S. Shen, "Aggressive quadrotor flight using dense visual-inertial fusion," 2016 IEEE International Conference on Robotics and Automation (ICRA), pp. 1499-1506, 2016.
[80] W. Zheng, J. Wang, and Z. F. Wang, "Multi-sensor fusion based real-time hovering for a quadrotor without GPS in assigned position," Proceedings of the 28th Chinese Control and Decision Conference (2016 CCDC), pp. 3605-3610, 2016.
[81] A. Yilmaz, "Sensor fusion in computer vision," Urban Remote Sensing Joint Event, pp. 1-5, 2007.
[82] H. Longuet-Higgins, "A computer algorithm for reconstructing a scene from two projections," Readings in Computer Vision: Issues, Problems, Principles, and Paradigms, pp. 61-62, 1987.
[83] A. Eitel, J. T. Springenberg, L. Spinello, M. Riedmiller, and W. Burgard, "Multimodal deep learning for robust RGB-D object recognition," 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 681-687, 2015.
[84] E. M. Upadhyay and N. K. Rana, "Exposure fusion for concealed weapon detection," 2014 2nd International Conference on Devices, Circuits and Systems (ICDCS), pp. 1-6, Mar 2014.
[85] A. M. Sagi-Dolev, "Multi-threat detection system," U.S. Patent 8171810, 2012.
[86] D. Gebre-Egziabher, G. H. Elkaim, J. D. Powell, and B. W. Parkinson, "Calibration of strapdown magnetometers in magnetic field domain," Journal of Aerospace Engineering, vol. 19, no. 2, pp. 87-102, Apr 2006.
[87] J. Favre, B. M. Jolles, O. Siegrist, and K. Aminian, "Quaternion-based fusion of gyroscopes and accelerometers to improve 3D angle measurement," Electronics Letters, vol. 42, no. 11, pp. 612-614, May 2006.
[88] H. Medjahed, D. Istrate, J. Boudy, J. L. Baldinger, and B. Dorizzi, "A pervasive multi-sensor data fusion for smart home healthcare monitoring," IEEE International Conference on Fuzzy Systems (FUZZ 2011), pp. 1466-1473, Jun 2011.
[89] G. Koshmak, A. Loutfi, and M. Linden, "Challenges and issues in multisensor fusion approach for fall detection: Review paper," Journal of Sensors, vol. 2016, 2015.
[90] A. Rihar, M. Mihelj, J. Pašič, J. Kolar, and M. Munih, "Using sensory data fusion methods for infant body posture assessment," 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 292-297, 2015.
[91] S. Knoop, S. Vacek, and R. Dillmann, "Sensor fusion for 3D human body tracking with an articulated 3D body model," Proceedings 2006 IEEE International Conference on Robotics and Automation (ICRA 2006), pp. 1686-1691, 2006.
[92] M. T. Yang and S. Y. Huang, "Appearance-based multimodal human tracking and identification for healthcare in the digital home," Sensors (Basel), vol. 14, no. 8, pp. 14253-14277, Aug 2014.
[93] S. Begum, S. Barua, and M. U. Ahmed, "Physiological sensor signals classification for healthcare using sensor data fusion and case-based reasoning," Sensors (Basel), vol. 14, no. 7, pp. 11770-11785, Jul 2014.
[94] H. Lee, K. Park, B. Lee, J. Choi, and R. Elmasri, "Issues in data fusion for healthcare monitoring," Proceedings of the 1st International Conference on Pervasive Technologies Related to Assistive Environments, 2008.