Sensor Fusion - Science topic

Explore the latest questions and answers in Sensor Fusion, and find Sensor Fusion experts.
Questions related to Sensor Fusion
  • asked a question related to Sensor Fusion
Question
3 answers
I want to know the difference between sensor data fusion and sensor data stitching. For example, we have two homogeneous sensors from two autonomous driving systems (let's say camera sensors). I want to combine the camera sensor data for better accuracy and better precision, but I am confused about whether to use the term "stitching" or "fusion".
Which term is more suitable?
What are the key differences between these two terms in the autonomous driving domain?
Relevant answer
Answer
Sensor data fusion is used to take advantage of multiple sensors, generally of different modalities, to accomplish a goal better than any individual sensor could. For example, SLAM algorithms usually rely on fusing data from a camera, an IMU, and sometimes other inputs (lidar, ultrasound, etc.) in order to better localize a vehicle (or UAV, etc.) and model the environment around it.
The term sensor stitching is used more to describe extending the spatial coverage of a single modality with multiple sensors. The best example is using multiple photos with overlapping areas to create a larger panoramic image.
I hope these two examples help you better understand the distinction between these related concepts.
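To make the stitching side concrete, here is a minimal sketch (an illustration, not part of the answer above) that builds a panorama from overlapping images with OpenCV's high-level stitcher; the file names are placeholders for your own overlapping camera frames.

```python
# Minimal panorama-stitching sketch using OpenCV's high-level Stitcher.
# The image file names are placeholders for your own overlapping camera frames.
import cv2

images = [cv2.imread(name) for name in ["left.jpg", "middle.jpg", "right.jpg"]]

stitcher = cv2.Stitcher_create()           # OpenCV 4.x; older 3.x builds use cv2.createStitcher()
status, panorama = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)  # a single image with extended spatial coverage
else:
    print("Stitching failed, status code:", status)
```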
  • asked a question related to Sensor Fusion
Question
3 answers
Hello,
I have several sensors that are interfaced through ROS and synchronized with ROS time (ROS 1). The sensors and their nodes are fully functional. Each sensor does some processing after it senses a detection in its environment, before eventually timestamping this data in ROS. Since the sensors are of different kinds and each has its own processing before timestamping, a delay is expected between the detection and the timestamp, and also between the different sensors.
I am interested in the delay that takes place between the sensor detecting an event and eventually timestamping this data in ROS. The image shows the different processing for each sensor that takes place before timestamping.
When searching for this specific problem not much could be found, so any advice would be appreciated.
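No answer was recorded for this question. As a partial diagnostic, one quantity that can be measured purely in software is the offset between each message's header stamp and its arrival time at a monitoring node; this does not capture the sensor-internal delay before stamping (that usually needs a hardware trigger or a common external event observed by all sensors), but it does expose relative skew between sensors. A rough ROS 1 (rospy) sketch, with the topic names and message types as placeholders:

```python
#!/usr/bin/env python
# Rough ROS 1 sketch: log the difference between header.stamp and arrival time
# for several sensor topics. Topic names and message types are placeholders.
import rospy
from sensor_msgs.msg import Imu, LaserScan

def make_callback(name):
    def callback(msg):
        delay = (rospy.Time.now() - msg.header.stamp).to_sec()
        rospy.loginfo("%s: stamp-to-arrival delay %.4f s", name, delay)
    return callback

rospy.init_node("stamp_delay_monitor")
rospy.Subscriber("/imu/data", Imu, make_callback("imu"))
rospy.Subscriber("/scan", LaserScan, make_callback("lidar"))
rospy.spin()
```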
  • asked a question related to Sensor Fusion
Question
3 answers
What is the position error limit for GPS aided INS sensor fusion for small rockets?
  • asked a question related to Sensor Fusion
Question
5 answers
For GPS-INS fusion, how does one find the rotation matrix from the body frame to the navigation frame, and how do the Coriolis and gravity terms affect the navigation perturbation equations?
Relevant answer
Answer
Dear Varsha:
As Nabil answered, Farrell's Aided Navigation is a good book. Other good books are:
Biezad, D. J., Integrated Navigation and Guidance Systems, AIAA Education Series, 1999
Rogers, R. M., Applied Mathematics in Integrated Navigation Systems, 3rd ed., AIAA Education Series, 2007
Grewal, Weill and Andrews, Global Positioning Systems, Inertial Navigation, and Integration, 3rd ed., John Wiley, 2013
Farrell, James L., GNSS Aided Navigation and Tracking, American Literary Press, 2007
Farrell and Barth, The Global Positioning System and Inertial Navigation, McGraw-Hill, 1999
For further correspondence with me, you may write to me at IIT Indore.
Also, you may see Sreeja, Professor, in your college, in Electrical Engineering.
I hope all this helps.
Hari Hablani
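As a small complement to the reading list: a common convention for the body-to-navigation direction cosine matrix is the ZYX (yaw-pitch-roll) Euler sequence, C_b^n = Rz(yaw) Ry(pitch) Rx(roll); in the velocity perturbation equations, gravity enters through the gravity model error and Coriolis through the 2*Omega x v term, which the books above derive in full. A minimal sketch of the DCM, assuming that ZYX convention:

```python
# Sketch: body-to-navigation direction cosine matrix from roll, pitch, yaw,
# using the common aerospace ZYX (yaw-pitch-roll) convention.
import numpy as np

def body_to_nav(roll, pitch, yaw):
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx   # C_b^n: maps body-frame vectors into the navigation frame

# Example: a specific-force vector measured in the body frame, expressed in the navigation frame
f_nav = body_to_nav(0.01, -0.02, 1.2) @ np.array([0.1, 0.0, -9.81])
print(f_nav)
```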
  • asked a question related to Sensor Fusion
Question
7 answers
I have very limited knowledge of control systems. My research field is now quadcopters, and I want a comprehensive book that can fill my knowledge gaps on quadcopter sensor fusion techniques as well as modeling, system identification, and control methods for a quadcopter.
Relevant answer
Answer
Multirotor Aerial Vehicles: Modeling, Estimation, and Control of Quadrotor
Survey of Advances in Guidance, Navigation, and Control of Unmanned Rotorcraft Systems
  • asked a question related to Sensor Fusion
Question
1 answer
Are there any clear resources on inertial measurement unit (IMU) sensor fusion? How do I calculate rotation and position with an Arduino, in Euler angles and quaternions? I have read some papers and googled these terms but still have not found a straightforward explanation with Arduino examples.
Relevant answer
Answer
Maybe the algorithms in this Excel sheet will be useful for you: http://jerome.jouvie.free.fr/opengl-tutorials/Quaternion.xls. It seems easy to work with quaternions to track orientation and then obtain Euler angles, which have an anatomical meaning (in case you are using your application for biomechanics).
Regards
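In addition to the spreadsheet, here is a minimal sketch of two of the conversions the question asks about: tilt (roll/pitch) from a static accelerometer reading, and an Euler-to-quaternion conversion (ZYX order). It is written in Python for readability, but the same arithmetic ports directly to Arduino C++; note that yaw is not observable from the accelerometer alone (it needs a magnetometer or gyro integration), and the sign conventions depend on the sensor's axis orientation.

```python
# Sketch: roll/pitch from a static accelerometer reading, and Euler -> quaternion (ZYX order).
# ax, ay, az are accelerometer axes in any consistent unit (e.g. g or m/s^2).
import math

def accel_to_roll_pitch(ax, ay, az):
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

def euler_to_quaternion(roll, pitch, yaw):
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    w = cy * cp * cr + sy * sp * sr
    x = cy * cp * sr - sy * sp * cr
    y = cy * sp * cr + sy * cp * sr
    z = sy * cp * cr - cy * sp * sr
    return w, x, y, z

roll, pitch = accel_to_roll_pitch(0.02, 0.10, 0.99)   # placeholder reading, device nearly level
print(euler_to_quaternion(roll, pitch, 0.0))
```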
  • asked a question related to Sensor Fusion
Question
5 answers
In our studies we are using different sensors.
We would like to hear from researchers and industry experts about sensor data dashboard, storage platforms and analysis tools.
What are the technologies and tools used to store and visualize real-time sensor data? The answers can come from different application areas, from automotive and manufacturing to health domains. What are your recommendations for sensor data capture?
Relevant answer
Answer
Thanks for the answers and comments, Jerimiah Westly Dickey, C K Gomathy, Ankush Rai, Albrecht Michler.
  • asked a question related to Sensor Fusion
Question
2 answers
I'm facing the problem of model calibration for a large-scale manipulator with a half-spherical workspace of around 40 m diameter.
The typical approach would be computer vision, which is difficult in this case because: i) the large scale means a huge distance to the cameras; ii) it is only possible outdoors, where sunlight might be a problem; iii) calibration of 5-6 cameras takes some time, and the measurement should be finished in one day.
So my ideas are the following: i) use triangulation of radio-frequency signals (e.g. RF position sensors); ii) use extra IMUs; iii) develop a sensor fusion scheme, such as a Kalman filter.
I want to ask you for alternative approaches, expected accuracy of RF position sensors without sensor fusion, problems with RF reflections (the manipulator is constructed from steel) or any other input.
thanks in advance!
  • asked a question related to Sensor Fusion
Question
7 answers
Hello, I want to get the linear and angular velocity of a vehicle based on IMU and GPS data. All the examples I have seen on the internet so far do sensor fusion with a Kalman filter to get the position, not the velocity. We can also get the transformation matrices between two sets of points from the roll, pitch, and yaw angles, but I wonder if I can get the linear velocity by taking the position at t0 as [0,0,0] and then dividing the distance between two points by the time difference. The real problem here is the linear velocity, because the angular velocity is already provided by the gyro. Any help will be appreciated.
Relevant answer
Answer
Sensor fusion - GPS+ IMU by Isaac Skog
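As a complement to Skog's material: the usual trick is to keep velocity in the Kalman state and let the filter estimate it from the GPS positions, rather than differencing positions by hand. A minimal per-axis constant-velocity sketch (not the full GPS/IMU filter), with the noise values as placeholder assumptions:

```python
# Minimal 1-axis constant-velocity Kalman filter: measures position only,
# estimates [position, velocity]. Noise values below are placeholder assumptions.
import numpy as np

dt = 0.1                                   # GPS sample period [s]
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition
H = np.array([[1.0, 0.0]])                 # we only measure position
Q = np.diag([0.01, 0.1])                   # process noise (tuning parameter)
R = np.array([[4.0]])                      # GPS position variance, e.g. (2 m)^2

x = np.zeros((2, 1))                       # [position, velocity]
P = np.eye(2) * 100.0

def step(x, P, z):
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update with the position measurement z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([[z]]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in [0.0, 0.5, 1.1, 1.4, 2.1]:        # fake GPS positions along one axis
    x, P = step(x, P, z)
print("estimated velocity:", x[1, 0])
```

The same filter is simply run per axis (or extended to a 2D/3D state) to get the full velocity vector.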
  • asked a question related to Sensor Fusion
Question
6 answers
Hi everyone.
There's a MEMS sensor (an IMU) giving tri-axial angular velocity and linear acceleration. I want to measure the total kinetic energy, which has two parts, translational and rotational: K = 0.5*m*V^2 + 0.5*I*w^2. I have the omegas (w), so I can compute the rotational part, but I need to estimate the velocities in the Eulerian (inertial) frame to get the translational part of the kinetic energy. I think I need to convert the acceleration values from the body (Lagrangian) frame to the Eulerian frame, and then estimate the Eulerian velocity with a filter (or an integration).
I would appreciate it if anyone could help me with a route to estimating the (Eulerian) velocity to obtain the translational part of the kinetic energy.
Thanks
Relevant answer
Answer
Dear George Dishman, thanks for your consideration.
But this lecture couldn't help, since it's about introducing the Lagrangian and Eulerian derivatives. The thing is, I need to change the reference frame from the body frame to the inertial frame. There are some methods, like quaternions, Euler angles, ...
But I need an expert view on whether any filters are needed and how to handle drift, noise, and so on.
thanks
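For reference, a rough sketch of the pipeline discussed above, under assumptions that would need checking against the actual device: the sensor's orientation is available as a quaternion (e.g. from an on-board fusion filter), the accelerometer output is rotated into the inertial frame, gravity is subtracted, and the result is integrated once to get the translational velocity. No drift correction is applied, so this is only usable over short windows.

```python
# Rough sketch: body-frame acceleration -> inertial-frame (Eulerian) velocity.
# Assumes an orientation quaternion per sample and accelerometer data in m/s^2.
import numpy as np
from scipy.spatial.transform import Rotation as R

g = np.array([0.0, 0.0, 9.81])     # gravity in the inertial frame (z up assumed)

def eulerian_velocity(quats, accel_body, dt):
    """quats: (N,4) quaternions [x, y, z, w]; accel_body: (N,3); dt: sample period."""
    v = np.zeros(3)
    velocities = []
    for q, a_b in zip(quats, accel_body):
        a_inertial = R.from_quat(q).apply(a_b) - g   # rotate to inertial frame, remove gravity
        v = v + a_inertial * dt                      # single integration (drifts over time)
        velocities.append(v.copy())
    return np.array(velocities)

quats = np.tile([0.0, 0.0, 0.0, 1.0], (100, 1))        # identity orientation, placeholder
accel = np.tile([0.0, 0.0, 9.81 + 0.1], (100, 1))      # small net upward acceleration, placeholder
print(eulerian_velocity(quats, accel, dt=0.01)[-1])    # ~[0, 0, 0.1] m/s after 1 s

# kinetic energy per sample: 0.5*m*|v|^2 + 0.5*w^T I w (rotational part comes from the gyro)
```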
  • asked a question related to Sensor Fusion
Question
3 answers
What are the latest developments in measuring physical exertion, in terms of multimodal sensor fusion, algorithms, wearables, and applications?
Relevant answer
Answer
Heart rate is also heavily explored.
I am hoping to look at multimodal, multichannel research attempts.
  • asked a question related to Sensor Fusion
Question
9 answers
Some of the literature I’ve read describes the Kalman Filter as an excellent sensor fusion or data fusion algorithm. Because of this, I wonder (1) if it’s possible to implement a Kalman Filter with a single sensor’s data--every implementation I've read uses more than one sensor--and (2) if possible, is Kalman Filtering truly an optimal algorithm for this system?
Specifically, my sensor measures just position, p, at designated time intervals. I then use the distance traveled divided by the time interval to calculate velocity, v, at the next time step. (Velocity at first timestep is assumed to be zero.) The state, x, is defined in terms of p and v.
However, given that I merely did manipulation with the first data set, I’m not convinced this is "data fusion" when all data is derived from the same source. Nor am I sure if the KF could do much more than subdue measurement noise in this case.
Relevant answer
Answer
The human sensory system - sight, hearing, smell, taste, touch - is a paradigm for data fusion. In the strict sense, for fusion you need at least two sensors for essentially different signals.
Regarding the number of sensors needed for the Kalman filter, you should first have the models (system and observation models), taking into account their controllability and observability. This reference tackles the point:
Controllability and Observability: Tools for Kalman Filter Design,
B. Southall, B. F. Buxton and J. A. Marchant.
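To make the observability point concrete: with a constant-velocity model, a filter that measures only position still estimates velocity, because velocity is observable through the model; there is no need to difference positions by hand, and only one sensor is involved. A minimal check, as a sketch:

```python
# Sketch: checking observability for a position-only Kalman filter with a
# constant-velocity model. Full rank means velocity is recoverable from
# position measurements alone, so a single sensor is enough.
import numpy as np

dt = 0.1
F = np.array([[1.0, dt],
              [0.0, 1.0]])        # state: [position, velocity]
H = np.array([[1.0, 0.0]])        # the sensor measures position only

O = np.vstack([H, H @ F])         # observability matrix [H; HF] for a 2-state system
print("observability rank:", np.linalg.matrix_rank(O))   # -> 2 (fully observable)
```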
  • asked a question related to Sensor Fusion
Question
2 answers
Commercial, industrial, and military vehicles are required to safely navigate predetermined paths in degraded visual environments (e.g., fog, snow, heavy rain, dust, etc.). Various techniques based on sensor fusion (e.g., CCD cameras, RADAR, laser sensors, ultrasonic sensors, etc.), machine learning/AI, human-aided operations, and others are applied.
Relevant answer
Answer
To your point, and as I think of autonomous versus remote-controlled operation, the differentiation in my mind is having a human driver-in-the-loop. Remote control provides access to "in situ" data and an observer to command maneuvers. However, the controls and actuation for an on-ramp entry maneuver into a congested freeway, for example, are quite challenging and require a driver-in-the-loop for safe driving operations. As the environment or weather conditions could be classified as degraded, the aforementioned research problem continues to exist. I agree that range and range-rate information for targets in the field of view is reported by RADAR; however, please expand on the sensors necessary for capturing and reporting image data and for providing insight into a degraded visual environment.
  • asked a question related to Sensor Fusion
Question
4 answers
Hello,
I am fusing a certain sensor into an Unscented Kalman Filter (UKF).
This sensor reports a certain measurement including an SNR metric for each measurement.
I am wondering if it is a good idea to keep changing R for this sensor based on the SNR values. However, this would mean very frequent updates of R; or I could adapt R at a lower frequency, like 1 Hz or slower.
Or should R rather be kept constant, with the sensor's SNR metric simply used to decide whether to keep or throw out certain measurements?
Thanks for the advice.
Robert
Relevant answer
Answer
Dear Robert,
I suggest you to see links and attached files in topics.
- The Joint Adaptive Kalman Filter (JAKF) for Vehicle Motion State ...
- New Insights Into the Noise Reduction Wiener Filter
ftp://ftp.esat.kuleuven.be/sista/doclo/reports/04-239.pdf
- Measuring Indoor Mobile Wireless Link Quality
- Observer Design and Model Augmentation for Bias ... - DiVA
Best regards
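One lightweight way to combine both ideas from the question is to gate obviously bad measurements on SNR and, for the rest, recompute the measurement variance from the reported SNR just before each update; the UKF itself is unchanged, only R varies. The exact SNR-to-variance mapping depends on how the sensor defines SNR, so the dB-based conversion below is an assumption to be checked against the datasheet, and the constants are placeholders.

```python
# Sketch: per-measurement adaptation of the measurement noise R from a reported SNR.
# Assumes the sensor reports SNR in dB and that noise variance scales as
# signal_power / SNR_linear; both assumptions must be checked for the real sensor.
import numpy as np

SIGNAL_POWER = 1.0      # nominal signal power in measurement units^2 (placeholder)
SNR_GATE_DB = 3.0       # discard measurements below this SNR (placeholder)

def measurement_variance(snr_db):
    snr_linear = 10.0 ** (snr_db / 10.0)
    return SIGNAL_POWER / snr_linear

def maybe_update(filt, z, snr_db):
    """filt: any filter object exposing an update(z) method and an R attribute."""
    if snr_db < SNR_GATE_DB:
        return False                          # gate: too noisy, skip this measurement
    filt.R = np.array([[measurement_variance(snr_db)]])
    filt.update(z)
    return True

class _FakeFilter:                            # stand-in for a UKF, just for demonstration
    def __init__(self): self.R = None
    def update(self, z): print("update with z =", z, "R =", self.R)

maybe_update(_FakeFilter(), z=0.42, snr_db=12.0)
```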
  • asked a question related to Sensor Fusion
Question
3 answers
I want to interface a hyperspectral camera with a USB interface, and a CMOS camera (Sony Alpha 9), which comes with HDMI and USB interfaces, with an FPGA. I know the implementation of USB 3.0 on an FPGA is difficult. Is there any alternative solution? My constraint is that I have to extract data simultaneously from both cameras. My project is related to sensor fusion, so timing is important. The whole setup is going to be mounted on a UAV.
Relevant answer
Answer
Dear Muhammad,
I suggest you try the FMC connector (FPGA Mezzanine Card):
If you aim to use USB 3.0:
Best regards,
Mokhtar
  • asked a question related to Sensor Fusion
Question
4 answers
Currently I am working on smart greenhouse development, and I want to know the combination of sensors (temperature, humidity, camera, CO2 detector, pH, etc.) that can give me optimal results.
Relevant answer
Answer
To improve crop management, a number of sensors and instruments can (and should) be used to gather information in the greenhouse. Medium and high technology greenhouses make use of a range of sensors which link into automated control systems. These systems can monitor temperature, relative humidity, vapor pressure deficit, light intensity, electrical conductivity (feed and drain), pH (feed and drain), carbon dioxide concentrations, wind speed and direction and even whether or not it is raining. 
  • asked a question related to Sensor Fusion
Question
3 answers
Is it the same or different for all three? If it differs, how does one calculate the energy dissipation for unicast, multicast, and broadcast at the same distance?
Scenario: a cluster-based wireless sensor network.
Relevant answer
Answer
You should first find out if your sensor PHY or MAC layers even differentiate between unicast, multicast, and broadcast.
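If the radio does treat them differently, a common starting point in cluster-based WSN studies is the first-order radio model used in LEACH-style analyses: transmitting k bits over distance d costs E_elec*k + eps_amp*k*d^2 (d^4 beyond a crossover distance), and every node that receives the packet pays E_elec*k. At the same distance the transmit cost is then identical for unicast, multicast, and broadcast; the difference comes from how many receivers spend reception energy. A sketch with typical textbook placeholder constants:

```python
# Sketch: first-order radio energy model, as commonly used in LEACH-style WSN analyses.
# Constants are typical textbook placeholders; real radios differ.
E_ELEC = 50e-9       # J/bit spent by transmitter or receiver electronics
EPS_AMP = 100e-12    # J/bit/m^2 free-space amplifier energy

def tx_energy(k_bits, d_m):
    return E_ELEC * k_bits + EPS_AMP * k_bits * d_m ** 2

def rx_energy(k_bits):
    return E_ELEC * k_bits

k, d = 4000, 50                    # one 4000-bit packet over 50 m
unicast   = tx_energy(k, d) + 1 * rx_energy(k)    # one intended receiver
multicast = tx_energy(k, d) + 5 * rx_energy(k)    # e.g. 5 group members in range
broadcast = tx_energy(k, d) + 20 * rx_energy(k)   # e.g. 20 neighbours wake up and receive
print(unicast, multicast, broadcast)
```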
  • asked a question related to Sensor Fusion
Question
3 answers
In the case of a crowdsourcing app for monitoring ambient sound, for example, can we rely on one smartphone, or are many required?
Thanks
Relevant answer
Answer
Thank you so much
  • asked a question related to Sensor Fusion
Question
4 answers
What tools can be used to make new sensor hardware and software for an underground wireless sensor network?
Relevant answer
Answer
Optimal placement of sensor nodes.
  • asked a question related to Sensor Fusion
Question
5 answers
I want to know what techniques are currently used in highway traffic monitoring (RFID, etc.).
Please suggest available publications or any other sources.
  • asked a question related to Sensor Fusion
Question
3 answers
fuzzy
Relevant answer
Answer
Hi Dr. Sandeep,
Assuming that your conventional PID-tuned controller has done a good job in regulating the speed of the SVPWM inverter-fed induction motor drive, you can generally do better with a fuzzy PID controller by introducing some "good nonlinearities" into the system via amendment of the IF-THEN rules and shaping of the membership functions, based on the expert knowledge of a seasoned engineer in power electronics and motor drives.
  • asked a question related to Sensor Fusion
Question
10 answers
I am trying to find the IoT term definition for my research. It seems that this is a good opportunity to ask this question once more. In most cases I know of, the term IoT/IIoT can be replaced by SCADA (Supervisory Control and Data Acquisition) or ICT (Information and Communication Technology) and the text will still be perfectly OK.
Do you think a box (or even a pack) of cigarettes could be the "thing"? It has a barcode, so it is a source of data. Is it the sensor? No, because in this case the barcode reader (industrial scanner) is the "sensor". Can we recognize the barcode reader as the "thing"? The answer is no if the goal is to provide a GLOBAL cigarette tracking system. The same applies to drugs, for example. Is it an IoT/IIoT solution? My answer is YES, no doubt - it is vital for selected industries.
Is the "thing" smart? I don't think we can call the barcode something smart. The most interesting observation is that we can recognize this case as an IoT solution, but we have not mentioned the Internet, wireless, etc. at all - only that we have important mobile data and the solution is globally scoped.
Let's now replace the word GLOBAL with LOCAL (for example, a farm of cash desks in a shop) and the same application is no longer an IoT deployment, isn't it? This is true even if the cash desks are interconnected using the IP protocol!
My point is that a good term definition is important to work together on: common rules, architecture, solutions, requirements, capabilities, limitations, etc. The keyword in the previous sentence is COMMON. The importance of the sensor and the data robustness requirement could apply to many applications, e.g. controlling an airplane engine during flight. The same engine could be monitored and tracked after landing in any airport, using local WiFi to upload archival data to a central advanced analytics system. Is it IIoT? During the flight it isn't, but the solution is life-critical. After landing it is IIoT, but the reliability of the data and data transfer is not so important, is it?
My concern is that your definition provides a pretty good description of the Universe, but working on engineering standards is like carving in stone - it is a one-way ticket. To buy a one-way ticket you must be sure where you are going.
To be constructive my proposal for the definition is as follows:
Try it against the above example.
In the above proposal the open question is: what is "mobile data"? But I believe that the definition is much closer to the final expectation. To answer this question I propose this approach: Data is Data, It Doesn't Matter Where It Comes From!
For implementation of this concept we can use Object Oriented Internet paradigms covered by the:
The only missing thing is how to use these building blocks to make a consistent IoT puzzle (deployment domain). In this case a sponsor is needed to scope the outcome globally.
I believe that finally this way we will get good starting point for the further standardization.
Let me know how this scenario works for you.
Relevant answer
Answer
I have a definition according to ITU-T in
IoT –  Internet of Things (Page 4)
Best regards
Anatol Badach
  • asked a question related to Sensor Fusion
Question
6 answers
I am working on a system that needs to detect road accidents to initiate its processing. I am trying to monitor a human being in particular, instead of any vehicle in which he might be traveling. The system is designed in such a way that it can be helpful even to a pedestrian. I have already considered a complete reversal of acceleration as a prime indicator, and I need other factors that I can use to increase the system's accuracy and reduce false alarms. Since the system needs to be vehicle-independent, it can't use vehicle-mounted sensors such as those in airbag systems.
Relevant answer
Answer
Basant,
It depends on what it is that is having the accident.
Breaking glass? Acoustic detection.
Broken water pipe? Presence of water in the vicinity.
Broken communication cable? Drop in signal.
etc.
If you can tell us the context, we'll be able to deliver better answers.
  • asked a question related to Sensor Fusion
Question
2 answers
I'm making a real-time motion analysis tool by putting IMUs on an Arduino Due. How can I set the sensors to 0 using quaternions (calibration)? Thanks.
There are 4 sensors using the serial port. The quaternions are transmitted in real time to MATLAB over the serial port,
so there are 16 values in total for each sample.
I would like to take 5 seconds of data as a baseline and then use the mean of those 5 s as the zero reference.
Relevant answer
Answer
Thanks.
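Since the only reply was a thank-you, here is a rough sketch of the baseline idea described in the question: average the quaternions over the first 5 s (a normalized arithmetic mean is adequate when the device is held still) and then express every later quaternion relative to that baseline, so the calibrated output starts at the identity (zero rotation). The [w, x, y, z] ordering and the sample values are assumptions to match against the actual sensor output.

```python
# Sketch: zero an IMU orientation stream against a 5-second baseline.
# Quaternions are assumed to be [w, x, y, z]; adjust if the sensor uses [x, y, z, w].
import numpy as np

def quat_conj(q):
    return np.array([q[0], -q[1], -q[2], -q[3]])

def quat_mult(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def baseline(quats):
    """Normalized mean of nearly identical quaternions (device held still)."""
    q = np.asarray(quats).mean(axis=0)
    return q / np.linalg.norm(q)

def calibrate(q, q0):
    """Rotation of q relative to the baseline q0: q_rel = conj(q0) * q."""
    return quat_mult(quat_conj(q0), q)

first_five_seconds = [[1.0, 0.0, 0.0, 0.0],          # placeholder baseline samples
                      [0.999, 0.01, 0.0, 0.02]]
q0 = baseline(first_five_seconds)
q_zeroed = calibrate(np.array([0.98, 0.05, 0.10, 0.15]), q0)
print(q_zeroed)    # identity quaternion whenever the sensor is at the baseline pose
```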
  • asked a question related to Sensor Fusion
Question
2 answers
I am trying to perform track-to-track fusion in an asynchronous case. I have great results for mono-radar tracks (using IMM (CV, CT, Singer)). But when I start to fuse tracks, I get something awful. I do not take cross-covariances into consideration at all. I know this is wrong, but those calculations need plots from the radars, and I have an asynchronous case (the periods are different and the radars aren't synchronized). How can I do this simply and efficiently? Does the cross-covariance effect really have THAT huge an impact on the results?
Relevant answer
Answer
https://confcats_isif.s3.amazonaws.com/web-files/journals/entries/308_0_art_11_27647.pdf - article on Asynchronous Track-To-Track Fusion, though I have not used this article.
What is the difference between the periods of the two radars? Does predicting one track from its sampling time to the other's, to get to a common sampling time, help?
Have you solved the association problem?
For synchronous tracks, I've got good results using the association and fusion equations in article "Multisensor Track-to-Track Association for Tracks with Dependent Errors" - Huimin Chen and Bar-Shalom.
Look for prof. Bar-Shalom articles in the domain, he has some great ones.
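One more option worth mentioning: if computing the cross-covariances is impractical in the asynchronous setting, covariance intersection fuses two track estimates while remaining consistent under unknown correlation, at the price of some conservatism, once both tracks have been predicted to a common time. A minimal sketch with placeholder values:

```python
# Sketch: covariance intersection of two track estimates (x1, P1) and (x2, P2)
# that may be correlated in an unknown way. Both tracks are assumed to have been
# predicted to the same time before fusion.
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=50):
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):            # crude 1-D search over the weight
        Pf = np.linalg.inv(w * P1i + (1.0 - w) * P2i)
        if best is None or np.trace(Pf) < np.trace(best[1]):
            xf = Pf @ (w * P1i @ x1 + (1.0 - w) * P2i @ x2)
            best = (xf, Pf)
    return best

x1, P1 = np.array([10.0, 1.0]), np.diag([4.0, 1.0])     # e.g. [position, velocity] from radar 1
x2, P2 = np.array([10.5, 0.8]), np.diag([2.0, 2.0])     # same target from radar 2
x_fused, P_fused = covariance_intersection(x1, P1, x2, P2)
print(x_fused)
```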
  • asked a question related to Sensor Fusion
Question
3 answers
Hello!
I am working on a multi-platform, multi-target tracking problem. I am using the Interacting Multiple Model (IMM) filter [CV + CT + Singer]. My problem is asynchronous (different plot times for every platform). Mono-platform processing is fine; I am satisfied with the results. But when I try to fuse my measurements, the quality is unacceptable.
I want to do something like this: extrapolate all local (mono-platform) tracks to one time and fuse them there. Calculating the cross-covariances is quite involved and resource-consuming in this case, so I've chosen the method described in Zhen Ding, Lang Hong, "A distributed IMM fusion algorithm for multi-platform tracking" (1998). It suggests using a "global" filter and fusing the local tracks using both the local and "global" filter results. (It performs the same procedure as Bar-Shalom, Multitarget-Multisensor Tracking: Principles and Techniques, p. 464 ("Optimal fusion using central estimator"), but for an IMM filter, not a KF.)
My problem is: I want to extend this article's technique to the asynchronous case. I don't understand how to get the predicted and filtered values for the local tracks (because I only have the predicted value to bring all local tracks to the same time). Any suggestions?
The second problem is less complicated: I just don't know what initial value to set for the "global" filter.
Relevant answer
Answer
Thanks for the answer!
I didn't find multisensor fusion without cross-covariances in this article, though. I'm not sure it is possible to combine techniques from this article and Zhen Ding, Lang Hong, "A distributed IMM fusion algorithm for multi-platform tracking" (1998) to deal with asynchronous IMM.
  • asked a question related to Sensor Fusion
Question
5 answers
Hello Guys,
I know there are some interesting applications that can make use of the embedded accelerometers in smartphones.
Besides low measurement accuracy and high weight, I guess the important drawback of these applications is their low sampling frequency. I mean, fast vibrations are completely invisible to the applications that I have used so far. Therefore, I just wanted to ask if anyone can introduce an application with a high sampling frequency, and in general, I wonder what is the highest sampling frequency we can expect from these devices?
Thank you in Advance,
Relevant answer
Answer
this article might be helpful.
  • asked a question related to Sensor Fusion
Question
2 answers
Let's say I have a sensor, e.g. a digital encoder, which has a sensitivity s, i.e. the measured value falls in an interval of +/- s/2 around the true value.
With no other assumption, is it reasonable to model the noise as zero mean Gaussian?
If yes, what variance should it have?
Does a rule of thumb exist, e.g. 3 sigma = s/2 -> sigma = s/6?
Relevant answer
Answer
Marco, if the measured value falls in an interval of +/- s/2 around the true value, then s is the maximum error, not the sensitivity. You may say that 3 sigma = s/2 and therefore sigma = s/6. If you are sure there is no bias, then you may represent the sensor error as zero-mean Gaussian with the aforementioned sigma.
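A small numerical footnote: if the error is instead modeled as uniform over +/- s/2 (pure quantization), the exact standard deviation is s/sqrt(12) ~ 0.29*s, while the 3-sigma rule of thumb gives s/6 ~ 0.17*s, so the rule of thumb is somewhat optimistic for a purely quantized measurement.

```python
# Quick comparison of the two sigma estimates for an error bounded by +/- s/2.
import math

s = 1.0
sigma_uniform = s / math.sqrt(12)   # exact std. dev. of a uniform error on [-s/2, +s/2]
sigma_3sigma  = s / 6.0             # "3 sigma = s/2" rule of thumb
print(sigma_uniform, sigma_3sigma)  # ~0.289 vs ~0.167
```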
  • asked a question related to Sensor Fusion
Question
1 answer
SensorML is a markup language developed for open geographical information systems. There are Java libraries, XML Schema definitions for validation, and OWL support for semantic reasoning. When I did a quick browse through it, it seemed as if it is possible to
  • define self-describing sensors
  • define processing (from source to recipient)
  • and much more
To me, this standard seems to be a real contender for interoperability issues in ubiquitous computing, ambient computing, and the Internet of Things. So, have you applied it? What is your experience? What are the advantages and disadvantages? Concerning interoperability, are there any real competitors?
  • asked a question related to Sensor Fusion
Question
11 answers
Hello!
In this question I mean that in my concrete case the data arrives not in batches but as a sequence of measurements. My measurements are obtained in real time, so I have to act on every point as it is obtained. And it is hard for me to apply IMM/PDA correctly because:
1) It is a multi-platform case.
2) There is clutter. PDA can easily deal with it when given a batch of measurements over a time window T, by calculating the association probabilities, but if I have received only 1 point I can't perform PDA well (for example, this point can be a clutter point).
What is the way to deal with this? Summarising: I understand IMM/PDA reasonably well, but I can't apply it to multi-platform sequential data. How can I deal with it?
Relevant answer
Answer
Do you mean that measurements may arrive out of sequence? So that newer data may arrive at your tracker before older data?
Rather than trying to wrangle with the maths (using prediction and 'retro-diction') could you just push all of your measurements through a buffer? Sort them as they pass through. Then apply the IMM/PDA algorithm on the sorted measurements at the other end? This would apply a lag/latency to your system (determined by the maximum expected transmission delay in your network), but if that is tolerable it might be a simple way of dealing with your problem.
  • asked a question related to Sensor Fusion
Question
3 answers
I am interested in some literature where I could find information related to state-of-the-art algorithms used for the fusion of multiple sensor data available specifically on the avionics bus of a (combat) aircraft.
Relevant answer
Answer
First, it should be known how many sensors there are, which kinds, and what their data types are.
Second, what information needs to be fused, and what is the purpose of the fusion?
Third, research the available fusion methods.
Last, simulation and experiment.
  • asked a question related to Sensor Fusion
Question
9 answers
The problem is that I can't use the same procedure as in distributed systems. In a distributed system one can just extrapolate the tracks (all tracks that are available at the moment) to the chosen time using a KF. I can't use a KF in the centralised case, because I don't have a velocity. What technique should I use instead? Another problem: how can I solve the following data association problem? For example, I have 3 radars and I have received 2 plots (radar marks). How do I determine whether this object is being followed by 2 radars (in which case I can begin a correlation procedure), or whether the third plot will be received soon and I should wait for it? Could you recommend any articles with a detailed explanation of centralised fusion?
Relevant answer
Answer
In many applications which are not time-critical, there is no need to synchronize time exactly. For example, to localize a static target or a slow-moving target, it is not very necessary to synchronize all sensors in real-time.
For time synchronization in time-critical cases, one useful way in practice is to add a time stamp to each group of data. Each sensor maintains its own clock and all data sent by the sensors are shipped with time stamps. Initially all sensors should be synchronized, and in later runs the sensors can be regularly re-synchronized with a long period, which may rely on the accuracy of the crystal oscillator. Alternatively, one can encode the time so that data from different time instants will not match according to their encoded time stamp. In this way, an exact time stamp may not be needed.
  • asked a question related to Sensor Fusion
Question
3 answers
Greetings,
I am currently working on an application which is aimed at measuring and storing maximum rotation speeds of the device attached to an object (in rotation).
Unfortunately I think I have finally come across a problem: my software recalculates the gyro values (angular velocity) into rotational speed (in revolutions per minute), but my Samsung Note II reaches only 167-168 rev/min.
Can somebody advise where I can find the maximum value that such a "budget" gyro is able to measure?
Do you know of any method to extend that value?
Kind regards,
Mariusz.
Relevant answer
Answer
I finally managed to sum up the system we have created for measuring the rotational speed of an object. Some initial assumptions:
1. The system must be mobile - in order to achieve this we combined our proprietary sensor with a smartphone.
2. The application is used for recording and evaluating received values of the rotational speed.
3. The application selects the maximum value from a 2-second time window, which is used for data aggregation.
Unfortunately we have not been able to produce proper results with the built-in Android sensors. We needed to develop a dedicated sensor solution, which has then been integrated with a smartphone through Bluetooth.
 I will attach some dev specs to show how.
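A short numerical footnote on the saturation value reported in the question: 167-168 rev/min corresponds to roughly 1000 deg/s, which matches the +/-1000 deg/s full-scale setting available on many smartphone MEMS gyros, so the phone was most likely simply hitting its configured full-scale range.

```python
# Convert the observed saturation value to angular rate: 1 rev/min = 6 deg/s.
rpm = 168
deg_per_s = rpm * 360.0 / 60.0
print(deg_per_s)   # ~1008 deg/s, close to a common +/-1000 deg/s gyro full-scale setting
```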
  • asked a question related to Sensor Fusion
Question
10 answers
Can we determine the direction of a robot by fusing accelerometer and gyroscope outputs? I would like to know if the robot is moving forward or in reverse on a normal road, or even a sloped road, without input from the driving motors/wheels.
Relevant answer
Answer
Yes, you can. You can use a complementary filter to integrate the gyro and accelerometer data to get the true angle about any axis. I have implemented this on an MPU6050 chip and it gives very accurate results.
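For reference, a minimal complementary-filter sketch along the lines of the answer above: the gyro is integrated for the short-term angle and the accelerometer tilt slowly corrects the drift. The 0.98/0.02 weights, the sample period, and the axis layout are placeholder assumptions.

```python
# Minimal complementary filter for pitch: gyro integration corrected by accelerometer tilt.
# alpha and dt are placeholder tuning values; axes follow a typical MPU6050-style layout.
import math

alpha = 0.98     # trust in the integrated gyro over one step
dt = 0.01        # sample period [s]
pitch = 0.0

def update(pitch, gyro_y_dps, ax, ay, az):
    pitch_acc = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    return alpha * (pitch + gyro_y_dps * dt) + (1.0 - alpha) * pitch_acc

# a robot pitching nose-up on a slope would show a persistent positive pitch here
pitch = update(pitch, gyro_y_dps=0.5, ax=-0.17, ay=0.0, az=0.98)
print(pitch)
```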
  • asked a question related to Sensor Fusion
Question
3 answers
Which methods can I use for information fusion problems? Is the transferable belief model a method for IF, and what else is there? Do you refer to sensor fusion? What kinds of problems are there to solve in sensor fusion applications?
Relevant answer
Answer
Hi Fatma,
First, I assume what you meant by information fusion is the image-fusion field since sensor fusion is a concern. Second, I am not sure of the use of TBM in conjunction with IF.  However, you may wish to take a look at our published article in Optics Express journal regarding sensor fusion (i.e., an optical projection tomography scanner). It is an open access article:
"Improving signal detection in emission optical projection tomography via single source multi-exposure image fusion."
In the article I examined the application of IF on acquired scanner 3D projections of mouse organs, more specifically:
- Performance on a digital phantom
- Revealing vascular branching morphology in the liver
- Exposing ureteric branching patterns in the kidney
I also conducted a comparison between the different available fusion methods, namely: PCA (principal component analysis), DWT (discrete wavelet transform), SIDWT (shift-invariant wavelet transform), MP (morphology pyramid), Laplacian, PCNN (pulse-coupled neural network), Average (arithmetic mean), HDRMD (MATLAB direct high dynamic range), HDRMN (MATLAB normalized-intensity high dynamic range), and the proposed IF-OPT fusion method.
Good luck!
  • asked a question related to Sensor Fusion
Question
2 answers
I know there are many simulators for a variety of sensor fusion applications, but can you recommend a simulator for general sensor fusion (if one exists), or one dedicated to body sensors?
Thanks in advance.
Relevant answer
Answer
Thank you very much Marco, absolutely I will read it.
I heard about Castalia, Avrora, SceneGen, Multi-source Report-level Simulator (MRS), FLexible Analysis, Modeling, and Exercise System (FLAMES), vsTasker, SLAMEM, VRForces, Network simulator version two (NS-2), TOSSIM, OMNeT++, J-Sim, QualNet, and some others.
All the above simulators have different purposes, and each is aimed at a very particular application that distinguishes it from the others. I'm taking my first steps in this topic, so I am looking for a simulator that can help me with this purpose. Is there any simulator that you have used in particular and would recommend based on your experience?
  • asked a question related to Sensor Fusion
Question
14 answers
Please see attached text document.
Relevant answer
Answer
@Narasim Ramesh
Arduino with corresponding shield modules could be suitable for initial activities related to acquiring specific knowledge and skills. In the future, if there are specific requirements for computing resources and power consumption, Arduino could be outperformed by other platforms: https://www.researchgate.net/post/Which_are_the_most_widely_used_motes
  • asked a question related to Sensor Fusion
Question
4 answers
I need the linear acceleration along the 3 axes of a smartphone.
I have already used the device orientation to project and subtract gravity from the accelerometer input. With a correct filter it works well.
However, when applying a rotation (without any translation), the output is biased. I believe this results from the tangential (and maybe centripetal) acceleration components of the rotation. Using the gyroscope one can calculate those components.
What do you think about that? How should the combination be done? I am a beginner with Kalman filtering (which sounds good for this), so any help, ideas, or references are very welcome.
Relevant answer
Answer
Hi Elie,
Your problem is known in the literature as the "lever arm" effect and is generally addressed in inertial navigation systems.
If you search on the web you can find different references.
The problem is exactly what you mentioned and is generated by the distance between the centre of rotation and the position of your accelerometers (r).
The "fictitious" acceleration that you are reading is given by
a_f = dw/dt x r + w x (w x r). In this equation you have the angular velocity (w), the accelerometer position with respect to the centre of rotation (r), and the angular acceleration (dw/dt).
Your problem is to determine r. What you could do is simply impose a centre of rotation by attaching your smartphone to whatever mechanical device provides rotation, and invert the previous equation to fit the measurements you get from the accelerometers and gyros. Rotating slowly could be enough to neglect the first term (dw/dt x r) in the equation.
Let me know your opinion.
Just one last thing: the equation is a vector equation.
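A small numerical sketch of the lever-arm term described above, with placeholder values for the angular rate, angular acceleration, and sensor offset:

```python
# Lever-arm ("fictitious") acceleration at an accelerometer offset r from the rotation centre:
# a_f = dw/dt x r + w x (w x r). All values below are placeholders.
import numpy as np

w     = np.array([0.0, 0.0, 1.0])    # angular velocity [rad/s]
w_dot = np.array([0.0, 0.0, 0.1])    # angular acceleration [rad/s^2]
r     = np.array([0.05, 0.02, 0.0])  # accelerometer position w.r.t. the rotation centre [m]

a_lever = np.cross(w_dot, r) + np.cross(w, np.cross(w, r))
print(a_lever)    # subtract this from the accelerometer reading; rotating slowly makes
                  # the first term negligible, as suggested above
```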
  • asked a question related to Sensor Fusion
Question
7 answers
Some years ago I used some Rogowski coils, but I had a problem because the PTFE coating was burned after very few tests. Now the input power is 5 times greater, so the problem will be worse. What kind of coating could I use?
Relevant answer
Answer
Apparently others have used boron nitride .
  • asked a question related to Sensor Fusion
Question
4 answers
I've been trying to cluster time series data received from a bunch of sensors. I have multiple trials and I expect my algorithm to cluster similar trials together. I'm planning to use Euclidean/Manhattan distance and K-means clustering.
Any idea/advice/precaution for clustering numerical data received from Analog/digital sensors? Or a method to gain higher accuracy?
Please give pointwise suggestions, rather than descriptive explanation.
Thank you.
Relevant answer
Answer
@Shivendra: 1) In order to avoid false positives, it would be better to normalize the collected data.
2) I am not clear about the type of application you are focusing on. If possible, try to use PIR sensors; there are PIR sensors that are better than IR proximity sensors and give their output as digital data, so that motion can be detected.
3) As you are trying to do K-means clustering, even small differences in the plots make a lot of difference to the classification.
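A minimal sketch of points 1 and 3 together: standardize each feature before K-means so that no single channel dominates the distance. Feature extraction from the raw time series (e.g. per-trial means, variances, spectral features) is assumed to have been done already; the array below is a placeholder.

```python
# Sketch: normalize per-trial feature vectors, then cluster with K-means.
# X is a placeholder (n_trials x n_features) array of features extracted from the time series.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

X = np.random.rand(30, 8)                      # placeholder feature matrix

X_scaled = StandardScaler().fit_transform(X)   # zero mean, unit variance per feature
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)
print(labels)
```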
  • asked a question related to Sensor Fusion
Question
10 answers
The MEMS accelerometer is mounted on a quadcopter. Because the quadcopter is very small and flies indoors, it is basically impossible to use GPS for sensor fusion. The position information is mainly used as a feedback signal for the autopilot mode.
I haven't tried the Kalman filter or other filtering technologies yet. And because of the thermo-mechanical error, I understand the accuracy couldn't last for a long time. Currently my target is to maintain a relatively accurate position signal for one minute.
Thanks.
Relevant answer
Answer
I agree with the previous comments. Try it. It is always interesting for academic purposes and to definitively understand (hands-on experience) that double integration of the signal will lead to an acceleration of the drift; the platform will end up moving quite rapidly from the set point. If short-duration flight stabilisation is needed, you could consider (with the platform standing on the ground and after thermal stabilisation of the MEMS chip) evaluating the per-axis (or 3D) drift of the position. However, relying only on the accelerometer will not work. This is why almost all MEMS IMUs pack together a 3-axis accelerometer, gyro, and compass and, yes, bind them all together with filters such as the Kalman filter. All this said, go forward and experiment, it is instructive. Oh, and if you still consider only the accelerometer, try to find the 3D centre of rotation of the platform to minimise the lever arm effect. A few words about your choices or experiences here would be appreciated!
Regards.
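The drift behaviour mentioned above is easy to reproduce numerically: double-integrating even small zero-mean accelerometer noise produces a position error that grows without bound (roughly as t^(3/2) for white noise and as t^2 for an uncorrected bias). A quick sketch with placeholder noise and bias values:

```python
# Quick demonstration: double integration of accelerometer noise and bias.
# Position error grows roughly as t^(3/2) for white noise and as t^2 for a constant bias.
import numpy as np

dt, n = 0.01, 6000                      # 60 s at 100 Hz
noise = np.random.normal(0.0, 0.02, n)  # 0.02 m/s^2 noise, placeholder value
bias = 0.01                             # 0.01 m/s^2 uncorrected bias, placeholder value

vel = np.cumsum((noise + bias) * dt)
pos = np.cumsum(vel * dt)
print("position error after 60 s: %.2f m" % pos[-1])
```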
  • asked a question related to Sensor Fusion
Question
11 answers
Every sensor behaves differently in different environments. How do you objectively compare them?
Relevant answer
Answer
Maybe this book chapter can help you:
  • asked a question related to Sensor Fusion
Question
10 answers
If a robot is equipped with multiple sensors e.g. camera, audio, tactile, force/torque etc., how can its position estimation be improved by sensor fusion?
Relevant answer
Answer
There are plenty of filters to deal with your problem.
The main choices are the Kalman filters (Extended Kalman filter, Unscented Kalman filter, etc) and the particle filters.
The first is useful when you have an initial position (or an idea of where it could be) and linear (or nearly linear) measurements.
The second is able to handle not knowing the initial position and having non-linear measurements (map matching, etc.), but requires a significant amount of computation compared to the Kalman filters.
In all cases you will need to model the robot states (position, velocity, orientation, etc), estimate their propagation (probably using the odometry) and update the estimations with any measurement you receive.
I recommend the book "Probabilistic Robotics", from Sebastian Thrun, Wolfram Burgard, Dieter Fox on the subject.
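As a tiny illustration of the second option above, here is a 1-D particle filter sketch for localization along a corridor: particles are propagated with noisy odometry and re-weighted by a range measurement to a wall at a known position. All values are placeholders; a real robot would use its actual motion and measurement models.

```python
# Minimal 1-D particle filter: localize a robot along a line using odometry and
# a noisy range measurement to a wall at a known position. Values are placeholders.
import numpy as np

rng = np.random.default_rng(0)
N = 500
WALL = 10.0                                  # known wall position [m]
particles = rng.uniform(0.0, 10.0, N)        # unknown start: spread particles everywhere
weights = np.full(N, 1.0 / N)

def step(particles, weights, odom, z_range, odom_std=0.05, meas_std=0.2):
    # predict: apply odometry with noise
    particles = particles + odom + rng.normal(0.0, odom_std, particles.size)
    # update: weight each particle by the likelihood of the measured distance to the wall
    expected = WALL - particles
    weights = np.exp(-0.5 * ((z_range - expected) / meas_std) ** 2)
    weights /= weights.sum()
    # resample according to the weights
    idx = rng.choice(particles.size, size=particles.size, p=weights)
    return particles[idx], np.full(particles.size, 1.0 / particles.size)

true_pos = 2.0
for _ in range(20):                          # robot moves 0.1 m per step
    true_pos += 0.1
    z = WALL - true_pos + rng.normal(0.0, 0.2)
    particles, weights = step(particles, weights, odom=0.1, z_range=z)

print("estimated position:", particles.mean(), "true:", true_pos)
```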
  • asked a question related to Sensor Fusion
Question
8 answers
I would like to add a lidar sensor model to my vehicle dynamics algorithm. I can model the lidar time delay by using quantization techniques. To be more realistic, it is essential to add physics through the mathematical equations which govern the physical laws. Can someone recommend some literature, or their own work, which starts from simple lidar equations and moves to more complex examples?
Relevant answer
Answer
Here is some of my work related to LIDAR simulations; I think the basic principles are formulated there. But as far as I can see, you most probably do not need to model all the details, but rather a selection of the ones most relevant to your work. And of course adapt the sensor geometry to what you want.
Kukko, A. and J. Hyyppä, 2009. Small-footprint Laser Scanning Simulator for System Validation, Error Assessment and Algorithm Development. Photogrammetric Engineering & Remote Sensing 75(10): 1177-1189.
  • asked a question related to Sensor Fusion
Question
5 answers
This is a two part question as I think they go hand-in-hand.
Is there a way to determine the minimal set of sensors needed for any given exercise? For example, say one needs to detect specific objects, they start off assuming they might need a camera (more than one?), maybe a lidar, maybe a radar, etc. How does one quantify a minimal set of sensors needed to accomplish this task? Initially I'm thinking that it has something to do with the confidence one has in the detection, but I'm trying to quantify that.
Other question: Once a minimal set of sensors has been determined, how does one quantify the improvement gained in the detection by adding another sensor (could be same type, could be different)?
Relevant answer
Answer
Look for solutions in nature. We have two eyes instead of one to see things - well, except for some insects and others (they have more than two). We have two ears instead of one. And the reason is not some fancy term like 'stereo'; instead, a pair will do the job of three or four sensors. The position of the sensors matters too. For example, if I am building a device to track the sun (to position a solar panel), all I need is a pair of solar panels separated by a wall in between. If the solar panel on the left receives more light than the one on the right, the computer realizes that the sun has moved to the left. In this example, I do not need the sophistication of cameras or other tracking devices.
  • asked a question related to Sensor Fusion
Question
2 answers
Multi-sensor fusion and Complex Event Processing (CEP) both address the combination of information originating from different sources. To me it seems that CEP focuses a little bit more on dynamic events, while multi-sensor fusion is dedicated to a less abstract data level. However, does anybody know the exact differences between these two concepts?
Relevant answer
Answer
Thank you very much for your answer. I think another difference could be the point of view: since sensor fusion is much more dedicated to raw sensory information (in order to generate value-added information of "higher quality"), CEP has a stronger relation to dynamic events. This takes place at the level of decision making. However, this totally agrees with your answer, which mentioned a higher level of abstraction.
  • asked a question related to Sensor Fusion
Question
4 answers
We are targeting filtering of a sensor's output (Kalman filtering) using C#, so we decided to try it in MATLAB first (because it is simpler for matrix manipulation).
Since we had no idea how to read live data (over the USB protocol) from MATLAB, we assumed static data. The problem, however, was that the simulated data were filtered perfectly, while the real data, when transferring the same implementation to C#, were not as good as in the simulation. Any advice?
Relevant answer
Answer
Why don't you use a kind of post-processing (using your Kalman filter in MATLAB) on the real data, where the raw data is collected beforehand from the real sensor instead of a simulated one?
  • asked a question related to Sensor Fusion
Question
3 answers
I'm trying to fuse the data from a digital compass with a 25 Hz acquisition rate with the data from a dead reckoning system which calculates the heading angle of a vehicle. I got a little confused while modeling the system; how can I determine my output matrix to get my fused result?
Relevant answer
Answer
I understand that, but I have two sensors providing [theta], so I need to fuse them into just one... that's the problem.
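A rough sketch of one common way to set this up: keep a single heading state (or [theta, theta_dot]) and stack both sensors into one measurement vector, so the output matrix simply maps the same state to both measurements; the two rows then differ only through their entries in the measurement noise matrix R. All numeric values below are placeholders.

```python
# Sketch: Kalman update with two sensors (compass and dead reckoning) both measuring theta.
# State x = [theta, theta_dot]; measurement z = [theta_compass, theta_deadreckoning].
import numpy as np

dt = 0.04                                   # 25 Hz compass period
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0],                   # compass observes theta
              [1.0, 0.0]])                  # dead reckoning also observes theta
Q = np.diag([1e-4, 1e-3])                   # placeholder process noise
R = np.diag([np.deg2rad(2.0)**2,            # compass noise variance (placeholder)
             np.deg2rad(5.0)**2])           # dead-reckoning noise variance (placeholder)

x = np.zeros((2, 1))
P = np.eye(2)

z = np.array([[0.10], [0.12]])              # one pair of measurements [rad]
x = F @ x                                   # predict
P = F @ P @ F.T + Q
S = H @ P @ H.T + R                         # update with both sensors at once
K = P @ H.T @ np.linalg.inv(S)
x = x + K @ (z - H @ x)
P = (np.eye(2) - K @ H) @ P
print(x.ravel())
```

If only a single theta state is kept (no rate), the output matrix reduces to H = [[1], [1]], one row per sensor.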