Fusion of Imaging and Inertial Sensors for Navigation

Author: Michael J. Veth
Publisher:
ISBN: 9780542834059
Category: Artificial satellites in navigation
Languages: en
Pages: 191

Book Description
The introduction of the Global Positioning System changed the way the United States Air Force fights by delivering worldwide precision navigation capability to even the smallest platforms. Unfortunately, the Global Positioning System signal is not available in all combat environments (e.g., under tree cover, indoors, or underground), so operations in these environments are limited to non-precision tactics. This research addresses the limitations of current precision navigation methods by fusing imaging and inertial systems, an approach inspired by the navigation capabilities of animals. The research begins by rigorously describing the imaging and navigation problem and developing practical models of the sensors, then presents a transformation technique to detect features within an image. Given a set of features, a rigorous statistical feature projection technique is developed which uses inertial measurements to predict vectors in the feature space between images. This deep coupling of the imaging and inertial sensors is then used to aid the statistical feature matching function. The feature matches and inertial measurements are then used to estimate the navigation trajectory online using an extended Kalman filter. After a proper calibration, the image-aided inertial navigation algorithm is tested using a combination of simulation and ground tests with both tactical- and consumer-grade inertial sensors. While limitations of the extended Kalman filter are identified, the experimental results demonstrate a navigation performance improvement of at least two orders of magnitude over the respective inertial-only solutions.
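
The description above outlines a loop: propagate the inertial solution, use it to predict where tracked features should appear, and feed matched features back through an extended Kalman filter. A minimal sketch of that loop follows; the state layout, pinhole model, noise levels, and world-aligned camera are illustrative assumptions, not details from the book.

```python
import numpy as np

# Minimal image-aided inertial EKF sketch. State: position (3) and
# velocity (3) in a world frame. The focal length, noise levels, and the
# world-aligned camera are assumptions for illustration only.

f = 500.0             # assumed focal length [px]
dt = 0.01             # IMU sample interval [s]
Q = np.eye(6) * 1e-4  # assumed process noise
R = np.eye(2) * 1.0   # assumed pixel measurement noise

def propagate(x, P, accel):
    """Inertial propagation: integrate measured acceleration."""
    F = np.eye(6)
    F[0:3, 3:6] = np.eye(3) * dt
    x = F @ x + np.concatenate([0.5 * accel * dt**2, accel * dt])
    P = F @ P @ F.T + Q
    return x, P

def project(x, landmark):
    """Pinhole projection of a world landmark from the current position."""
    d = landmark - x[0:3]
    return f * d[0:2] / d[2]

def update(x, P, z, landmark):
    """EKF update with one matched feature (numerical Jacobian for brevity)."""
    H = np.zeros((2, 6))
    eps = 1e-6
    for i in range(3):  # the projection depends on position only
        dx = np.zeros(6)
        dx[i] = eps
        H[:, i] = (project(x + dx, landmark) - project(x, landmark)) / eps
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - project(x, landmark))
    P = (np.eye(6) - K @ H) @ P
    return x, P
```

In a real system the state would also carry attitude, sensor biases, and camera-to-IMU calibration; only position and velocity are kept here to show the structure.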

Fusion of Low-Cost Imaging and Inertial Sensors for Navigation

Author:
Publisher:
ISBN:
Category:
Languages: en
Pages: 12

Book Description
Aircraft navigation information (position, velocity, and attitude) can be determined using optical measurements from imaging sensors combined with an inertial navigation system. This can be accomplished by tracking the locations of optical features in multiple images and using the resulting geometry to estimate and remove inertial errors. A critical factor governing the performance of image-aided inertial navigation systems is the robustness of the feature tracking algorithm. Previous research has shown the strength of rigorously coupling the image and inertial sensors at the measurement level using a tactical-grade inertial sensor. While the tactical-grade inertial sensor is a reasonable choice for larger platforms, its greater physical size and cost limit its use in smaller, low-cost platforms. In this paper, an image-aided inertial navigation algorithm is implemented using a multi-dimensional stochastic feature tracker. In contrast to previous research, the algorithms are specifically evaluated for operation with low-cost CMOS imagers and MEMS inertial sensors. The performance of the resulting image-aided inertial navigation system is evaluated using Monte Carlo simulation and experimental data and compared to the performance obtained with more expensive inertial sensors.
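
The robustness attributed to the stochastic feature tracker comes from gating the correspondence search with the inertially predicted feature statistics. A hedged sketch of such a gate follows; the function name and threshold choice are assumptions, not the paper's.

```python
import numpy as np

def gated_match(pred_px, pred_cov, candidates, gate=9.21):
    """Pick the candidate feature closest to the inertially predicted
    location in Mahalanobis distance, rejecting anything outside the
    statistical gate (9.21 is the 99% chi-square bound for 2 DOF)."""
    Sinv = np.linalg.inv(pred_cov)
    best, best_d2 = None, gate
    for i, c in enumerate(candidates):
        r = c - pred_px
        d2 = float(r @ Sinv @ r)
        if d2 < best_d2:
            best, best_d2 = i, d2
    return best  # index into candidates, or None if all were rejected
```

The tighter the inertial prediction, the smaller the gated search region, which is what makes the tracker tolerant of repetitive or low-texture scenes.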

Real-Time Fusion of Image and Inertial Sensors for Navigation

Author: J. Fletcher
Publisher:
ISBN:
Category:
Languages: en
Pages: 13

Book Description
As evidenced by many biological systems, the fusion of optical and inertial sensors represents an attractive method for passive navigation. In our previous work, a rigorous theory for optical and inertial fusion was developed for precision navigation applications. The theory was based on a statistical transformation of the feature space driven by inertial sensor measurements. The transformation effectively constrained the feature correspondence search to a given level of a priori statistical uncertainty. When integrated into a navigation system, the fused system demonstrated performance in indoor environments comparable to that of GPS-aided systems. In order to improve feature tracking performance, a robust feature transformation algorithm (Lowe's SIFT) was chosen. SIFT features are ideal for navigation applications in that they are invariant to scale, rotation, and illumination. Unfortunately, there exists a correlation between feature complexity and processing time, which limits the effectiveness of robust feature extraction algorithms for real-time applications on traditional microprocessor architectures. While recent advances in computer technology have made image processing more commonplace, the amount of information that can be processed is still limited by the power and speed of the CPU. In this paper, a new theory which exploits the highly parallel nature of general-purpose graphics processing units (GPGPUs) is developed which supports deeply integrated optical and inertial sensors for real-time navigation. Recent advances in GPGPU technology have made real-time, image-aided navigation a reality. Our approach leverages the existing OpenVIDIA core GPGPU library and commercially available computer hardware to solve the image and inertial fusion problem. The open-source libraries are extended to include the statistical feature…
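
The GPGPU argument is that scoring the statistical gate over every predicted-feature/candidate pair is embarrassingly parallel. The sketch below writes that batch formulation with NumPy broadcasting standing in for a GPU kernel; it is an illustration under assumed shapes and names, not the paper's OpenVIDIA-based implementation.

```python
import numpy as np

def batch_gate(pred_px, pred_cov, candidates, gate=9.21):
    """Score every predicted-feature/candidate pair at once.
    pred_px: (N, 2) predicted locations; pred_cov: (N, 2, 2) their
    covariances; candidates: (M, 2) detected features. NumPy
    broadcasting stands in for the GPU kernel here."""
    r = candidates[None, :, :] - pred_px[:, None, :]   # (N, M, 2) residuals
    Sinv = np.linalg.inv(pred_cov)                     # (N, 2, 2)
    d2 = np.einsum('nmi,nij,nmj->nm', r, Sinv, r)      # squared Mahalanobis
    d2 = np.where(d2 <= gate, d2, np.inf)              # apply the gate
    return d2.argmin(axis=1), d2.min(axis=1)           # best match per feature
```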

Spacecraft Autonomous Navigation Technologies Based on Multi-source Information Fusion

Author: Dayi Wang
Publisher: Springer Nature
ISBN: 981154879X
Category: Technology & Engineering
Languages: en
Pages: 352

Book Description
This book introduces readers to the fundamentals of estimation and dynamical system theory and their applications in the field of multi-source-information-fused autonomous navigation for spacecraft. The content is divided into two parts: theory and application. The theory part (Part I) covers the mathematical background of navigation algorithm design, including parameter and state estimation methods, linear fusion, centralized and distributed fusion, observability analysis, Monte Carlo techniques, and linear covariance analysis. In turn, the application part (Part II) focuses on autonomous navigation algorithm design for different phases of deep space missions, which involves multiple sensors, such as inertial measurement units, optical image sensors, and pulsar detectors. By concentrating on the relationships between estimation theory and autonomous navigation systems for spacecraft, the book bridges the gap between theory and practice. A wealth of helpful formulas and various types of estimators are also included to help readers grasp basic estimation concepts and offer them a ready-reference guide.
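
As a flavor of the estimation machinery covered in Part I, the elementary linear fusion step combines two independent estimates of the same state by inverse-covariance weighting. A minimal sketch, with notation that is mine rather than the book's:

```python
import numpy as np

def fuse(x1, P1, x2, P2):
    """Combine two independent estimates of the same state by
    inverse-covariance (information) weighting, the elementary
    linear fusion step."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)   # fused covariance never exceeds either input
    x = P @ (I1 @ x1 + I2 @ x2)  # precision-weighted mean
    return x, P
```

Centralized and distributed fusion schemes differ mainly in where and when this combination is applied, which is part of what the book's Part I formalizes.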

Optimal Image-Aided Inertial Navigation

Author: Nilesh Sharma Gopaul
Publisher:
ISBN:
Category:
Languages: en
Pages: 0

Book Description
The utilization of cameras in integrated navigation systems is among the most recent scientific research and high-tech industry developments. The research is motivated by the requirement of calibrating off-the-shelf cameras and of fusing imaging and inertial sensors in poor GNSS environments. The three major contributions of this dissertation are: (1) the development of a structureless camera auto-calibration and system calibration algorithm for a GNSS, IMU, and stereo camera system. The auto-calibration bundle adjustment utilizes the scale restraint equation, which is free of object coordinates; the number of parameters to be estimated is significantly reduced in comparison with a self-calibrating bundle adjustment based on the collinearity equations, so the proposed method is computationally more efficient. (2) The development of a loosely-coupled visual odometry aided inertial navigation algorithm. The fusion of the two sensors is usually performed using a Kalman filter, but the pose changes are pairwise time-correlated, i.e., the measurement noise vector at the current epoch is correlated only with the one from the previous epoch. Time-correlated errors are usually modelled by a shaping filter; the shaping filter developed in this dissertation uses Cholesky factors as coefficients, derived from the variance and covariance matrices of the measurement noise vectors. Test results showed that the proposed algorithm performs better than existing ones and provides more realistic covariance estimates. (3) The development of a tightly-coupled stereo multi-frame aided inertial navigation algorithm for reducing position and orientation drifts. Image aiding based on visual odometry usually uses features tracked only between a pair of consecutive image frames; the proposed method integrates features tracked across multiple overlapping image frames. The measurement equation is derived from the SLAM measurement equation system, in which the landmark positions are algebraically eliminated by time-differencing. The derived measurements are, however, time-correlated; through a sequential de-correlation, the Kalman filter measurement update can be performed sequentially and optimally. The main advantages of the proposed algorithm are reduced computational requirements compared to SLAM and a seamless integration into an existing GNSS aided-IMU system.
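
Contributions (2) and (3) both hinge on handling time-correlated measurement noise with Cholesky factors and sequential de-correlation. The generic version of that idea is sketched below: whiten the measurement with a Cholesky factor of its covariance, then apply the Kalman update one scalar component at a time. This is an illustrative form, not the dissertation's exact equations.

```python
import numpy as np

def sequential_update(x, P, z, H, R):
    """Whiten a correlated measurement with the Cholesky factor of its
    covariance, then apply the Kalman update one scalar component at a
    time; after whitening each component has unit, uncorrelated noise."""
    L = np.linalg.cholesky(R)
    Linv = np.linalg.inv(L)
    z_w, H_w = Linv @ z, Linv @ H
    for i in range(len(z_w)):       # scalar updates: no matrix inversion
        h = H_w[i]
        s = float(h @ P @ h) + 1.0  # innovation variance (unit noise)
        k = (P @ h) / s             # Kalman gain for this component
        x = x + k * (z_w[i] - h @ x)
        P = P - np.outer(k, h @ P)
    return x, P
```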

Tightly-Coupled Image-Aided Inertial Navigation Using the Unscented Kalman Filter

Author: S. Ebcin
Publisher:
ISBN:
Category:
Languages: en
Pages: 12

Book Description
Accurate navigation information (position, velocity, and attitude) can be determined using optical measurements from imaging sensors combined with an inertial navigation system. This can be accomplished by tracking the locations of stationary optical features in multiple images and using the resulting geometry to estimate and remove inertial errors. In previous research efforts, we have demonstrated the effectiveness of fusing imaging and inertial sensors using an extended Kalman filter (EKF) algorithm. In this approach, the image feature correspondence search was aided using the inertial sensor measurements, resulting in more robust feature tracking. The resulting image-aided inertial algorithm was tested using both simulation and experimental data. While the tightly-coupled approach stabilized the feature correspondence search, the overall problem remained prone to filter divergence due to the well-known consequences of image scale ambiguity and the nonlinear measurement model. These effects are evidenced by the consistency divergence of the EKF implementation seen during our long-duration Monte Carlo simulations. In other words, the measurement model is highly sensitive to the current parameter estimate, which invalidates the linearized measurement model assumed by the EKF. The unscented (sigma-point) Kalman filter (UKF) has been proposed in the literature to address the large class of recursive estimation problems that are not well modeled by the linearized dynamics and Gaussian noise models assumed in the EKF. The UKF leverages the unscented transformation to represent the state uncertainty using a set of carefully chosen sample points. This approach maintains mean and covariance estimates accurate to at least second order by using the true nonlinear dynamics and measurement models. In this paper, a variation of the UKF is applied to the…
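
The UKF's core device is the unscented transform described above. A compact, standard implementation follows, with a simplified parameterization that is not specific to this paper's filter:

```python
import numpy as np

def unscented_transform(x, P, f, kappa=1.0):
    """Propagate a mean/covariance pair through a nonlinear function f:
    draw 2n+1 sigma points from the Cholesky factor of (n+kappa)P, map
    them through f, and recombine with the standard weights."""
    n = len(x)
    S = np.linalg.cholesky((n + kappa) * P)
    sigma = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 0.5 / (n + kappa))
    w[0] = kappa / (n + kappa)
    y = np.array([f(s) for s in sigma])
    mean = w @ y
    cov = sum(wi * np.outer(yi - mean, yi - mean) for wi, yi in zip(w, y))
    return mean, cov
```

Applied to both the propagation and measurement steps, this replaces the EKF's linearized Jacobians entirely, which is what addresses the sensitivity the abstract describes.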

Intelligent Information Processing for Inertial-Based Navigation Systems

Author: Chong Shen
Publisher: Springer Nature
ISBN: 9813345160
Category: Technology & Engineering
Languages: en
Pages: 131

Book Description
This book introduces typical inertial devices and inertial-based integrated navigation systems, gyro noise suppression, gyro temperature-drift error modeling and compensation, inertial-based integrated navigation under discontinuous observation conditions, and inertial-based brain-integrated navigation systems. Integrated navigation is the result of the development of modern navigation theory and technology. The inertial navigation system has the advantages of strong autonomy, high short-term accuracy, and all-day, all-weather operation, and it has been applied in most integrated navigation systems, where information processing is the core technology. Due to the device mechanism and the working environment, there are errors in the output of an inertial-based integrated navigation system, including gyroscope noise, temperature drift, and discontinuous observations, which seriously reduce the accuracy and robustness of the system; this book helps readers solve these problems. The intelligent information processing techniques involved are accompanied by simulation verification, and the book can serve as a reference for undergraduate, graduate, and Ph.D. students, as well as researchers and engineers engaged in navigation-related specialties.
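
As a concrete instance of the error-compensation theme, temperature drift is classically handled by fitting a model of gyro bias against temperature during calibration and subtracting its prediction in operation. A minimal sketch with invented calibration numbers; the book develops much richer intelligent models than this polynomial baseline.

```python
import numpy as np

# Calibration: static gyro drift measured at several temperatures.
# These numbers are invented for illustration.
temps = np.array([-20.0, 0.0, 20.0, 40.0, 60.0])  # [deg C]
drift = np.array([0.31, 0.18, 0.10, 0.07, 0.12])  # [deg/s]

coeffs = np.polyfit(temps, drift, deg=2)  # quadratic drift-vs-temperature model

def compensate(gyro_raw, temp_c):
    """Subtract the temperature-predicted drift from a raw gyro reading."""
    return gyro_raw - np.polyval(coeffs, temp_c)
```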

Navigation and Mapping for Aerial Vehicles Based on Inertial and Imaging Sensors

Author:
Publisher:
ISBN: 9789175195537
Category: Aerial observation (Military science)
Languages: en
Pages: 82

Book Description
Small and medium-sized Unmanned Aerial Vehicles (UAVs) are used in military missions today and will in the future find many new application areas, such as surveillance for exploration and security. To enable all these foreseen applications, UAVs have to be cheap and lightweight, which restricts the sensors that can be used for navigation and surveillance. This thesis investigates several aspects of how fusion of navigation and imaging sensors can improve both tasks at a level that would otherwise require much more expensive sensors under the traditional approach of separating the navigation system from the applications. The core idea is that vision sensors can support the navigation system by providing odometric information about the motion, while the navigation system can support the vision algorithms used to map the surrounding environment, making them more efficient. The unified framework for this kind of approach is called Simultaneous Localisation and Mapping (SLAM), and it is applied here to inertial sensors, radar, and an optical camera.

Position, Navigation, and Timing Technologies in the 21st Century

Author: Y. Jade Morton
Publisher: John Wiley & Sons
ISBN: 1119458498
Category: Science
Languages: en
Pages: 902

Book Description
Covers the latest developments in PNT technologies, including integrated satellite navigation, sensor systems, and civil applications.

Featuring sixty-four chapters divided into six parts, this two-volume work provides comprehensive coverage of the state of the art in satellite-based position, navigation, and timing (PNT) technologies and civilian applications. It also examines alternative navigation technologies based on other signals-of-opportunity and sensors, and offers a comprehensive treatment of integrated PNT systems for consumer and commercial applications.

Volume 1 of Position, Navigation, and Timing Technologies in the 21st Century: Integrated Satellite Navigation, Sensor Systems, and Civil Applications contains three parts and focuses on satellite navigation systems, technologies, and engineering and scientific applications. It starts with a historical perspective of GPS development and other related PNT developments. Current global and regional navigation satellite systems (GNSS and RNSS), their interoperability, signal quality monitoring, satellite orbit and time synchronization, and ground- and satellite-based augmentation systems are examined. Recent progress in satellite navigation receiver technologies and challenges for operation in multipath-rich urban environments, in handling spoofing and interference, and in ensuring PNT integrity are addressed. A section on satellite navigation for engineering and scientific applications finishes off the volume.

Volume 2 consists of three parts and addresses PNT using alternative signals and sensors and integrated PNT technologies for consumer and commercial applications. It looks at PNT using various radio signals-of-opportunity, atomic clocks, optical, laser, magnetic field, celestial, MEMS, and inertial sensors, as well as the concept of navigation from Low-Earth Orbiting (LEO) satellites. GNSS-INS integration, the neuroscience of navigation, and animal navigation are also covered. The volume finishes with a collection of work on contemporary PNT applications such as survey and mobile mapping, precision agriculture, wearable systems, automated driving, train control, commercial unmanned aircraft systems, aviation, and navigation in the unique Arctic environment.

In addition, this text:
- Serves as a complete reference and handbook for professionals and students interested in the broad range of PNT subjects
- Includes chapters that focus on the latest developments in GNSS and other navigation sensors, techniques, and applications
- Illustrates the interconnecting relationships between various types of technologies in order to assure more protected, robust, and accurate PNT

Position, Navigation, and Timing Technologies in the 21st Century will appeal to all industry professionals, researchers, and academics involved with the science, engineering, and applications of position, navigation, and timing technologies. pnt21book.com

Continuous Models for Cameras and Inertial Sensors

Author: Hannes Ovrén
Publisher: Linköping University Electronic Press
ISBN: 917685244X
Category:
Languages: en
Pages: 67

Book Description
Using images to reconstruct the world in three dimensions is a classical computer vision task. Some examples of applications where this is useful are autonomous mapping and navigation, urban planning, and special effects in movies. One common approach to 3D reconstruction is "structure from motion", where a scene is imaged multiple times from different positions, e.g. by moving the camera. However, in a twist of irony, many structure-from-motion methods work best when the camera is stationary while the image is captured. This is because the motion of the camera can cause distortions in the image that lead to worse image measurements, and thus a worse reconstruction. One such distortion, common to all cameras, is motion blur, while another is connected to the use of an electronic rolling shutter. Instead of capturing all pixels of the image at once, a camera with a rolling shutter captures the image row by row. If the camera is moving while the image is captured, the rolling shutter causes non-rigid distortions in the image that, unless handled, can severely impact the reconstruction quality.

This thesis studies methods to robustly perform 3D reconstruction in the case of a moving camera. To do so, the proposed methods make use of an inertial measurement unit (IMU). The IMU measures the angular velocities and linear accelerations of the camera, and these can be used to estimate the trajectory of the camera over time. Knowledge of the camera motion can then be used to correct for the distortions caused by the rolling shutter. Another benefit of an IMU is that it can provide measurements in situations when a camera cannot, e.g. because of excessive motion blur or an absence of scene structure.

To use a camera together with an IMU, the camera-IMU system must be jointly calibrated: the relationship between their respective coordinate frames needs to be established, and their timings need to be synchronized. This thesis shows how to perform this calibration and synchronization automatically, without requiring e.g. calibration objects or special motion patterns.

In standard structure from motion, the camera trajectory is modeled as discrete poses, with one pose per image. Switching instead to a formulation with a continuous-time camera trajectory provides a natural way to handle rolling shutter distortions, and also to incorporate inertial measurements. To model the continuous-time trajectory, many authors have used splines. The ability of a spline-based trajectory to model the real motion depends on the density of its spline knots: choosing too smooth a spline results in approximation errors. This thesis proposes a method to estimate the spline approximation error and use it to better balance camera and IMU measurements in a sensor fusion framework. Also proposed is a way to automatically decide how dense the spline needs to be to achieve a good reconstruction.

Another approach to reconstructing a 3D scene is to use a camera that directly measures depth. Some depth cameras, like the well-known Microsoft Kinect, are susceptible to the same rolling shutter effects as normal cameras. This thesis quantifies the effect of the rolling shutter distortion on 3D reconstruction, depending on the amount of motion. It is also shown that a better 3D model is obtained if the depth images are corrected using inertial measurements.
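
Two of the thesis's central ideas, a continuous-time trajectory and per-row rolling-shutter timing, can be sketched briefly. The snippet below splines camera translation over invented knot values and evaluates it at each row's capture time; a real system would spline the full 6-DOF pose (e.g. on SE(3)) and estimate, rather than invent, the knots.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Continuous-time trajectory: spline over (invented) position knots.
rng = np.random.default_rng(0)
knot_t = np.linspace(0.0, 1.0, 8)                           # knot times [s]
knot_p = np.cumsum(rng.normal(size=(8, 3)) * 0.01, axis=0)  # positions [m]
traj = CubicSpline(knot_t, knot_p)

def row_times(t_frame, n_rows, readout=0.03):
    """Rolling shutter: row r is captured at t_frame + (r / n_rows) * readout."""
    return t_frame + np.arange(n_rows) / n_rows * readout

def row_positions(t_frame, n_rows):
    """Camera position at each row's capture instant; the pose change
    across rows is exactly what rolling-shutter correction removes."""
    return traj(row_times(t_frame, n_rows))
```

Denser knots let the spline follow faster motion at the cost of more parameters, which is the approximation-error trade-off the thesis quantifies.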