AI-Driven Multi-Sensor Fusion for Autonomous Robotic Navigation

Prakash Malvadkar, Amol Bhosale, Shubangi Handore

Abstract

Introduction: Mobile Robotic Systems (MRS) navigate hazardous environments autonomously, which requires precise prediction of the front wheel angle. This paper proposes a CNN-based multi-sensor fusion framework that integrates visual and ultrasonic data for improved decision-making. A Raspberry Pi-controlled prototype utilizes TensorFlow and MATLAB for accurate navigation and task efficiency. Experimental results demonstrate enhanced task execution, reliability, and real-time applicability.
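
To illustrate the kind of fusion network the framework describes, the following is a minimal TensorFlow/Keras sketch of a CNN that regresses the front wheel angle from a camera frame concatenated with ultrasonic range readings. The input shapes, layer sizes, and layer names are illustrative assumptions; the abstract does not specify the actual architecture.

```python
from tensorflow.keras import layers, Model

# Hypothetical input shapes: a 64x64 RGB camera frame and 8 ultrasonic
# range readings; the abstract does not specify these dimensions.
image_in = layers.Input(shape=(64, 64, 3), name="camera")
ultra_in = layers.Input(shape=(8,), name="ultrasonic")

# Convolutional branch extracts visual features from the camera frame.
x = layers.Conv2D(16, 3, activation="relu")(image_in)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(32, 3, activation="relu")(x)
x = layers.MaxPooling2D()(x)
x = layers.Flatten()(x)

# Fuse the flattened visual features with the ultrasonic range vector.
fused = layers.Concatenate()([x, ultra_in])
fused = layers.Dense(64, activation="relu")(fused)

# Single linear output: the predicted front wheel angle (e.g. degrees).
angle_out = layers.Dense(1, name="wheel_angle")(fused)

model = Model(inputs=[image_in, ultra_in], outputs=angle_out)
model.compile(optimizer="adam", loss="mse")  # supervised angle regression
```

The single linear output matches the framing of front wheel angle prediction as a regression problem trained against logged steering angles.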


Objectives: The objective of this study is to develop a CNN-based multi-sensor fusion framework for Mobile Robotic Systems (MRS) to enhance autonomous navigation in hazardous environments. The proposed system aims to improve front wheel angle prediction, decision-making accuracy, and task efficiency by integrating visual and ultrasonic sensor data. A Raspberry Pi-controlled prototype, utilizing TensorFlow and MATLAB, is implemented to validate the framework’s effectiveness in real-time applications.


Methods: The proposed methodology takes a novel approach to interdisciplinary projects such as mobile robotic systems. Fused signals from multiple sensors generate the data required to complete the decided navigation plan, and the final action is taken according to the image data processed by a CNN. The whole working area is divided into distinct navigation plans (NPs); the action taken for the previous navigation plan is used to train the machine for completion of the next task, and the final decision is derived from the inputs of the different sensors that train the machine's AI algorithm. The robot therefore completes its tasks in sequential steps. In the first step, the surrounding environment is scanned with an ultrasonic sensor and the resulting data are used for initial navigation-map planning; the NP is designed from the obstacles detected in the surrounding environment. The front wheel angle and travel distance are then auto-tuned according to the formed NP, and PID control is used for the sequential operation of the MRS (see the sketches following this paragraph).
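
To make the scan-to-plan step concrete, here is a minimal sketch of one way an ultrasonic sweep could be reduced to a navigation plan: the sweep is divided into angular sectors, and the sector with the most clearance sets the heading and travel distance. The sector width, clearance threshold, and the `build_navigation_plan` helper are all hypothetical, since the abstract does not detail how the NP is computed.

```python
def build_navigation_plan(distances_cm, sector_width_deg=15, min_clearance_cm=40):
    """Reduce an ultrasonic sweep to a navigation plan (NP).

    distances_cm[i] is the echo distance measured at bearing
    i * sector_width_deg in front of the robot. All thresholds and
    names here are illustrative assumptions, not the paper's method.
    """
    best_sector, best_clearance = None, 0.0
    for i, dist in enumerate(distances_cm):
        if dist >= min_clearance_cm and dist > best_clearance:
            best_sector, best_clearance = i, dist
    if best_sector is None:
        return None  # no sector is free: stop and rescan
    return {
        "heading_deg": best_sector * sector_width_deg,
        "travel_cm": min(best_clearance - min_clearance_cm, 100.0),
    }

# Example sweep: 12 sectors spanning 180 degrees in front of the robot.
sweep = [25, 30, 80, 120, 150, 90, 60, 45, 30, 20, 15, 10]
print(build_navigation_plan(sweep))  # -> {'heading_deg': 60, 'travel_cm': 100.0}
```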
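
The final step applies PID control to track the wheel angle selected by the NP. Below is a minimal discrete PID sketch in Python; the gains and the simplified actuator response are illustrative placeholders, not tuned values from the paper.

```python
class PID:
    """Minimal discrete PID controller. The gains are illustrative
    placeholders, not values reported in the paper."""

    def __init__(self, kp=1.0, ki=0.1, kd=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Usage: drive the measured front wheel angle toward the angle chosen
# by the current navigation plan, one 20 ms control tick at a time.
pid = PID()
target_angle = 12.0   # degrees, taken from the NP (assumed value)
current_angle = 0.0
for _ in range(50):
    correction = pid.step(target_angle, current_angle, dt=0.02)
    current_angle += 0.1 * correction  # crude stand-in for the actuator
```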


Results: The experimental results demonstrate the effectiveness of the proposed AI-driven multi-sensor fusion framework for autonomous robotic navigation. The system successfully integrates ultrasonic sensors and a camera module with a Raspberry Pi, enabling precise front wheel angle adjustments and efficient navigation planning. Using CNN-based image processing and real-time sensor fusion, the robotic vehicle achieved improved obstacle detection accuracy and reduced decision-making latency. Performance metrics indicate a significant enhancement in navigation precision, power efficiency, and adaptability compared to traditional methods. The results validate the feasibility of this approach for real-time applications in hazardous environments.


Conclusions: This study presents a novel approach to single-step sensor fusion and decision-making within autonomous robotic systems. The proposed methodology addresses the challenge of continuous operation in complex robotic tasks by breaking the process down into discrete steps. Each step functions independently, significantly reducing the complexity of system integration. By adopting a modular approach, in which each sensor fusion and decision-making process is handled separately, the method minimizes integration overhead while maintaining operational efficiency.
