Combining Vision and Smell for Robot Odor Detection
Author Information
Author(s): Sunzid Hassan, Lingxiao Wang, Khan Raqib Mahmud
Primary Institution: Louisiana Tech University
Hypothesis
Integrating vision and olfaction improves robotic odor source localization in complex environments.
Conclusion
The proposed algorithm outperformed traditional methods in locating odor sources in both unidirectional and non-unidirectional airflow environments.
Supporting Evidence
- The proposed algorithm achieved a 100% success rate in unidirectional airflow environments.
- In non-unidirectional airflow environments, the proposed method outperformed the Fusion algorithm in terms of success rate and search time.
- The integration of vision and olfaction allowed for better navigation in complex environments.
Takeaway
This study shows that robots can find odor sources more reliably when they use vision and smell together, especially in complex airflow conditions.
Methodology
The study implemented a multi-modal LLM-based navigation algorithm on a mobile robot, testing its performance in real-world environments with various airflow conditions.
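The paper does not include code, but the fusion idea can be sketched as a simple decision loop: combine the robot's visual detections and gas-sensor reading into a text query for an LLM, with a reactive fallback when the plume signal is strong. All names here (`build_prompt`, `choose_action`, the stubbed LLM, the 50 ppm threshold) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of one vision-olfaction fusion step for odor source
# localization. The function names, threshold, and stubbed LLM are
# illustrative only; the paper's actual algorithm and prompts may differ.

def build_prompt(objects, gas_ppm, wind_dir):
    """Combine vision and olfaction cues into a single LLM query string."""
    return (
        f"Visible objects: {', '.join(objects)}. "
        f"Gas concentration: {gas_ppm:.1f} ppm. "
        f"Wind from: {wind_dir}. "
        "Which direction should the robot move to find the odor source?"
    )

def choose_action(objects, gas_ppm, wind_dir, llm):
    """Ask the LLM for a move; fall back to an upwind surge on strong readings."""
    if gas_ppm > 50.0:  # strong plume signal: classic upwind surge behavior
        return "move_upwind"
    return llm(build_prompt(objects, gas_ppm, wind_dir))

# Stub standing in for a multi-modal LLM call, so the sketch runs offline.
def stub_llm(prompt):
    return "move_toward_likely_source" if "bottle" in prompt else "cast_crosswind"

action = choose_action(["table", "bottle"], 12.0, "north", stub_llm)
print(action)  # move_toward_likely_source
```

In a real system the stub would be replaced by a call to a multi-modal model, and the roughly three-second inference latency noted in the Limitations would bound how often this loop can run.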
Limitations
The LLM's inference time is three seconds per query, and the experiments were conducted only in a small-scale environment.
Statistical Information
Statistical Significance
p < 0.05