Robotics and Autonomous Systems, Volume 172, February 2024
- Matteo Bellusci, Paolo Cudrano, Simone Mentasti, Riccardo Erminio Filippo Cortelazzo, Matteo Matteucci: Semantic interpretation of raw survey vehicle sensory data for lane-level HD map generation. 104513
- Abhishesh Pal, Antonio Candea Leite, Pål Johan From: A novel end-to-end vision-based architecture for agricultural human-robot collaboration in fruit picking operations. 104567
- Marios Krestenitis, Emmanuel K. Raptis, Athanasios Ch. Kapoutsis, Konstantinos Ioannidis, Elias B. Kosmatopoulos, Stefanos Vrochidis: Overcome the Fear Of Missing Out: Active sensing UAV scanning for precision agriculture. 104581
- Spyridon G. Tarantos, Tommaso Belvedere, Giuseppe Oriolo: Dynamics-aware navigation among moving obstacles with application to ground and flying robots. 104582
- Jelena Gregoric, Marija Seder, Ivan Petrovic: Autonomous hierarchy creation for computationally feasible near-optimal path planning in large environments. 104584
- Pragyan Dahal, Simone Mentasti, Luca Paparusso, Stefano Arrigoni, Francesco Braghin: RobustStateNet: Robust ego vehicle state estimation for Autonomous Driving. 104585
- Zhen Deng, Shengzhan Zhang, Yuxin Guo, Hongqi Jiang, Xiaochun Zheng, Bingwei He: Assisted teleoperation control of robotic endoscope with visual feedback for nasotracheal intubation. 104586
- Wei Guan, Wenzhe Luo, Zhewen Cui: Intelligent decision-making system for multiple marine autonomous surface ships based on deep reinforcement learning. 104587
- Robin Ferede, Guido de Croon, Christophe De Wagter, Dario Izzo: End-to-end neural network based optimal quadcopter control. 104588
- Gongcheng Wang, Haofei Ma, Han Wang, Pengchao Ding, Hua Bai, Wenda Xu, Weidong Wang, Zhijiang Du: Reactive mobile manipulation based on dynamic dual-trajectory tracking. 104589
- Rune Y. Brogaard, Evangelos Boukas: Autonomous GPU-based UAS for inspection of confined spaces: Application to marine vessel classification. 104590
- Zhen Yu, Xin Jiang, Yunhui Liu: Pose estimation of an aerial construction robot based on motion and dynamic constraints. 104591
- Arash Marashian, Abolhassan Razminia: Mobile robot's path-planning and path-tracking in static and dynamic environments: Dynamic programming approach. 104592
- Shuai Zhang, Shiqi Li, You Li, Xiao Li, Zhiguo Wang: A visual imitation learning algorithm for the selection of robots' grasping points. 104600
- Vinicius Mariano Gonçalves, Dimitris Chaikalis, Anthony Tzes, Farshad Khorrami: Safe multi-agent drone control using control barrier functions and acceleration fields. 104601
- Chaicharn Akkawutvanich, Frederik Ibsgaard Knudsen, Anders Falk Riis, Jørgen Christian Larsen, Poramate Manoonpong: Corrigendum to "Adaptive parallel reflex- and decoupled CPG-based control for complex bipedal locomotion" [Robotics and Autonomous Systems 134 (2020) 103663]. 104602
- Hongbiao Zhu, Hua Bai, Pengchao Ding, Ji Zhang, Dongmei Wu, Zhijiang Du, Weidong Wang: Dual-stage planner for autonomous radioactive source localization in unknown environments. 104603
- Hanfu Wang, Weidong Chen: Task scheduling for heterogeneous agents pickup and delivery using recurrent open shop scheduling models. 104604
- Seoyeon Kim, Young-Hoon Jung, Hong Min, Taesik Kim, Jinman Jung: Adaptive sensor management for UGV monitoring based on risk maps. 104605
- Yan Kai Lai, Prahlad Vadakkepat, Cheng Xiang: R2: Optimal vector-based and any-angle 2D path planning with non-convex obstacles. 104606
- Thi Thoa Mac, Le Minh Quan, Bui Quang Dat, Sy-Tai Nguyen: A novel hedge algebra formation control for mobile robots. 104607
- Álvaro Ramajo Ballester, José María Armingol Moreno, Arturo de la Escalera Hueso: Dual license plate recognition and visual features encoding for vehicle identification. 104608