Led by Professor Roderich Gross of TU Darmstadt's Department of Computer Science, the team tested this learning-based strategy in a custom multi-agent simulator over eight weeks. Compared with conventional threshold-based models, the learning approach achieved markedly higher delivery success rates and shorter delivery times. Drones also learned to anticipate tasks they could complete only after recharging, enabling better resource planning.
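The article does not spell out the fleet's exact bidding and learning rules, but the core idea can be illustrated with a minimal Python sketch: each drone observes only its state of charge, keeps a running estimate of its true range (which differs across the heterogeneous fleet), bids on announced deliveries it believes it can finish, and refines that estimate from the energy it actually consumes. The class names, the averaging update, and the lowest-bid auction rule below are illustrative assumptions, not the authors' implementation.

    import random

    class Drone:
        """A drone that observes only its state of charge (0..1); its true
        range at full charge is unknown to its own controller."""

        def __init__(self, true_range_km):
            self.true_range_km = true_range_km   # hidden from the controller
            self.est_range_km = 5.0              # optimistic initial estimate (assumed prior)
            self.soc = 1.0                       # state of charge, observable

        def bid(self, dist_km):
            """Bid only if the estimated charge needed fits the current charge."""
            est_needed = dist_km / self.est_range_km
            if est_needed <= self.soc:
                return est_needed                # auctioneer picks the lowest bid
            return None                          # abstain, recharge, bid on a later task

        def execute(self, dist_km):
            """Fly the task, observe the real charge drop, refine the range estimate."""
            needed = dist_km / self.true_range_km
            if needed <= self.soc:
                before = self.soc
                self.soc -= needed
                observed_range = dist_km / (before - self.soc)
                # simple averaging update toward the observed consumption rate
                self.est_range_km = 0.5 * self.est_range_km + 0.5 * observed_range
                return True
            # ran out of energy: the true range is below what the estimate implied
            self.est_range_km = min(self.est_range_km, 0.95 * dist_km / self.soc)
            self.soc = 0.0
            return False

    # Toy auction: the fulfilment centre announces a task and the lowest eligible
    # bid wins. A fixed-threshold baseline would instead accept any task whenever
    # the state of charge exceeds, say, 0.5, regardless of the drone's real range.
    fleet = [Drone(true_range_km=random.uniform(2.0, 6.0)) for _ in range(5)]
    delivered = failed = 0
    for _ in range(100):
        dist_km = random.uniform(0.5, 3.0)
        bids = [(b, d) for d in fleet if (b := d.bid(dist_km)) is not None]
        if bids:
            _, winner = min(bids, key=lambda x: x[0])
            if winner.execute(dist_km):
                delivered += 1
            else:
                failed += 1
        for d in fleet:
            if d.soc < 0.2:
                d.soc = 1.0  # recharge at the depot
    print(f"delivered={delivered} failed={failed}")

In this toy setup, failed deliveries become rare once each drone's range estimate converges, whereas a fixed charge threshold either wastes capable drones or over-commits weak ones; this mirrors, in spirit, the gap the researchers report between the learning approach and threshold models.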
"This work shows how online learning can help robots cope with real-world challenges, such as operating without full knowledge of their true capabilities," said Dr Mohamed Talamali from The University of Sheffield.
The technique also offers advantages for mixed drone fleets with differing energy profiles due to production variability or usage history. It could lead to more autonomous, energy-efficient delivery systems capable of serving multiple fulfilment centres. "Such autonomous delivery drones could also operate across multiple fulfilment centres, further reducing delivery times and costs," added Gross.
Research Report: Ready, Bid, Go! On-Demand Delivery Using Fleets of Drones with Unknown, Heterogeneous Energy Storage Constraints
Related Links
Technische Universität Darmstadt
UAV News - Suppliers and Technology