Project Details
Description
The Metis team will provide an integrated, state-of-the-art system of autonomous vehicles and intelligent software to deliver supplies and aid where they are needed most.
The solution will reduce the risks associated with complex humanitarian resupply operations, where aid workers can be exposed to significant dangers. Using advanced autonomously-navigating air and land vehicles, aid missions can be conducted at greater speed, ensuring critical items – such as food, water, medicines and shelter – can reach those in need. It will also support troops on future military operations, saving lives by reducing risk and increasing the pace of operations.
The Field and Applied Robotics team of Aberystwyth University developed the autonomous road driving module that allows ground vehicles to drive on unmarked, ill-defined roads (often no more than tracks across a field). The module is integrated into the wider navigation system and, when appropriate, is instructed to detect roads and drive on them until the vehicle needs to leave the road.
By exploiting visual and 3D data captured with a 3D camera, the system allows faster driving than would be possible with GNSS-based navigation alone. The only assumption it makes is that the road area differs from the non-road areas in the modalities used (colour and height of the ground).
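The description above names only the core assumption (road differs from non-road in colour and ground height), not the actual algorithm. A minimal sketch of that idea, assuming simple per-pixel thresholding against a reference road sample (the function name, tolerances, and reference-sampling strategy are illustrative, not the project's implementation):

```python
import numpy as np

def segment_road(rgb, height, road_rgb, road_height,
                 colour_tol=30.0, height_tol=0.10):
    """Label pixels as road where both modalities match a road reference.

    rgb         : (H, W, 3) float array of colours
    height      : (H, W) float array of ground heights in metres
    road_rgb    : (3,) reference road colour, e.g. sampled just ahead
                  of the vehicle (hypothetical strategy)
    road_height : scalar reference road height
    Returns a (H, W) boolean mask of road pixels.
    """
    colour_dist = np.linalg.norm(rgb - np.asarray(road_rgb, float), axis=2)
    height_dist = np.abs(height - road_height)
    # A pixel counts as road only if it matches in BOTH modalities,
    # reflecting the stated assumption that road and non-road differ
    # in colour and/or height.
    return (colour_dist < colour_tol) & (height_dist < height_tol)
```

In practice such a mask would feed a path planner that keeps the vehicle on the detected road surface; the real module presumably uses something more robust than fixed thresholds.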
| Status | Finished |
|---|---|
| Effective start/end date | 01 Jun 2018 → 31 May 2019 |
Funding
- QinetiQ (Funder reference unknown): £88,452.00