
A.L.A.N. for Emergency Drone Landing

Team Members:

Sumi Boo (EPD), Lim Ken Zho (EPD), Tan Jee Chong (EPD), Timothy Lim Yee Da (EPD), Chung Zhi You (ISTD), Marcus Ho Jun Wei (ISTD), Lim Hng Yi (ISTD)

Instructors:

Cyrille Pierre Joseph Jegourel, Ye Ai

Writing Instructor:

Susan Wong

Teaching Assistant:

Congjian Lin

Project Background


A.L.A.N. (Autonomous Landing and Navigation) is a parachute system that deploys automatically when a catastrophic failure occurs and the drone free-falls from the sky. Using computer vision, the parachute steers the malfunctioning drone to a safe landing zone away from people, vehicles, and property.


An all-in-one autonomous solution developed for the navigation and safe landing of malfunctioning drones.

System Operation Flow

How It Works




Failure Detection
& Deployment

A.L.A.N. is first attached to a drone. Once it detects that the drone has malfunctioned, it quickly deploys the parachute to slow the drone's descent.

Landing Zone
Detection

A.L.A.N. has a camera that captures the surrounding landscape and chooses a safe landing zone that avoids humans, vehicles, and property.

Steering & Landing
In the Zone

A.L.A.N. then steers the parachute to guide the malfunctioning drone to the safe landing zone, protecting the surrounding area from damage.



System Overview




Hardware Features

Failure Detection in 0.5s

Equipped with a 3-axis accelerometer, A.L.A.N. autonomously detects the freefall of a malfunctioning drone within 0.5 s, from a drop height of as little as 30 cm.
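A minimal sketch of this detection logic in Python, assuming a hypothetical read_accel() callback that returns the accelerometer's (x, y, z) readings in m/s²; the threshold and confirmation window are illustrative tuning values, not the flight parameters:

```python
import time

FREEFALL_THRESHOLD = 2.0  # |a| below this (m/s^2) suggests freefall; tunable
CONFIRM_TIME = 0.2        # sustain the reading this long before triggering (s)

def detect_freefall(read_accel):
    """Poll a 3-axis accelerometer until freefall is confirmed.

    In freefall the accelerometer reads near zero total acceleration,
    so we trigger once the magnitude stays under the threshold for a
    short confirmation window.
    """
    freefall_since = None
    while True:
        ax, ay, az = read_accel()
        magnitude = (ax**2 + ay**2 + az**2) ** 0.5
        if magnitude < FREEFALL_THRESHOLD:
            if freefall_since is None:
                freefall_since = time.monotonic()
            elif time.monotonic() - freefall_since >= CONFIRM_TIME:
                return True  # sustained near-zero g: the drone is falling
        else:
            freefall_since = None  # normal flight: reset the timer
```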

Versatile Parachute

Our custom 7-cell ram-air parachute allows accurate steering mid-air while providing enough lift to cushion the malfunctioning drone's descent.

Quick Deployment

The deployment bag is fitted with a motor that releases the parachute the moment a failure is detected in the drone.
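As an illustration, the release could be triggered in a few lines, assuming the motor driver is toggled from a Raspberry Pi GPIO pin; the pin number and pulse duration below are hypothetical, not the actual wiring:

```python
import time
import RPi.GPIO as GPIO  # assumes a Raspberry Pi drives the release motor

RELEASE_PIN = 18       # hypothetical GPIO pin wired to the motor driver
RELEASE_PULSE_S = 1.0  # how long to run the motor to free the deployment bag

def deploy_parachute():
    """Spin the release motor briefly to open the deployment bag."""
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(RELEASE_PIN, GPIO.OUT)
    GPIO.output(RELEASE_PIN, GPIO.HIGH)  # motor on
    time.sleep(RELEASE_PULSE_S)
    GPIO.output(RELEASE_PIN, GPIO.LOW)   # motor off
    GPIO.cleanup(RELEASE_PIN)
```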

360 Degree Steering

Our double-reel steering system allows fine control over the amount of air in the parachute cells, dictating the direction of flight with precision.
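A sketch of how a steering command might map to the two reels; the linear mapping and travel limit are illustrative assumptions:

```python
def reel_targets(steer, max_travel_mm=80.0):
    """Map a steering command in [-1, 1] to (left, right) reel travel in mm.

    Reeling in one side's lines deflates that side's cells and turns the
    canopy toward it: steer < 0 pulls the left reel, steer > 0 the right.
    """
    steer = max(-1.0, min(1.0, steer))
    left = max_travel_mm * max(0.0, -steer)
    right = max_travel_mm * max(0.0, steer)
    return left, right
```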

Software Features




Landscape Segmentation

A.L.A.N. uses a U-Net semantic segmentation architecture that takes in video input from the camera and segments each frame into different classes (e.g. trees, lakes, roads, humans). Each class is then represented by a different colour, as shown above.
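A rough sketch of the inference step, assuming a trained PyTorch U-Net (`unet`); the palette below is truncated and illustrative, since the actual model covers more classes:

```python
import numpy as np
import torch

# Illustrative palette: one colour per class, indexed by class id.
PALETTE = np.array([
    [  0, 128,   0],   # trees
    [  0,   0, 255],   # lakes
    [128, 128, 128],   # roads
    [255,   0,   0],   # humans
], dtype=np.uint8)

def segment_frame(unet, frame):
    """Run the U-Net on one RGB frame (HxWx3 uint8) and colour-code it."""
    x = torch.from_numpy(frame).float().permute(2, 0, 1).unsqueeze(0) / 255.0
    with torch.no_grad():
        logits = unet(x)                             # (1, num_classes, H, W)
    classes = logits.argmax(dim=1)[0].cpu().numpy()  # per-pixel class id
    return classes, PALETTE[classes]                 # ids + colour image
```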

Selection of Landing Zone

Based on the classes identified during landscape segmentation, A.L.A.N. then selects a suitable landing zone to steer towards, one that avoids humans, vehicles, and property. The chosen zone is marked by the green outline shown above.
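One way to implement this selection, sketched with OpenCV connected components over the per-pixel class map from the segmentation step; the safe-class id is an assumption:

```python
import cv2
import numpy as np

SAFE_CLASSES = [4]  # e.g. the grass/open-field class id; an assumption

def select_landing_zone(class_map):
    """Return the bounding box and centroid of the largest safe region.

    `class_map` is the HxW array of per-pixel class ids produced by the
    segmentation step; humans, vehicles, water, and buildings are never
    in SAFE_CLASSES, so they are avoided by construction.
    """
    safe = np.isin(class_map, SAFE_CLASSES).astype(np.uint8)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(safe)
    if n <= 1:
        return None  # no safe region visible in this frame
    best = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))  # 0 = background
    x, y, w, h = stats[best, :4]
    return (x, y, w, h), tuple(centroids[best])
```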







Steering Algorithm

A.L.A.N. then decides which direction to steer based on the position of the landing zone within the frame: if the zone lies to the left of the frame's centre, it steers left, and vice versa, until the drone lands within the zone.
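A minimal sketch of this decision rule, using the horizontal offset of the zone's centroid from the frame centre; the deadband width is illustrative:

```python
def steering_command(zone_cx, frame_width, deadband=0.05):
    """Convert the zone centroid's horizontal position to a steer in [-1, 1].

    Negative means steer left, positive steer right. A small deadband
    around the frame centre stops the canopy from oscillating once the
    zone is already straight ahead.
    """
    offset = (zone_cx - frame_width / 2) / (frame_width / 2)
    if abs(offset) < deadband:
        return 0.0  # zone centred: fly straight
    return max(-1.0, min(1.0, offset))
```

The resulting command can be fed straight into the reel mapping sketched under 360 Degree Steering.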

Human Detection

A.L.A.N. includes a human detection algorithm that locates every person in the frame, draws a no-fly zone around each of them, and ensures that the selected landing zone stays far away from any humans in the vicinity.
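A sketch of the no-fly-zone check, assuming human bounding boxes (x, y, w, h) come from an upstream detector; the pixel margin is illustrative:

```python
def no_fly_zones(human_boxes, margin=50):
    """Inflate each human bounding box by a safety margin (in pixels)."""
    return [(x - margin, y - margin, w + 2 * margin, h + 2 * margin)
            for (x, y, w, h) in human_boxes]

def zone_is_clear(zone, human_boxes, margin=50):
    """True if the candidate landing zone overlaps no human no-fly zone."""
    zx, zy, zw, zh = zone
    for x, y, w, h in no_fly_zones(human_boxes, margin):
        if zx < x + w and x < zx + zw and zy < y + h and y < zy + zh:
            return False  # zone intersects a no-fly zone: reject it
    return True
```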





Demonstration Videos




Failure Detection
& Deployment

Landing Zone
Detection

Steering & Landing
In the Zone


