Eternal Flight addresses the short battery life of Unmanned Aerial Vehicles (UAVs) by allowing them to operate continuously without landing. This is achieved with a novel system called In-Flight Switching (IFS), in which a larger "parent" drone swaps out the drained battery of a smaller "child" drone, extending the child drone's time in flight. The child drone locates the parent using geolocation and positions itself above it. It then descends slowly and lands on top of the parent, guided by computer vision and AprilTags, latching on with electromagnets to prevent further movement. A custom battery-switching mechanism slides a fresh battery into the battery holder on the child drone, simultaneously pushing out the drained one. Once the swap is complete, the child drone is free to take off with a full battery.
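The swap procedure described above is essentially a fixed sequence of steps. A minimal sketch of that sequence as a state machine, with state names and ordering assumed from this description rather than taken from any actual flight software:

```python
from enum import Enum, auto

class IFSState(Enum):
    """Hypothetical states of the In-Flight Switching (IFS) sequence."""
    LOCATE_PARENT = auto()
    ALIGN_ABOVE = auto()
    DESCEND = auto()
    LATCH = auto()
    SWAP_BATTERY = auto()
    TAKE_OFF = auto()

# Nominal order of the swap procedure described above.
IFS_SEQUENCE = [
    IFSState.LOCATE_PARENT,  # geolocate the parent drone
    IFSState.ALIGN_ABOVE,    # hover directly above it
    IFSState.DESCEND,        # visual servoing on AprilTags
    IFSState.LATCH,          # engage the electromagnets
    IFSState.SWAP_BATTERY,   # slide fresh cell in, push drained cell out
    IFSState.TAKE_OFF,       # release and resume the mission
]

def next_state(state: IFSState) -> IFSState:
    """Advance to the next step of the IFS sequence (wraps to the start)."""
    i = IFS_SEQUENCE.index(state)
    return IFS_SEQUENCE[(i + 1) % len(IFS_SEQUENCE)]
```

A real controller would gate each transition on sensor feedback (tag lock, magnet engagement, battery seated) rather than advancing unconditionally.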
BLIPS stands for Bluetooth Low Energy Indoor Positioning System. Our goal is to track the movement of doctors, nurses, and equipment in an operating room using Bluetooth Low Energy devices placed in employees' ID cards. Three to six Bluetooth beacons mounted on the ceiling of the operating room accurately triangulate each employee's position from RSSI values. This data is then sent to a server that provides real-time location tracking of the wearers in the room. An IMU keeps our processor in a low-power state until movement is detected, in order to conserve battery.
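The positioning math can be sketched as an RSSI-to-distance conversion (log-distance path-loss model) followed by a least-squares position fit from the beacon ranges. The calibration constants below are illustrative assumptions, not BLIPS's measured values:

```python
import numpy as np

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model: estimated distance (m) from an RSSI reading.
    tx_power is the expected RSSI at 1 m and n the path-loss exponent;
    both are assumed calibration values, not the project's constants."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def locate(beacons, distances):
    """Least-squares 2-D position from >= 3 beacon positions and ranges."""
    beacons = np.asarray(beacons, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Linearize by subtracting the last beacon's circle equation
    # |x - p_i|^2 = d_i^2 from each of the others.
    A = 2 * (beacons[:-1] - beacons[-1])
    b = (d[-1] ** 2 - d[:-1] ** 2
         + np.sum(beacons[:-1] ** 2, axis=1)
         - np.sum(beacons[-1] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With more than three beacons the least-squares fit averages out some of the RSSI noise; in practice a smoothing filter over successive fixes would also be needed.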
Communicating with groups of people during an emergency is an extremely important and challenging task. The Cloud Control project aims to create a crowd-control solution using a hexacopter drone. The project consists of a ground control station (GCS) application that records the user and transmits their voice data to the drone, which is equipped with our custom chip and a speaker system that amplifies the voice messages toward a target on the ground. The target use case is to fly the drone over a single person or a group of people and relay a message to them from the GCS through a microphone. The GCS sends the captured voice data to the chip on the drone over a wireless connection; the chip processes the data and outputs it through the attached speaker. The potential applications of this project are very broad, including search-and-rescue operations, active-shooter scenarios, and even recreational events like concerts and festivals.
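The GCS-to-drone audio link can be illustrated as simple packetization of raw PCM audio with sequence numbers, so the chip can reorder datagrams before playback. The header format and payload size here are assumptions for illustration, not the project's actual protocol:

```python
import struct

MAX_PAYLOAD = 1024  # bytes of PCM audio per datagram (assumed, not the project's value)

def packetize(pcm: bytes):
    """GCS side: split a raw PCM buffer into sequence-numbered packets
    for the wireless link. Header: 4-byte big-endian sequence number."""
    packets = []
    for seq, off in enumerate(range(0, len(pcm), MAX_PAYLOAD)):
        chunk = pcm[off:off + MAX_PAYLOAD]
        packets.append(struct.pack(">I", seq) + chunk)
    return packets

def reassemble(packets):
    """Drone side: order packets by sequence number and concatenate audio."""
    ordered = sorted(packets, key=lambda p: struct.unpack(">I", p[:4])[0])
    return b"".join(p[4:] for p in ordered)
```

A live voice relay would stream these packets (e.g. over UDP) and play chunks as they arrive instead of buffering the whole message.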
The LGS Drone Scout is a millimeter-wave radar system capable of detecting a small UAV or drone in a targeted area. Rotating objects such as the propellers on most drones create distinct micro-Doppler signatures in the radar's return signal. By analyzing how the frequency components of this signal change over time, the presence and certain characteristics of a drone can be deduced. The frequencies and their magnitudes will be displayed in real time to show the presence of a drone, along with reports of target-specific features including the number of rotors, velocity, and size of payload. The analog signals produced by the radar will be sampled by an ADC and processed on a Zynq-7000 SoC.
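The frequency-versus-time analysis is a short-time Fourier transform. A minimal sketch: simulate a return whose instantaneous frequency is modulated sinusoidally by a rotating blade, then compute an STFT to expose the micro-Doppler signature. The sample rate, rotor rate, and Doppler deviation are illustrative assumptions, not the Zynq design's parameters:

```python
import numpy as np

fs = 8_000.0    # ADC sample rate (illustrative, not the actual design's rate)
rot_hz = 40.0   # blade rotation rate (assumed)
f_dev = 500.0   # peak Doppler shift from blade-tip velocity (assumed)
t = np.arange(0, 0.25, 1 / fs)

# Simulated radar return: sinusoidal FM from the rotating blade, i.e.
# instantaneous frequency f_dev * cos(2*pi*rot_hz*t).
x = np.cos((f_dev / rot_hz) * np.sin(2 * np.pi * rot_hz * t))

# Short-time FFT: windowed frames reveal how the return's frequency
# content evolves over time (the micro-Doppler signature, whose
# repetition rate gives the rotor rate).
win, hop = 64, 16
frames = [x[i:i + win] * np.hanning(win)
          for i in range(0, len(x) - win, hop)]
spec = np.abs(np.fft.rfft(frames, axis=1))   # time x frequency magnitudes
peak_hz = np.fft.rfftfreq(win, 1 / fs)[spec.argmax(axis=1)]
```

In `spec`, the dominant frequency sweeps periodically up to roughly `f_dev` and back; a steady target without rotating parts would instead show a single constant Doppler line.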
Hands-On Flight focuses on making human-machine interaction natural. Using simple hand and finger gestures, you can control a drone by operating a glove. The movements performed on the glove are sent over Bluetooth to a phone application, which processes them and relays the commands to the drone over WiFi. The drone's throttle is controlled by stretch sensors located on the fingers of the glove, while its direction is controlled by the inertial measurement unit on the back of the glove. A haptic disk creates vibrations for user feedback. The Hands-On Flight project seeks to lead a path to more intuitive technology.
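The sensor-to-command mapping can be sketched in a few lines: scale a stretch-sensor ADC reading to throttle, and turn IMU tilt into a coarse direction command. The calibration endpoints and dead-zone width are assumptions, not the glove's measured values:

```python
def throttle_from_stretch(adc, adc_min=300, adc_max=900):
    """Map a finger stretch-sensor ADC reading to throttle in [0, 1].
    adc_min/adc_max are assumed calibration endpoints."""
    adc = min(max(adc, adc_min), adc_max)
    return (adc - adc_min) / (adc_max - adc_min)

def direction_from_imu(roll_deg, pitch_deg, dead_zone=10.0):
    """Translate hand tilt (IMU roll/pitch) into a coarse drone command.
    A dead zone keeps small, unintentional hand motion from moving the drone."""
    if abs(roll_deg) < dead_zone and abs(pitch_deg) < dead_zone:
        return "hover"
    if abs(pitch_deg) >= abs(roll_deg):
        return "forward" if pitch_deg > 0 else "backward"
    return "right" if roll_deg > 0 else "left"
```

In the actual system this mapping would run in the phone application, between the Bluetooth input from the glove and the WiFi output to the drone.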
In many industrial manufacturing processes, an underlying goal is to manufacture products with high yield while ensuring a financially viable time-to-market. This is especially true for semiconductor manufacturing, where Product Engineers spend significant portions of their time analyzing production data to address sudden yield issues and to discover ways to optimize production yield. Intelligent Engineering Assistant (IEA) is an AI assistant designed to facilitate a Product Engineer's workflow. In the scope of this project, IEA Linguistics, we are extending and enhancing IEA's human-to-machine translation interface using modern NLP techniques. Specifically, the enhancement focuses on understanding the Product Engineer's conversational intents. Standard open-source packages alone are not suitable, as they can produce noisy output in this specialized technical domain. With a more robust NLP interface, IEA can receive natural-language instructions and present data-analysis results in an automated and interactive fashion, such as with visual plots based on various data types. This automated presentation feature effectively reduces the time spent on mechanical tasks and leaves the Product Engineer with more bandwidth for critical-thinking tasks.
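The intent-understanding interface can be illustrated with a deliberately simple stand-in: a keyword-overlap classifier mapping an utterance to one of a few intents. The intent labels and lexicons are hypothetical; the real IEA pipeline uses modern NLP models rather than keyword matching:

```python
# Hypothetical intents and keyword lexicons for illustration only.
INTENT_LEXICON = {
    "plot_yield": {"plot", "chart", "trend", "yield"},
    "compare_lots": {"compare", "versus", "lots", "difference"},
    "find_outliers": {"outlier", "anomaly", "fail", "drop"},
}

def classify_intent(utterance: str) -> str:
    """Return the intent whose lexicon overlaps the utterance's tokens most,
    or "unknown" when nothing matches."""
    tokens = set(utterance.lower().split())
    scores = {intent: len(tokens & words)
              for intent, words in INTENT_LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"
```

A model-based classifier fills the same interface slot but generalizes to paraphrases and domain jargon that a fixed lexicon cannot cover, which is the gap this project targets.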
NASA astronauts are routinely faced with long, complex procedures and often have to call ground control for guidance. This presents a major problem for prospective missions: a message from Mars, for instance, can take up to 20 minutes to travel from one planet to the other. In collaboration with NASA's Ames Research Center, our project, Watchdog, aims to pioneer solutions by using AI to verify astronauts' fidelity to standard operating procedure and offer suggestions when they deviate. The project fuses two approaches: neural-network-based real-time semantic analysis of video, and a network of sensors embedded in astronaut equipment. It aims to identify specific subtasks and understand the ordinal relationships between them in a reliable and generalizable way, with the goal of laying the foundation for a more comprehensive autonomous mission guidance system.
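Once subtasks are identified, checking fidelity to procedure reduces to verifying their ordinal relationships. A minimal sketch, where the subtask labels and ordering constraints are hypothetical examples (in Watchdog they would come from the video model and sensor network, not be given directly):

```python
# Hypothetical before/after constraints from a standard operating procedure.
ORDER_CONSTRAINTS = [
    ("open_panel", "remove_filter"),
    ("remove_filter", "install_filter"),
    ("install_filter", "close_panel"),
]

def check_procedure(observed):
    """Return the first (before, after) constraint the observed subtask
    sequence violates, or None if the sequence is consistent with the SOP."""
    index = {task: i for i, task in enumerate(observed)}
    for before, after in ORDER_CONSTRAINTS:
        if before in index and after in index and index[before] > index[after]:
            return (before, after)
    return None
```

Representing the SOP as a partial order rather than a single fixed sequence lets the checker accept legitimate reorderings of independent subtasks while still flagging genuine deviations.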