The AERPAW Find-a-Rover (AFAR) Challenge was completed successfully in December 2023. Based on field testing of the submissions from the finalist teams, the rankings for the fast estimate and the final estimate are as follows.
Fast (3 Minute) Estimate Ranking
1. Eagles, University of North Texas
2. NYU Wireless, NYU
3. Team SunLab, University of Georgia
4. Team Wolfpack, NC State University
5. Daedalic Wings, NC State University

Final (10 Minute) Estimate Ranking
1. Eagles, University of North Texas
2. Team SunLab, University of Georgia
3. NYU Wireless, NYU
4. Daedalic Wings, NC State University
5. Team Wolfpack, NC State University
Additional details of our evaluation in the development and testbed environments are included here, and further information on the individual solution approaches from each team is described here. We congratulate the winning teams on their performance and thank all five finalist teams for their participation in the AFAR Challenge!
Call for Participation: AERPAW Autonomous UAV Student Challenge
Challenge #1: AERPAW Find A Rover (AFAR) Challenge
Summary: The AERPAW platform is planning to host a series of autonomous unmanned aerial vehicle (UAV) student competitions. These competitions will require the use of the autonomous navigation, wireless communication, and wireless sensing capabilities of the AERPAW platform. During the first round of each competition, the experimenters will be expected to develop and test their UAV and radio frequency (RF) software in AERPAW’s virtual development (digital twin) environment. Competitor software that satisfies the minimum success criteria in the digital twin environment will then be deployed in the real testbed environment, without any modifications, for the second (and final) round of the competition. AERPAW is also planning to organize a number of data challenges based on the data posted at https://aerpaw.org/experiments/datasets/.
AFAR Challenge Overview: The first challenge is titled “AERPAW Find A Rover (AFAR) Challenge”. In this challenge, the UAV is expected to localize an unmanned ground vehicle (UGV) as accurately and as quickly as possible. A software-defined radio (SDR) carried by the UAV will continuously receive a channel-sounding waveform based on the GE2 example experiment in the AERPAW user manual [1]. There is only one transmit antenna and one receive antenna. The sounding waveform is narrowband, with a bandwidth of 125 kHz. For this particular challenge, the competitors are expected to use the narrowband waveform provided in the example experiment (GE2) and are not allowed to change the waveform parameters at the UGV. The antenna patterns of both the transmit and receive antennas, as well as a geographical map of the environment, are also provided to the participants. The participants can use fixed waypoints for the UAV, or they can develop their own trajectory update algorithm that decides which waypoint the UAV should fly to next based on the observed signal strength from the signal source.
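As a rough illustration of the kind of measurement the UAV-side code works with, the Python sketch below estimates an average received power value, in dB, from a block of complex baseband samples. It is only a minimal sketch: the function name, the sample block, and the scaling are illustrative assumptions and are not taken from the GE2 example code.

    import numpy as np

    def received_power_db(iq_samples):
        # Average power of a block of complex baseband samples, on a relative dB scale.
        power_linear = np.mean(np.abs(iq_samples) ** 2)
        return 10.0 * np.log10(power_linear + 1e-12)  # small floor avoids log10(0)

    # Hypothetical usage with placeholder noise-only samples standing in for SDR output
    samples = (np.random.randn(25000) + 1j * np.random.randn(25000)).astype(np.complex64)
    print(f"Estimated received power: {received_power_db(samples):.1f} dB")

In practice, the values produced by the actual channel sounder (and their units) may differ; the point is only that each measurement pairs a received power value with the UAV position at which it was taken.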
Figure: (a) a UAV localizing a signal source along a trajectory with fixed waypoints; (b) a UAV making autonomous decisions during flight to localize a signal source; (c) measurements at a UAV showing the variation of received power, with deep fades due to multipath and the 3D antenna radiation pattern, as a function of distance from the signal source on the ground.
For a basic example of UAV-based ground signal source localization with fixed waypoints, the participants may check [2]; a high-level overview is also illustrated in the figure above. Figure (a) shows that as the UAV flies and accumulates more measurements, the estimated location of the signal source approaches its true location. How exactly the participants localize the signal source based on the measurements, e.g., which UAV measurement data are used in the localization process and with what localization algorithm, is up to the participants. Figure (b) shows an example of an autonomous flight [3] in which the UAV changes its trajectory autonomously (using a gradient descent approach) based on the measurements from the signal source. This differs from the approach in Figure (a), where fixed waypoints are used; artificial intelligence (AI) based approaches can also be deployed to improve localization accuracy by flying the UAV intelligently based on the received signal observations. The received power measurements in Figure (c) show that, due to ground reflections and the antenna characteristics at the transmitter and receiver, the received signal power can vary by as much as 40 dB during the UAV’s flight. In other words, a degradation in received power is not necessarily due to the UAV moving away from the signal source; it may instead be due to shadowing and fast fading. These effects should all be taken into account when designing the navigation algorithm for the UAV. Antenna radiation patterns for the transmit and receive antennas at the UAV and UGV are also available here, in case the participants decide to use them. The participants are free to start with the reference code for autonomous UAV navigation and rover search provided in [4], [5], or to implement their own localization algorithm. Additional instructions related to the AFAR challenge are provided in [6].
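To make the idea in Figure (b) concrete, here is a minimal, hedged Python sketch of a gradient-following waypoint update: it fits a local plane to recent (position, received power) pairs and steps toward increasing power. All names and parameters are hypothetical, and this is not the reference code from [4], [5].

    import numpy as np

    def next_waypoint(positions, powers_db, step_m=40.0):
        # positions: recent UAV (east, north) coordinates in meters (at least three,
        # not all collinear); powers_db: received power measured at those positions.
        p = np.asarray(positions, dtype=float)
        w = np.asarray(powers_db, dtype=float)
        # Fit power ~ a*east + b*north + c; (a, b) approximates the local power gradient.
        A = np.column_stack([p[:, 0], p[:, 1], np.ones(len(p))])
        coeffs, *_ = np.linalg.lstsq(A, w, rcond=None)
        grad = coeffs[:2]
        norm = np.linalg.norm(grad)
        if norm < 1e-6:           # flat or noisy fit: stay near the current position
            return tuple(p[-1])
        return tuple(p[-1] + step_m * grad / norm)  # step toward increasing power

Because of the deep fades visible in Figure (c), a raw gradient like this would likely need averaging or smoothing over several measurements before it can be trusted.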
Scoring Metrics: We will consider two different scoring metrics for evaluating localization performance.
Fast Localization Accuracy (FLA): This scoring metric requires the experiment code to submit (log) a location estimate at 3 minutes after the launch of the UAV, using only the measurements collected within the first 3 minutes. Submissions will be ranked from lowest to highest localization error.
Long-Term Localization Accuracy (LTLA): This scoring metric considers the total UAV flight time of 10 minutes for calculating the final location estimate. At the end of the 10-minute flight, the experiment code has to submit a final location estimate using any and all measurements. Submissions will again be ranked from lowest to highest localization error.
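One possible way to satisfy both metrics with a single experiment is to keep a running location estimate and snapshot it at the 3-minute and 10-minute marks. The sketch below uses a simple power-weighted centroid purely as an illustrative estimator; the data structures and names are assumptions, not part of the challenge code.

    import numpy as np

    def weighted_centroid(positions, powers_db):
        # Power-weighted centroid of the measurement positions (illustrative estimator only).
        p = np.asarray(positions, dtype=float)
        w = 10 ** (np.asarray(powers_db, dtype=float) / 10.0)  # dB -> linear weights
        return tuple(np.average(p, axis=0, weights=w))

    CHECKPOINTS = {"FLA": 180, "LTLA": 600}  # scoring checkpoints in seconds

    def log_estimates(measurement_log):
        # measurement_log: list of (t_seconds, (east, north), power_db) tuples.
        estimates = {}
        for name, deadline in CHECKPOINTS.items():
            usable = [(pos, pw) for t, pos, pw in measurement_log if t <= deadline]
            if usable:
                positions, powers = zip(*usable)
                estimates[name] = weighted_centroid(positions, powers)
        return estimates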
Each submitted code from the competitors will be run three times in the development environment, changing the location of the UGV each time. The accuracy will be evaluated individually for each UGV location (for both the FLA and LTLA criteria), and a linear average will be calculated to obtain the final accuracy metric for ranking purposes. The flight area of the UAV (the trajectory to be designed by the experimenters) will be restricted to the blue rectangle in the image below, while the location of the UGV (for each of the three random deployments) will be restricted to the green rectangle. The experimenters are allowed to fly the UAV at any altitude between 20 meters and 100 meters.
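As an illustration of how a submission might respect these flight restrictions, the sketch below clamps a candidate waypoint into an allowed latitude/longitude rectangle and the 20 to 100 meter altitude band. The coordinate values shown are placeholders only; the official bounding coordinates are the ones published with the challenge materials.

    # Placeholder bounds only; substitute the official bounding coordinates.
    UAV_RECT = {"lat_min": 35.7270, "lat_max": 35.7290,    # hypothetical values
                "lon_min": -78.6990, "lon_max": -78.6960}  # hypothetical values
    ALT_MIN_M, ALT_MAX_M = 20.0, 100.0

    def clamp_waypoint(lat, lon, alt_m):
        # Clamp a candidate waypoint into the allowed flight rectangle and altitude band.
        lat = min(max(lat, UAV_RECT["lat_min"]), UAV_RECT["lat_max"])
        lon = min(max(lon, UAV_RECT["lon_min"]), UAV_RECT["lon_max"])
        alt_m = min(max(alt_m, ALT_MIN_M), ALT_MAX_M)
        return lat, lon, alt_m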
Bounding longitude and latitude coordinates that define the rectangles above are given as follows.
There will be a “single” submission from each team that will be evaluated separately based on the two criteria above. In other words, there will be only a single experiment for each submission. The submission should include two location estimates: one logged at 3 minutes into the UAV flight, and another logged at 10 minutes into the UAV’s flight.
Participating teams can choose to tailor their approach towards one or the other scoring metric, or they can aim to address them simultaneously.
Each experiment will be run for a maximum of 10 minutes.
A maximum UAV speed of 10 meters per second will be enforced. AERPAW reserves the right to modify the competition rules if doing so will improve the competition format. Any rule changes will be posted on this website.
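If the trajectory code issues velocity setpoints, a simple safeguard is to scale any commanded velocity so its magnitude never exceeds the limit, as in the minimal sketch below; the function and its use are assumptions, since the platform itself also enforces the limit.

    import math

    MAX_SPEED_MPS = 10.0

    def clamp_velocity(vx, vy, vz):
        # Scale a commanded velocity vector so its magnitude stays within 10 m/s.
        speed = math.sqrt(vx * vx + vy * vy + vz * vz)
        if speed <= MAX_SPEED_MPS or speed == 0.0:
            return vx, vy, vz
        scale = MAX_SPEED_MPS / speed
        return vx * scale, vy * scale, vz * scale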
Each team should have a faculty advisor who will be responsible for initiating a project in the AERPAW Experimental Web Portal as a PI and inviting student team members to participate in the project.
The institution of each team member (whether in the United States or international) must be listed as an organization in CILogon; this is required for each participant to be authenticated properly and to access AERPAW’s Experimental Web Portal (you can check this by accessing the Experimental Web Portal from aerpaw.org).
Teams cannot have more than four student members. Both graduate and undergraduate students are eligible to participate in this competition.
Competition Awards:
There will be two sets of awards, one for each of the FLA and LTLA criteria, for the top three performing teams:
FLA Criteria Awards:
1st Place Award: $2000, 2nd Place Award: $1250, and 3rd Place Award: $750
LTLA Criteria Awards:
1st Place Award: $2000, 2nd Place Award: $1250, and 3rd Place Award: $750
If a team ranks first under both criteria, it can win up to $4,000. The awards for this competition will be sponsored by Galaxy Unmanned Systems, Unmanned Experts, and the AnyMile Platform by Mitsubishi Electric Corporation.
Important Dates
The participants are expected to form teams with up to 4 students and a faculty advisor and apply to participate in the competition by the application deadline provided below. The lead participant should fill out this registration form within the registration window.
[2] Hyeokjun Kwon and Ismail Guvenc, “RF Signal Source Search and Localization Using an Autonomous UAV with Predefined Waypoints”, to appear in IEEE Vehic. Tech. Conf. (VTC), June 2023. [Available Online]: https://arxiv.org/abs/2301.07027