Call for Participation: AERPAW Autonomous UAV Student Challenge

Challenge #1: AERPAW Find A Rover (AFAR) Challenge

The AERPAW platform is planning to host a series of autonomous unmanned aerial vehicle (UAV) student competitions. These competitions will require the use of autonomous navigation, wireless communication, and wireless sensing capabilities in the AERPAW platform. During the first round of each competition, experimenters will be expected to develop and test their UAV and radio frequency (RF) software in AERPAW's virtual emulation (development) environment — see the diagram below. Competitor software that satisfies the minimum success criteria in the emulation environment will then be deployed in the real testbed environment, without any modifications, for the final round of the competition.

The first challenge is titled “AERPAW Find A Rover (AFAR) Challenge”. In this challenge, the UAV is expected to localize an unmanned ground vehicle (UGV) as accurately and as quickly as possible. A software-defined radio (SDR) carried by the UAV will be continuously receiving a channel-sounding waveform that is based on the GE2 sample experiment in the AERPAW user manual [1]. Antenna patterns for both the transmitter and receiver antennas as well as the geographical map of the environment will also be provided to the participants. The participants can use fixed waypoints for the UAV or they can develop their own trajectory update algorithm for instructing which waypoint the UAV should fly next based on the observed signal strength from the signal source.

Figure: (a) A UAV localizing a signal source along a trajectory with fixed waypoints; (b) a UAV making autonomous decisions during the flight to localize a signal source; (c) measurements at a UAV showing the variation of received power, with deep fades due to multipath and 3D antenna radiation, as a function of the distance from the signal source on the ground.

For a basic example of UAV-based ground signal source localization with fixed waypoints, the participants may check [2]; a high-level overview is also illustrated in the figure above. In Figure (a), it can be seen that as the UAV flies, it accumulates more measurements, and the estimated location of the signal source approaches its true location. How exactly the signal source is localized based on the measurements — e.g., which UAV measurement data are used in the localization process and with what localization algorithm — is up to each team. Figure (b) shows an example of an autonomous flight [3] where the UAV changes its trajectory autonomously (using a gradient descent approach) based on the measurements from the signal source. This is different from the approach in Figure (a), where fixed waypoints are used; artificial intelligence (AI) based approaches can be employed to improve localization accuracy by flying the UAV intelligently based on received signal observations. Received power measurements in Figure (c) show that, due to ground reflections and antenna characteristics at the transmitter and receiver, the received signal power can vary by as much as 40 dB during the UAV's flight. In other words, a drop in received power may not necessarily mean the UAV is moving away from the signal source; it may instead be caused by shadowing and fast fading. These effects should all be taken into account when designing the navigation algorithm for the UAVs. Antenna radiation patterns for the transmit and receive antennas at the UAV and UGV are also available here, in case the participants decide to use them.
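To illustrate the kind of measurement-driven trajectory update described above, the sketch below shows a minimal greedy waypoint selector that smooths recent received-power samples (to suppress the fast fading visible in Figure (c)) before deciding whether to keep the current heading or turn. This is a hypothetical illustration only — the function names, the 90-degree turn rule, and all parameters are assumptions, not part of the AERPAW platform or the referenced algorithms.

```python
import math

def next_waypoint(position, heading_deg, rssi_history_db,
                  step_m=30.0, window=5):
    """Greedy gradient-style waypoint update (illustrative sketch).

    Compares the average of the most recent `window` RSSI samples (dB)
    against the previous window; a smoothed drop suggests the UAV is
    moving away from the source (or into a fade), so it turns 90 degrees.

    Returns (new_waypoint_xy, new_heading_deg).
    """
    if len(rssi_history_db) >= 2 * window:
        prev = sum(rssi_history_db[-2 * window:-window]) / window
        curr = sum(rssi_history_db[-window:]) / window
        if curr < prev:
            # Smoothed power dropped: change direction.
            heading_deg = (heading_deg + 90.0) % 360.0
    rad = math.radians(heading_deg)
    x, y = position
    return (x + step_m * math.cos(rad), y + step_m * math.sin(rad)), heading_deg
```

Windowed averaging is one simple way to avoid reacting to a single deep fade; teams may of course prefer Kalman filtering, particle filters, or learned policies instead.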

We will consider two different scoring metrics for evaluating localization performance. 

1) Localization accuracy with fixed flight time (LAFFT): This scoring metric fixes the total UAV flight time at 10 minutes. At the end of the flight duration, the localization error of the final location estimate is calculated. Submissions will be ranked in increasing order of localization error; the smallest error within the given time wins.

2) Localization time with fixed localization accuracy (LTFLA): This scoring metric considers how quickly the UAV can achieve a threshold localization accuracy, which we set as 10 meters for this competition. Submissions will be ranked in increasing order of localization time; the shortest time to satisfy the accuracy threshold wins.
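The two metrics above can be sketched as functions of a log of timestamped location estimates produced during a flight. This is a hedged illustration of how we read the rules, not official scoring code; the log format, function names, and ground-truth handling are assumptions.

```python
import math

def localization_error(estimate_xy, true_xy):
    """Euclidean distance (meters) between an estimate and ground truth."""
    return math.dist(estimate_xy, true_xy)

def lafft_score(estimate_log, true_xy, flight_time_s=600.0):
    """LAFFT: error of the last estimate produced within the 10-minute flight.

    estimate_log is a list of (timestamp_s, (x, y)) pairs, in time order.
    Lower scores rank higher.
    """
    in_time = [(t, est) for t, est in estimate_log if t <= flight_time_s]
    if not in_time:
        return float("inf")  # no estimate produced in time
    _, final_estimate = in_time[-1]
    return localization_error(final_estimate, true_xy)

def ltfla_score(estimate_log, true_xy, threshold_m=10.0):
    """LTFLA: earliest time at which an estimate is within 10 m of truth.

    Lower scores rank higher; inf if the threshold is never met.
    """
    for t, est in estimate_log:
        if localization_error(est, true_xy) <= threshold_m:
            return t
    return float("inf")
```

Under this reading, a single flight log yields both scores, which matches the single-submission rule described below.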

There will be a “single” submission from each team that will be evaluated separately under the two criteria above. In other words, there will be only a single experiment for each submission. Participating teams can choose to tailor their approach toward one scoring metric or the other, or they can aim to address both simultaneously. Each experiment will run for a maximum duration of 10 minutes, and a maximum UAV speed of 10 meters per second will be enforced. AERPAW reserves the right to modify the competition rules if doing so improves the competition format. Any rule changes will be posted on this website.

There will be two sets of awards for each of the LAFFT and LTFLA criteria for the top three performing teams: 

1st Place Award: $1500, 2nd Place Award: $1000, and 3rd Place Award: $500

A team that places first under both criteria can win up to $3,000. The awards for this competition will be sponsored by the IEEE Vehicular Technology Society Ad Hoc Committee on Drones and the AnyMile Platform by Mitsubishi Electric Corporation.

Participants are expected to form teams of up to four students and a faculty advisor, and to apply by the application deadline provided below. The lead participant should fill out this registration form before the application deadline.

Important Dates 

  • Application Deadline: June 15, 2023 (extended from May 30, 2023)
  • First Round Submissions: August 15, 2023 (extended from August 1, 2023)
  • Final Round (Testbed) Evaluation: September 1, 2023 

Questions related to the competition can be posted on the AERPAW Users Email Forum or directed to


[1] AERPAW GNU Radio Experiment (GE) 2 Channel Sounder Experiment. [Available Online]:

[2] Hyeokjun Kwon and Ismail Guvenc, “RF Signal Source Search and Localization Using an Autonomous UAV with Predefined Waypoints”, to appear in IEEE Vehicular Technology Conference (VTC), June 2023. [Available Online]:

[3] AERPAW Monthly Updates. [Available Online]: