NU-AIR: A Neuromorphic Urban Aerial Dataset for Detection and Localization of Pedestrians and Vehicles

Craig Iaboni, Thomas Kelly, Pramod Abichandani

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents an open-source aerial neuromorphic dataset that captures pedestrians and vehicles moving in an urban environment. The dataset, titled NU-AIR, features over 70 minutes of event footage acquired with a 640 × 480 resolution neuromorphic sensor mounted on a quadrotor operating in an urban environment. Crowds of pedestrians, different types of vehicles, and busy urban street scenes are captured at different elevations and under varying illumination conditions. Manual bounding box annotations of the vehicles and pedestrians contained in the recordings are provided at a frequency of 30 Hz, yielding more than 93,000 labels in total. A baseline evaluation for this dataset was performed using three Spiking Neural Networks (SNNs) and ten Deep Neural Networks (DNNs). All data and the Python code to voxelize the data and subsequently train SNNs/DNNs have been open-sourced.
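
The abstract mentions voxelizing the event data before training SNNs/DNNs. The sketch below is not the authors' released code; it is a minimal, illustrative Python example of turning an event stream of (x, y, t, p) tuples into a voxel-grid tensor, assuming the 640 × 480 sensor resolution stated above. The function name, field names, and bin count are assumptions for illustration only.

```python
import numpy as np

def events_to_voxel_grid(events, num_bins=5, width=640, height=480):
    """Accumulate event polarities into a (num_bins, height, width) grid.

    `events` is assumed to be a dict of equal-length arrays with keys
    "x", "y", "t" (timestamps), and "p" (polarity in {0, 1}).
    """
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    x = events["x"].astype(np.int64)
    y = events["y"].astype(np.int64)
    t = events["t"].astype(np.float64)
    p = events["p"].astype(np.float32) * 2.0 - 1.0  # map {0, 1} -> {-1, +1}

    # Normalize timestamps to [0, num_bins) and assign each event to a temporal bin.
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9) * (num_bins - 1e-6)
    bins = t_norm.astype(np.int64)

    # Scatter-add polarities into the voxel grid at (bin, row, column).
    np.add.at(voxel, (bins, y, x), p)
    return voxel

# Example usage with synthetic events:
# rng = np.random.default_rng(0)
# events = {
#     "x": rng.integers(0, 640, 10_000),
#     "y": rng.integers(0, 480, 10_000),
#     "t": np.sort(rng.uniform(0, 1e6, 10_000)),
#     "p": rng.integers(0, 2, 10_000),
# }
# grid = events_to_voxel_grid(events)  # shape (5, 480, 640)
```

The resulting dense tensor can be fed to conventional DNN detectors or converted to spike trains for SNN training; the exact representation used for the published baselines may differ.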

Original language: English (US)
Article number: e0217049
Journal: International Journal of Computer Vision
DOIs
State: Accepted/In press - 2025

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence

Keywords

  • Computer vision
  • Dataset
  • Event cameras
  • Spiking neural networks
  • UAVs
