Automatic road detection in traffic videos

Hadi Ghahremannezhad, Hang Shi, Chengjun Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

13 Scopus citations

Abstract

Automatic road detection is a challenging and representative computer vision problem due to the wide range of illumination and weather conditions in real traffic. This paper presents a novel real-time road detection method that accurately and robustly extracts the road region in real traffic videos under adverse illumination and weather conditions. Specifically, the innovative global foreground modeling (GFM) method is first applied to robustly model the ever-changing background in the traffic scene and to accurately detect the regions of the moving objects, namely the vehicles on the road. The regions of the moving vehicles are reasonably assumed to lie on the road, and they are utilized to generate a total of seven probability maps. In particular, four of these maps are derived from the color values in the RGB and HSV color spaces. Two additional probability maps are calculated from the two normalized histograms corresponding to the road and the non-road pixels in the RGB and grayscale color spaces, respectively. The last probability map is computed from the edges detected by the Canny edge detector and the regions located by the flood-fill algorithm. Finally, a novel automatic road detection method integrates these seven probability maps based on their probability values to define a final probability mask for accurate and robust road detection in video.
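
The fusion described in the abstract can be illustrated with a short sketch. The Python/OpenCV code below is a minimal sketch, not the authors' implementation: it builds an edge-based map from Canny edges plus a flood fill seeded inside the presumed road region, then averages several per-pixel probability maps into a binary road mask. The function names, seed point, Canny thresholds, simple averaging, and the 0.5 cutoff are illustrative assumptions rather than the paper's actual integration rule.

import cv2
import numpy as np

def edge_floodfill_map(frame_gray, seed):
    # Edge/flood-fill map: the seed point is an assumption, e.g. a pixel
    # just below a detected moving vehicle (assumed to lie on the road).
    edges = cv2.Canny(frame_gray, 50, 150)
    barrier = cv2.dilate(edges, np.ones((3, 3), np.uint8))
    h, w = frame_gray.shape
    # Flood fill cannot cross non-zero mask pixels, so the dilated edges
    # confine the fill to the area bounded by lane markings and curbs.
    mask = np.zeros((h + 2, w + 2), np.uint8)
    mask[1:-1, 1:-1] = (barrier > 0).astype(np.uint8)
    filled = np.zeros((h, w), np.uint8)
    cv2.floodFill(filled, mask, seed, 255)
    return filled.astype(np.float32) / 255.0

def fuse_probability_maps(prob_maps, threshold=0.5):
    # Average a list of per-pixel probability maps (values in [0, 1])
    # and threshold the result into a binary road mask.
    fused = np.mean(np.stack(prob_maps, axis=0), axis=0)
    return (fused >= threshold).astype(np.uint8) * 255

# Illustrative usage on one grayscale frame; in the paper the remaining
# maps come from RGB/HSV color statistics and road/non-road histograms.
# frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
# maps = [edge_floodfill_map(frame, (320, 400)), color_map, histogram_map]
# road_mask = fuse_probability_maps(maps)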

Original language: English (US)
Title of host publication: Proceedings - 2020 IEEE International Symposium on Parallel and Distributed Processing with Applications, 2020 IEEE International Conference on Big Data and Cloud Computing, 2020 IEEE International Symposium on Social Computing and Networking and 2020 IEEE International Conference on Sustainable Computing and Communications, ISPA-BDCloud-SocialCom-SustainCom 2020
Editors: Jia Hu, Geyong Min, Nektarios Georgalas, Zhiwei Zhao, Fei Hao, Wang Miao
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 777-784
Number of pages: 8
ISBN (Electronic): 9781665414852
DOIs
State: Published - Dec 2020
Externally published: Yes
Event: 18th IEEE International Symposium on Parallel and Distributed Processing with Applications, 10th IEEE International Conference on Big Data and Cloud Computing, 13th IEEE International Symposium on Social Computing and Networking and 10th IEEE International Conference on Sustainable Computing and Communications, ISPA-BDCloud-SocialCom-SustainCom 2020 - Virtual, Exeter, United Kingdom
Duration: Dec 17, 2020 - Dec 19, 2020

Publication series

Name: Proceedings - 2020 IEEE International Symposium on Parallel and Distributed Processing with Applications, 2020 IEEE International Conference on Big Data and Cloud Computing, 2020 IEEE International Symposium on Social Computing and Networking and 2020 IEEE International Conference on Sustainable Computing and Communications, ISPA-BDCloud-SocialCom-SustainCom 2020

Conference

Conference: 18th IEEE International Symposium on Parallel and Distributed Processing with Applications, 10th IEEE International Conference on Big Data and Cloud Computing, 13th IEEE International Symposium on Social Computing and Networking and 10th IEEE International Conference on Sustainable Computing and Communications, ISPA-BDCloud-SocialCom-SustainCom 2020
Country/Territory: United Kingdom
City: Virtual, Exeter
Period: 12/17/20 - 12/19/20

All Science Journal Classification (ASJC) codes

  • Hardware and Architecture
  • Renewable Energy, Sustainability and the Environment
  • Computational Mathematics
  • Social Sciences (miscellaneous)
  • Communication
  • Artificial Intelligence
  • Computer Networks and Communications
  • Computer Science Applications

Keywords

  • RoI determination
  • Road detection
  • Traffic video analytics
