Traffic Participants Detection and Classification Using YOLO Neural Network

Authors

  • Fahmida Sultana Mim, Faculty, Rabindra Maitree University, Kushtia, Bangladesh.
  • S. M. Naimur Rhaman Sayam, Faculty, Rabindra Maitree University, Kushtia, Bangladesh.
  • Md. Tanvir Amin, Faculty, Rabindra Maitree University, Kushtia, Bangladesh.

DOI:

https://doi.org/10.5281/zenodo.6844336

Keywords:

Deep Convolutional Neural Networks, Traffic Participants, YOLOv4, Object Detection, Classification

Abstract

The detection and classification of traffic participants is one of the most important requirements for next-generation traffic monitoring systems, autonomous driving technology, and Advanced Driving Assistance Systems (ADAS). Although tremendous progress has been made in object detection and classification research, we focus on the specific task of detecting and classifying traffic participants in traffic scenes. In this work, we chose a Deep Convolutional Neural Network-based object detection algorithm, YOLOv4 (You Only Look Once, Version 4), to detect and classify traffic participants accurately and at high speed. The main contributions of our work are as follows: first, we built a custom image dataset of traffic participants (car, bus, truck, pedestrian, traffic light, traffic sign, vehicle registration plate, motorcycle, ambulance, bicycle wheel). We then ran K-means clustering on the dataset to design the anchor boxes, which adapts the detector to objects at various small and medium scales. Finally, we trained the network on the listed object classes and tested it under several driving conditions (daylight, low light, heavy traffic, fog, rain, etc.). The results reached a mean Average Precision (mAP) of up to 65.95%, with a detection time of around 0.054 s per image.
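The anchor-design step described in the abstract follows the standard YOLO practice of clustering ground-truth box dimensions with an IoU-based distance rather than Euclidean distance. The sketch below is a minimal illustration of that step, not the authors' code: it assumes the dataset labels have already been parsed into normalized (width, height) pairs, and the file name, function names, and k = 9 (YOLOv4's default anchor count) are assumptions for illustration.

```python
# Minimal sketch of anchor-box design via K-means with a 1 - IoU distance,
# the convention used by the YOLO family. Box sizes are assumed to be
# normalized (width, height) pairs extracted from the custom dataset labels.
import numpy as np

def iou_wh(boxes, anchors):
    """IoU between (N, 2) box sizes and (K, 2) anchor sizes, centers aligned."""
    inter = (np.minimum(boxes[:, None, 0], anchors[None, :, 0]) *
             np.minimum(boxes[:, None, 1], anchors[None, :, 1]))
    union = (boxes[:, 0] * boxes[:, 1])[:, None] + \
            (anchors[:, 0] * anchors[:, 1])[None, :] - inter
    return inter / union

def kmeans_anchors(boxes, k=9, iters=100, seed=0):
    """Cluster box sizes into k anchors; nearest anchor = highest IoU."""
    rng = np.random.default_rng(seed)
    anchors = boxes[rng.choice(len(boxes), k, replace=False)]
    for _ in range(iters):
        assign = np.argmax(iou_wh(boxes, anchors), axis=1)
        new = np.array([boxes[assign == j].mean(axis=0) if np.any(assign == j)
                        else anchors[j] for j in range(k)])
        if np.allclose(new, anchors):  # converged
            break
        anchors = new
    return anchors[np.argsort(anchors.prod(axis=1))]  # sort by area

# Hypothetical usage: "train_wh.txt" would hold one (width, height) pair
# per ground-truth box from the custom traffic-participant dataset.
# boxes = np.loadtxt("train_wh.txt")
# print(kmeans_anchors(boxes, k=9))
```

In YOLOv4 the nine resulting anchors are typically split three per detection head, with the smallest anchors assigned to the highest-resolution scale, which is what lets the clustered priors cover the small and medium object sizes the abstract mentions.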

Published

2022-07-06

How to Cite

Fahmida Sultana Mim, S. M. Naimur Rhaman Sayam, & Md. Tanvir Amin. (2022). Traffic Participants Detection and Classification Using YOLO Neural Network. LC International Journal of STEM (ISSN: 2708-7123), 3(2), 9-18. https://doi.org/10.5281/zenodo.6844336