Vehicle Number Plate Detection and Verification Using YOLO Frameworks
Keywords: Number plate detection, YOLO, Convolutional Neural Network (CNN), Optical Character Recognition (OCR)
Vehicle number plate detection is the process of identifying and localizing license plates in images or videos, and it is crucial in applications such as toll collection, law enforcement, parking management, traffic monitoring, and access control systems. Number plate detection is a mature but imperfect technology: traditional detection algorithms are easily affected by environmental factors such as lighting, shadow, and background complexity, degrading both accuracy and efficiency. With the development of deep learning, YOLO (You Only Look Once) has emerged as an outstanding image-processing technique that leverages convolutional neural networks (CNNs) to identify and recognize license plates within images and videos. This project applies two YOLO frameworks (YOLOv3 and YOLOv4) to accurate number plate detection. The system compares the effectiveness of the two frameworks, outlining each one's advantages and disadvantages to determine which performs best in terms of confidence level and accuracy. Verification is applied to Malaysian number plates following the regulations of the Malaysian transportation system, distinguishing standard from non-standard plates using the YOLO frameworks. To accomplish this, a dataset of images and video is manually captured for training and evaluation. By training the frameworks separately, each one learns to detect the plate. Based on the desired outcome, accuracy rates of 90–95% or higher must be attained to confirm good performance for YOLOv3 and YOLOv4, which perform significantly better than conventional detection techniques. Overall, the project will continue to develop the YOLO frameworks to provide recognition performance metrics based on the classification results.
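To illustrate the confidence-level comparison the abstract refers to, the sketch below shows the post-processing step that YOLO-style detectors (including YOLOv3 and YOLOv4) apply to candidate plate boxes: discard detections below a confidence threshold, then suppress overlapping duplicates with non-maximum suppression. The box coordinates, scores, and thresholds here are illustrative, not taken from the paper.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def filter_detections(boxes, scores, conf_thresh=0.5, iou_thresh=0.4):
    """Keep boxes above the confidence threshold, then apply greedy NMS."""
    candidates = [(s, b) for s, b in zip(scores, boxes) if s >= conf_thresh]
    candidates.sort(key=lambda x: x[0], reverse=True)  # highest score first
    kept = []
    for score, box in candidates:
        # keep a box only if it does not overlap a higher-scoring kept box
        if all(iou(box, kb) < iou_thresh for _, kb in kept):
            kept.append((score, box))
    return kept

# Two overlapping candidate plates plus one low-confidence detection:
boxes = [(10, 10, 110, 60), (12, 12, 112, 62), (200, 50, 300, 100)]
scores = [0.9, 0.8, 0.3]
result = filter_detections(boxes, scores)
```

In practice this filtering is performed by the framework itself (e.g. OpenCV's `cv2.dnn.NMSBoxes` when running Darknet weights), but the logic above is what determines which plate detection is finally reported with its confidence score.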
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.