Real-Time Egg Detection Using Edge Computer Vision

Our work on real-time egg tracking, which involves egg detection, stable counting, and grading, will be presented at the First International Conference on Accessible Digital Agriculture Technologies (CADAT 2024) on 19 November 2024 in Valencia, Spain.

The adoption of Artificial Intelligence (AI) in agriculture and animal husbandry has accelerated in recent years, driven by the versatility and relatively low cost of developing and deploying smart systems. However, many farms still rely on aging equipment and manual labour, rendering these innovations inapplicable. In turn, the inability to harness AI and modernise operations may pose an existential risk. To address this challenge, we advocate retrofitting existing machinery with AI-based modules as a practical alternative.

In this paper, we demonstrate how a poultry egg grading machine can be enhanced with smart capabilities by integrating deep learning with low-cost commodity edge hardware to enable precise egg counting. We present the methodology and algorithms behind this system, which enable real-time processing while maintaining high accuracy. In a limited set of experiments, the Raspberry Pi 5 (RPi5) running the EfficientDet-Lite0 model performed as well as a desktop with an NVIDIA GPU (graphics processing unit), accurately counting all the eggs it was presented with.
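For readers curious what such a pipeline looks like in practice, below is a minimal, hypothetical sketch (not the code from the paper) of running an EfficientDet-Lite0 TFLite model on a Raspberry Pi to detect and count eggs per frame. The model file, class index, thresholds, and output tensor layout are all assumptions; the stable counting described in the paper would additionally require tracking detections across frames as the conveyor moves, which this sketch omits.

# Hypothetical sketch, not the authors' implementation: per-frame egg
# counting with an EfficientDet-Lite0 TFLite model on a Raspberry Pi.
import cv2
import numpy as np
from tflite_runtime.interpreter import Interpreter

MODEL_PATH = "efficientdet_lite0.tflite"  # assumed model file name
EGG_CLASS_ID = 0                          # assumed: single-class "egg" model
SCORE_THRESHOLD = 0.5                     # assumed confidence cutoff

interpreter = Interpreter(model_path=MODEL_PATH)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
outs = interpreter.get_output_details()

def count_eggs(frame_bgr):
    """Return the number of eggs detected in a single BGR camera frame."""
    _, height, width, _ = inp["shape"]  # EfficientDet-Lite0 expects 320x320 uint8 input
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    resized = cv2.resize(rgb, (width, height))
    interpreter.set_tensor(inp["index"], np.expand_dims(resized, 0).astype(np.uint8))
    interpreter.invoke()
    # Output tensor order varies between model exports; verify against your
    # model. Here we assume the common layout: boxes, classes, scores, count.
    classes = interpreter.get_tensor(outs[1]["index"])[0]
    scores = interpreter.get_tensor(outs[2]["index"])[0]
    return int(np.sum((scores >= SCORE_THRESHOLD) & (classes == EGG_CLASS_ID)))

# Example loop over a USB or Pi camera stream via OpenCV.
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    print("eggs in frame:", count_eggs(frame))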

N. Hadjisavvas, N. Nicolaou, and E. Stavrakis. "Real-Time Egg Detection Using Edge Computer Vision". In: Proceedings of the First International Conference on Accessible Digital Agriculture Technologies (CADAT 2024). Conference dates: November 17–21, 2024. Valencia, Spain: IARIA, Nov. 2024, pp. 16–22.

The paper can be downloaded at ThinkMind.