Image Processing for Robotic Control in Life Science Laboratory Automation

Haoying Yu*, Vilhelm Krarup Moller, Rasmus John Normand Frandsen, Marjan Mansourvar

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

In the contemporary age of advanced technology, image processing plays a vital role as an efficient and indispensable tool across industries, enterprises, monitoring systems, and various applications. This study focuses on employing image processing techniques with an Opentrons industrial robot to ensure quality control during production. By leveraging the YOLOv5 model, the robot adeptly recognizes diverse pipette tip types, identifies liquids by their colors, and accurately measures liquid levels to estimate volume percentages within each tip. These detection results then guide a programmable robot, enabling it to perform specific actions on the tips based on predetermined parameters. This paper provides a comprehensive description of the methodology, implementation, and evaluation of our integrated control-robotic system, highlighting its potential to significantly enhance efficiency and precision in laboratory settings.
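The abstract describes estimating the volume percentage of liquid inside each pipette tip from the detection results. One simple way to do this from YOLO-style bounding boxes is to compare the height of the detected liquid column against the height of the detected tip; the following sketch illustrates that idea only — the function name, box format, and clamping logic are assumptions for illustration, not the paper's implementation.

```python
def estimate_fill_percent(tip_box, liquid_box):
    """Estimate liquid fill level as a percentage of tip height.

    Boxes are (x1, y1, x2, y2) in image coordinates with y increasing
    downward, as in YOLO-style detections. Hypothetical helper, not
    taken from the paper.
    """
    tip_height = tip_box[3] - tip_box[1]
    if tip_height <= 0:
        raise ValueError("degenerate tip bounding box")
    liquid_height = liquid_box[3] - liquid_box[1]
    # Clamp to [0, 100] in case the liquid box slightly overruns the tip box.
    return max(0.0, min(100.0, 100.0 * liquid_height / tip_height))
```

For example, a liquid box spanning the lower half of a tip box yields an estimate of 50%, which a downstream controller could compare against predetermined thresholds to decide which action to perform on that tip.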
Original language: English
Title of host publication: Proceedings of 2023 5th International Conference on Robotics and Computer Vision (ICRCV)
Number of pages: 5
Publisher: IEEE
Publication date: 2023
Pages: 305-309
ISBN (Print): 979-8-3503-2636-9
DOIs
Publication status: Published - 2023
Event: 2023 5th International Conference on Robotics and Computer Vision - Nanjing, China
Duration: 15 Sept 2023 – 17 Sept 2023

Conference

Conference: 2023 5th International Conference on Robotics and Computer Vision
Country/Territory: China
City: Nanjing
Period: 15/09/2023 – 17/09/2023

Keywords

  • Image processing
  • Robotics control
  • Machine learning
  • Computer vision
  • Lab automation
