Terrain Classification for Outdoor Autonomous Robots using 2D Laser Scans. Robot perception for dirt road navigation

Morten Rufus Blas, Søren Riisgaard, Ole Ravn, Nils Axel Andersen, Mogens Blanke, Jens Christian Andersen

    Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


    Abstract

    Interpreting laser data to allow autonomous robot navigation on paved as well as dirt roads using a fixed-angle 2D laser scanner is a daunting task. This paper introduces an algorithm for terrain classification that fuses four distinctly different classifiers: raw height, step size, slope, and roughness. Input is a single 2D laser scan and output is a classification of each laser scan range reading. The range readings are classified as either returning from an obstacle (not traversable) or from traversable ground. Experimental results from an implementation on a department-developed Medium Mobile Robot, with tests conducted in a national park environment, are shown and discussed.
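    As a rough illustration of the per-reading pipeline the abstract describes, the sketch below evaluates four simple classifiers (raw height, step size between neighbouring readings, slope, and local roughness) on a single 2D scan and fuses them conjunctively, labelling a reading traversable only if all four agree. The function name, the fixed-tilt scan geometry, and every threshold value are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of per-reading terrain classification from one 2D laser scan.
# Thresholds, mounting geometry, and the conjunctive fusion rule are assumed.
import numpy as np

def classify_scan(angles, ranges,
                  scanner_height=0.5,          # mounting height above ground [m] (assumed)
                  tilt=np.deg2rad(15.0),       # fixed downward tilt of the scan plane (assumed)
                  max_height=0.15,             # raw-height threshold [m] (assumed)
                  max_step=0.10,               # step-size threshold between neighbours [m] (assumed)
                  max_slope=np.deg2rad(20.0),  # slope threshold between neighbours (assumed)
                  max_roughness=0.03,          # local height std-dev threshold [m] (assumed)
                  window=5):                   # roughness window size in readings (assumed)
    """Return a boolean array per reading: True = traversable ground, False = obstacle."""
    angles = np.asarray(angles, dtype=float)
    ranges = np.asarray(ranges, dtype=float)

    # Project each reading into the robot frame, assuming the scanner is mounted
    # at scanner_height and pitched down by tilt about the lateral axis.
    x_s = ranges * np.cos(angles)              # forward component in the scan plane
    y_s = ranges * np.sin(angles)              # lateral component in the scan plane
    x = x_s * np.cos(tilt)                     # forward distance over the ground plane
    z = scanner_height - x_s * np.sin(tilt)    # height relative to ideal flat ground

    # Classifiers 2 and 3: step size and slope between neighbouring readings.
    if z.size > 1:
        dz = np.concatenate(([np.diff(z)[0]], np.diff(z)))            # pad so every reading has a value
        dd = np.concatenate(([np.hypot(np.diff(x), np.diff(y_s))[0]],
                             np.hypot(np.diff(x), np.diff(y_s))))     # horizontal gap to the neighbour
        step = np.abs(dz)
        slope = np.abs(np.arctan2(dz, np.maximum(dd, 1e-6)))
    else:
        step = np.zeros_like(z)
        slope = np.zeros_like(z)

    # Classifier 4: roughness as the local standard deviation of height.
    half = window // 2
    rough = np.array([z[max(0, i - half):i + half + 1].std() for i in range(z.size)])

    # Conjunctive fusion: a reading counts as traversable ground only if all
    # four classifiers accept it; otherwise it is reported as an obstacle.
    return ((np.abs(z) <= max_height) &
            (step <= max_step) &
            (slope <= max_slope) &
            (rough <= max_roughness))


if __name__ == "__main__":
    # Synthetic example: a flat-ground scan with a few beams shortened
    # to mimic a raised obstacle crossing the scan line.
    angles = np.deg2rad(np.linspace(-60, 60, 121))
    ranges = 0.5 / (np.cos(angles) * np.sin(np.deg2rad(15.0)))  # ranges on ideal flat ground
    ranges[55:66] *= 0.6                                        # obstacle: beams return early
    traversable = classify_scan(angles, ranges)
    print("obstacle readings:", np.flatnonzero(~traversable))
```

    In the synthetic example, the shortened beams raise the computed height and step features above their thresholds, so those readings come back labelled as obstacles while the rest of the scan remains traversable.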
    Original language: English
    Title of host publication: Proceedings of the 2nd International Conference on Informatics in Control, Automation and Robotics
    Publication date: 2005
    Pages: 347-351
    ISBN (Print): 972-8865-30-9
    Publication status: Published - 2005
    Event: 2nd International Conference on Informatics in Control, Automation and Robotics - Barcelona, Spain
    Duration: 14 Sept 2005 - 17 Sept 2005
    Conference number: 2

    Conference

    Conference: 2nd International Conference on Informatics in Control, Automation and Robotics
    Number: 2
    Country/Territory: Spain
    City: Barcelona
    Period: 14/09/2005 - 17/09/2005

    Keywords

    • Terrain classification
    • Obstacle detection
    • Road following
    • Laser scanner
    • Classifier fusion
