Automated grading of wood-slabs. The development of a prototype system

    Research output: Contribution to journal › Journal article › Research › peer-review

    Abstract

    This paper proposes a method for automatically grading small beechwood slabs. The method involves two classification steps: the first step detects defects based on local visual texture; the second step uses the relative distribution of those defects to perform a final grading assessment. At a major Danish plant for the manufacture of parquet boards, the quality grading (visual quality) has always been done manually. As it is expected to be both expensive and difficult to recruit sufficient personnel for this type of job in the future, it is of great interest to automate the function as much as possible. A wide range of defect types must be considered when grading. This, together with the fact that wood is a 'natural' material, means it is not easily described using ordinary vision systems. The proposed method assumes a 3-D feature space based on local texture measures of 'lightness', 'speckle' and 'dark deviation'. These measures are calculated for each pixel in an image of a slab. The feature space is separated into 12 decision regions corresponding to 12 'defect types'; these 'defects' are labeled as clear wood, wavy grain, split, black knots, ingrown bark, etc. Based on the relative distribution of the detected defects over the surface of a given slab, the slab is further classified into 5 quality grades: prime, standard, flamy, extra flamy and rejects. As a result of this project, a prototype computer vision grading system has been built and is being tested on-site.
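    The two-step pipeline described in the abstract can be sketched in code. The paper does not give the feature formulas or the shape of the 12 decision regions, so everything below is illustrative: the three per-pixel measures are plausible stand-ins computed over a local window, defect types are assigned by nearest-centroid lookup in the 3-D feature space, and the grade thresholds on the defect distribution are invented for the example.

    ```python
    import numpy as np
    from numpy.lib.stride_tricks import sliding_window_view

    # Illustrative subset of the paper's 12 defect types and its 5 grades.
    DEFECT_TYPES = ["clear wood", "wavy grain", "split"]
    GRADES = ["prime", "standard", "flamy", "extra flamy", "rejects"]

    def local_features(img, win=5):
        """Per-pixel 3-D feature vector: (lightness, speckle, dark deviation).

        These are hypothetical definitions: local mean intensity, local
        standard deviation, and the depth of the darkest pixel below the
        local mean, all over a win x win window.
        """
        pad = win // 2
        padded = np.pad(img.astype(float), pad, mode="edge")
        windows = sliding_window_view(padded, (win, win))
        lightness = windows.mean(axis=(-2, -1))
        speckle = windows.std(axis=(-2, -1))
        dark_dev = lightness - windows.min(axis=(-2, -1))
        return np.stack([lightness, speckle, dark_dev], axis=-1)

    def classify_defects(features, centroids):
        """Step 1: label each pixel with the nearest defect-type centroid.

        The paper partitions the feature space into decision regions; a
        nearest-centroid rule is one simple way to realise such regions.
        """
        dist = np.linalg.norm(features[..., None, :] - centroids, axis=-1)
        return dist.argmin(axis=-1)

    def grade_slab(defect_map, n_types):
        """Step 2: grade from the relative distribution of detected defects."""
        frac = np.bincount(defect_map.ravel(), minlength=n_types) / defect_map.size
        clear = frac[0]                 # fraction classified as clear wood
        if frac[2] > 0.01:              # any split area -> reject (made-up rule)
            return "rejects"
        if clear > 0.95:
            return "prime"
        if clear > 0.85:
            return "standard"
        if clear > 0.70:
            return "flamy"
        return "extra flamy"
    ```

    For example, a uniformly bright synthetic image whose features sit on the 'clear wood' centroid would be labeled clear wood everywhere and graded 'prime' under these example thresholds.
    
    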
    Original language: English
    Journal: Industrial Metrology
    Volume: 2
    Issue number: 3-4
    Pages (from-to): 219-236
    ISSN: 0921-5956
    Publication status: Published - 1992
