An open database of 3D scans of the human head, ear, and torso

    Project Details

    Description

    The aim of this project is to generate a database of high-resolution 3D scans of the human head and torso. The data will be presented on a web portal where software to view, manipulate, and process the data is also available. The intended users of the data and tools are students and researchers working with acoustical modelling. Specifically, the data can be used, for example, to optimise spatial perception in hearing aids, to improve boom design for headsets, and to simulate individual head-related transfer functions.

    Mathematical modelling of the sound field surrounding the head is an emerging discipline that has shown promise in alleviating some of the difficulties in, for example, designing and testing new hearing aid designs. However, current state-of-the-art methods are mostly based on synthetic data, and the results are therefore somewhat misleading. The lack of data is mainly due to the difficulty of acquiring real 3D data of the human head and torso. In particular, the 3D geometry of the human ear is very difficult to obtain using traditional 3D acquisition techniques such as CT, MR, and laser scanning. Recently, the 3D Laboratory at the school of dentistry at the University of Copenhagen obtained a 3dMD cranial scanner through a donation from the Oticon Foundation. This scanner can be used to capture high-quality 3D scans of the head, ear (pinna and part of the concha), and torso of humans. The aim of this project is to use the scanner at the 3D Laboratory to capture the torso and head geometry of a group of test persons. Furthermore, ear impressions will be taken and scanned so that the final, merged data set is a precise 3D representation of the torso, head, and ear canal.
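    One downstream use of such scan data is rendering spatial audio: head-related impulse responses (HRIRs) derived from an individual's head and ear geometry are convolved with a mono signal to produce a binaural signal. The sketch below illustrates that convolution step with toy HRIRs (a simple interaural delay and level difference standing in for measured or simulated responses); the function name and the toy filter values are illustrative, not part of the project's software.

    ```python
    import numpy as np

    def render_binaural(mono, hrir_left, hrir_right):
        """Convolve a mono signal with a left/right HRIR pair to
        produce a two-channel (binaural) signal."""
        left = np.convolve(mono, hrir_left)
        right = np.convolve(mono, hrir_right)
        return np.stack([left, right], axis=1)

    # Toy HRIRs: the right ear gets the signal 3 samples later and
    # 6 dB quieter than the left, mimicking a source to the left.
    hrir_l = np.array([1.0, 0.0, 0.0, 0.0])
    hrir_r = np.array([0.0, 0.0, 0.0, 0.5])

    signal = np.random.default_rng(0).standard_normal(1000)
    binaural = render_binaural(signal, hrir_l, hrir_r)
    ```

    In practice the HRIRs would come from acoustic simulation on the merged torso/head/ear-canal meshes rather than hand-written filters.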
    Acronym: OpenHATS
    Status: Finished
    Effective start/end date: 01/02/2008 – 01/03/2011

    Funding

    • Research funding – private Danish foundations
