Sensometrics: Thurstonian and Statistical Models

Rune Haubo Bojesen Christensen

    Research output: Book/Report › Ph.D. thesis



    This thesis is concerned with the development and bridging of Thurstonian and statistical models for sensory discrimination testing as applied in the scientific discipline of sensometrics. In sensory discrimination testing, sensory differences between products are detected and quantified by the use of human senses. Thurstonian models provide a stochastic model for the data-generating mechanism through a psychophysical model for the cognitive processes and, in addition, provide an independent measure for the quantification of sensory differences.
    In the interest of cost reduction and health initiatives, much attention is currently given to ingredient substitution. Food- and beverage-producing companies consequently apply discrimination testing to control and monitor the sensory properties of evolving products and consumer response to product changes. Discrimination testing is as relevant as ever because it enables more informed decision making by quantifying the degree to which an ingredient substitution is successful and the degree to which the perceptual properties of the product remain unchanged from the end user's perspective.
    This thesis contributes to the field of sensometrics in general and sensory discrimination testing in particular in a series of papers by advancing Thurstonian
    models for a range of sensory discrimination protocols in addition to facilitating their application by providing software for fitting these models. The main focus is on identifying Thurstonian models for discrimination methods as versions of well-known statistical models.
    The Thurstonian models for a group of discrimination methods leading to binomial responses are shown to be versions of a statistical class of models known as generalized linear models. Thurstonian models for the A-not A with sureness and 2-Alternative Choice (2-AC) protocols have been identified as versions of a class of statistical models known as cumulative link models. A theme throughout the contributions has been the development of likelihood methods for computing improved confidence intervals in a range of discrimination methods, including the above-mentioned methods as well as the same-different test.
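    The GLM view can be illustrated with the classical Thurstonian psychometric functions: for a binomial protocol, the psychometric function maps the sensory difference d′ to the probability of a correct answer and thus plays the role of an inverse link. The following Python sketch (an independent illustration using the standard 2-AFC and duo-trio formulas, not the thesis's sensR code) computes proportions correct from d′ and inverts them numerically:

    ```python
    # Sketch of Thurstonian psychometric functions as inverse links of a
    # binomial GLM; pc_2afc and pc_duotrio use the standard formulas.
    from scipy.stats import norm
    from scipy.optimize import brentq

    def pc_2afc(d):
        # 2-AFC: Pc = Phi(d' / sqrt(2)); chance level is 1/2 at d' = 0
        return norm.cdf(d / 2 ** 0.5)

    def pc_duotrio(d):
        # Duo-trio: Pc = 1 - Phi(d'/sqrt(2)) - Phi(d'/sqrt(6))
        #               + 2 * Phi(d'/sqrt(2)) * Phi(d'/sqrt(6))
        a, b = norm.cdf(d / 2 ** 0.5), norm.cdf(d / 6 ** 0.5)
        return 1 - a - b + 2 * a * b

    def dprime_from_pc(pc, psyfun, bracket=(1e-8, 10.0)):
        # Invert the psychometric function numerically to estimate d'
        # from an observed above-chance proportion correct.
        return brentq(lambda d: psyfun(d) - pc, *bracket)

    d = dprime_from_pc(0.76, pc_2afc)  # equals sqrt(2) * Phi^-1(0.76)
    ```

    With the psychometric function in this role, fitting the Thurstonian model reduces to fitting a binomial GLM with a protocol-specific link, which is the bridge the thesis exploits.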
    A particular analysis with 2-AC data involves comparison with an identicality norm. For such tests we propose a new test statistic that improves on previously proposed methods of analysis.
    In a contribution to the scientific area of computational statistics, it is described how the Laplace approximation can be implemented on a case-by-case basis for flexible estimation of nonlinear mixed effects models with normally distributed response. The two R packages sensR and ordinal implement and support the methodological developments in the research papers. sensR is a package for sensory discrimination testing with Thurstonian models and ordinal supports analysis of ordinal data with cumulative link (mixed) models. While sensR is closely connected to the sensometrics field, the ordinal package has developed into a generic statistical package applicable to statistical problems far beyond sensometrics. A series of tutorials, user guides and reference manuals accompany these R packages.
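    The core of the Laplace approximation used for mixed models is a second-order Taylor expansion of the joint log-likelihood around the mode of the random effect. A minimal one-dimensional Python sketch (an illustration of the general device, not the thesis's implementation) approximates I = ∫ exp(h(b)) db by exp(h(b̂)) · √(2π / −h″(b̂)):

    ```python
    # Minimal 1-D Laplace approximation: expand h(b) to second order around
    # its mode b-hat and integrate the resulting Gaussian analytically.
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.integrate import quad

    def laplace(h, bounds=(-10.0, 10.0)):
        res = minimize_scalar(lambda b: -h(b), bounds=bounds, method="bounded")
        bhat = res.x
        eps = 1e-5
        # central second difference approximates h''(bhat) (< 0 at a mode)
        hess = (h(bhat + eps) - 2 * h(bhat) + h(bhat - eps)) / eps ** 2
        return np.exp(h(bhat)) * np.sqrt(2 * np.pi / -hess)

    # For a Gaussian integrand the approximation is exact, so it can be
    # checked against numerical quadrature:
    h = lambda b: -0.5 * (b - 1.3) ** 2 / 0.7 ** 2
    exact, _ = quad(lambda b: np.exp(h(b)), -np.inf, np.inf)
    approx = laplace(h)
    ```

    In the mixed-model setting, h(b) is the joint log-likelihood of the data and a random effect b, and the approximation yields a tractable marginal likelihood that can be maximized in the fixed parameters.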
    Finally, a number of chapters provide background theory on the development and computation of Thurstonian models for a range of binomial discrimination protocols, the estimation of generalized linear mixed models, cumulative link models and cumulative link mixed models. The relation between the Wald, likelihood and score statistics is expanded upon using the shape of the (profile) likelihood function as common reference.
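    The relation among the three classical statistics can be made concrete for a single binomial proportion, where each statistic summarizes the same log-likelihood differently: the Wald statistic uses the curvature at the MLE, the score statistic the curvature at the null value, and the likelihood-ratio statistic the exact drop in the log-likelihood. A hedged Python sketch using the standard textbook formulas (not code from the thesis):

    ```python
    # Wald, likelihood-ratio and score statistics for H0: p = p0 given
    # x successes in n binomial trials; all three are asymptotically chi^2(1).
    import math

    def binom_stats(x, n, p0):
        phat = x / n
        # Wald: squared distance scaled by the variance estimated at the MLE
        wald = (phat - p0) ** 2 / (phat * (1 - phat) / n)
        # Score: same distance, variance evaluated under the null
        score = (phat - p0) ** 2 / (p0 * (1 - p0) / n)
        # Likelihood ratio: twice the log-likelihood difference, no approximation
        lr = 2 * (x * math.log(phat / p0)
                  + (n - x) * math.log((1 - phat) / (1 - p0)))
        return wald, lr, score

    w, l, s = binom_stats(15, 20, 0.5)
    ```

    For x = 15, n = 20, p0 = 0.5 the three statistics disagree noticeably (the likelihood is skewed away from the null), which is precisely the situation where profile-likelihood-based intervals improve on Wald intervals.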
    Original language: English
    Place of publication: Kgs. Lyngby
    Publisher: Technical University of Denmark
    Number of pages: 432
    Publication status: Published - 2012


