Development and Optimization of Tools for High-quality Spatial Single-cell Gene Expression Profiling using High-resolution Spatial Transcriptomics Technology

Mei Li*

*Corresponding author for this work

Research output: Book/Report › Ph.D. thesis


Abstract

Spatially resolved transcriptomics is widely recognized as a cutting-edge technology in the life sciences. It is increasingly applied in areas such as organ development, organism growth, tumor heterogeneity and evolution, and clinical translational research. Higher spatial resolution typically implies smaller molecular quantities per capture unit, whereas observing whole tissues requires a larger field of view. Substantial progress in fundamental and translational research therefore comes with demands for higher resolution (i.e., at the single-cell level) and a larger field of view, as well as an urgent need for tools to analyze the raw data. To address the data-analysis challenges posed by large field-of-view, high-resolution spatially resolved technology, this doctoral project builds on Stereo-seq and aims to provide analysis methods and tools for obtaining high-quality spatial single-cell data, thereby facilitating the application of spatially resolved technology.

In the first study, we developed a framework called StereoCell for high-resolution, large field-of-view spatial transcriptomic data analysis. StereoCell provides a comprehensive and systematic platform for generating high-confidence single-cell spatial data, covering image stitching, registration, nuclei segmentation, and molecule labeling. By employing better-performing algorithms for image stitching and molecule labeling, StereoCell reduced stitching error and runtime, and improved the signal-to-noise ratio of single-cell gene expression data compared with existing methods. These improvements were validated on mouse brain tissue, and the results confirmed that StereoCell produces highly accurate spatial single-cell gene expression profiles, facilitating clustering and cellular annotation within the biological tissue.
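The molecule-labeling step described above can be pictured as looking up each captured transcript's spatial coordinate in a segmentation label image. The following is a minimal illustrative sketch, not StereoCell's actual implementation; the function name and toy mask are hypothetical:

```python
import numpy as np

def assign_molecules_to_cells(mol_x, mol_y, label_mask):
    """Assign each molecule to the cell label at its spatial coordinate.

    mol_x, mol_y : integer pixel coordinates of captured transcripts.
    label_mask   : nuclei-segmentation image where each cell has a
                   unique positive integer label; 0 is background.
    Returns the per-molecule cell label (0 = unassigned).
    """
    mol_x = np.asarray(mol_x)
    mol_y = np.asarray(mol_y)
    # Image indexing is (row, col) = (y, x).
    return label_mask[mol_y, mol_x]

# Toy 4x4 segmentation mask containing two cells (labels 1 and 2).
mask = np.array([
    [1, 1, 0, 0],
    [1, 1, 0, 2],
    [0, 0, 2, 2],
    [0, 0, 2, 2],
])
labels = assign_molecules_to_cells([0, 3, 2], [1, 1, 3], mask)
# labels -> [1, 2, 2]
```

Aggregating molecules by the returned labels then yields the cell-by-gene expression matrix used downstream.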

With recent advancements in Stereo-seq technology, it is now possible to acquire cell boundary information, such as cell membrane/wall staining images. In the second study, we took advantage of this progress and updated StereoCell to a new version, STCellbin, which uses nuclei staining images as a bridge to align cell membrane/wall staining images with the spatial gene expression map. By employing a sophisticated cell segmentation technique, we obtained precise cell boundaries, yielding more reliable single-cell spatial gene expression profiles. STCellbin was applied to mouse liver (cell membranes) and Arabidopsis seed (cell walls) datasets. This enhanced capability offers valuable insights into the spatial organization of gene expression within cells, contributing to a deeper understanding of tissue biology.
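Using one staining image as a "bridge" between another image and the expression map amounts to estimating a spatial transform from shared landmarks. A simplified sketch, assuming matched landmark pairs are already available (STCellbin's actual registration is more sophisticated than this least-squares affine fit):

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares 2D affine transform mapping src_pts onto dst_pts.

    src_pts: landmarks in the membrane/wall staining image.
    dst_pts: the same landmarks in the nuclei image, which already
             shares coordinates with the expression map.
    Returns a (3, 2) parameter matrix for homogeneous 2D points.
    """
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    A = np.hstack([src, np.ones((len(src), 1))])      # (n, 3)
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)  # (3, 2)
    return params

def apply_affine(params, pts):
    pts = np.asarray(pts, dtype=float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ params

# Toy landmarks related by a pure translation of (+5, +2).
src = [[0, 0], [10, 0], [0, 10], [10, 10]]
dst = [[5, 2], [15, 2], [5, 12], [15, 12]]
M = fit_affine(src, dst)
mapped = apply_affine(M, [[3, 4]])
# mapped ~ [[8, 6]]
```

Once the transform is fitted, every pixel of the membrane/wall segmentation can be mapped into the expression map's frame, so cell boundaries and transcripts share one coordinate system.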

In the third study, we proposed an efficient and adaptive Gaussian smoothing (EAGS) imputation method for high-resolution spatial transcriptomics. The adaptive two-factor smoothing of EAGS creates patterns based on the spatial and expression information of single cells, derives adaptive weights for smoothing cells within the same pattern, and then uses these weights to restore the gene expression profiles. The performance and efficiency of EAGS were assessed using simulated data and high-resolution spatial transcriptomic datasets of the mouse brain and olfactory bulb. Compared with other competitive methods, EAGS showed higher clustering accuracy, better biological interpretability, and significantly lower computational resource consumption.
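At its core, Gaussian-smoothing imputation replaces each cell's expression vector with a distance-weighted average of its neighbors. The sketch below uses a single fixed bandwidth over all cells, a deliberate simplification of EAGS's adaptive two-factor weighting; the function name and toy data are illustrative only:

```python
import numpy as np

def gaussian_smooth_expression(coords, expr, sigma=1.0):
    """Smooth a cells-by-genes matrix with Gaussian spatial weights.

    coords : (n_cells, 2) spatial centroids.
    expr   : (n_cells, n_genes) expression matrix.
    sigma  : Gaussian bandwidth (fixed here; adaptive in EAGS).
    """
    coords = np.asarray(coords, dtype=float)
    expr = np.asarray(expr, dtype=float)
    # Pairwise squared Euclidean distances between cell centroids.
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    w /= w.sum(axis=1, keepdims=True)  # rows become convex weights
    return w @ expr                    # weighted neighbor average

# Two adjacent cells with complementary dropout, plus a distant cell.
coords = [[0.0, 0.0], [0.0, 1.0], [5.0, 5.0]]
expr = [[10.0, 0.0], [0.0, 10.0], [4.0, 4.0]]
smoothed = gaussian_smooth_expression(coords, expr, sigma=1.0)
```

Because each row of the weight matrix sums to one, the smoothed profile of a cell is a convex combination of nearby cells' profiles, which is what lets zeros caused by dropout be partially restored from spatial neighbors.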
Original language: English
Place of publication: Kgs. Lyngby, Denmark
Publisher: DTU Bioengineering
Number of pages: 140
Publication status: Published - 2023
