Smartphone User Authentication Using Touch Dynamics in the Big Data Era: Challenges and Opportunities

Lijun Jiang, Weizhi Meng

Research output: Chapter in Book/Report/Conference proceeding › Book chapter › Research › Peer-reviewed


With the wide adoption of smartphones, touchscreens have become the leading input method on the mobile platform, used by more than 78% of all phones. As a result, a growing number of research studies have focused on touch dynamics and its application to user authentication. Generally, touch dynamics can be described as the characteristics of the inputs received from a touchscreen while a user interacts with a device (e.g., a touchscreen mobile phone). Intuitively, touch dynamics differs from keystroke dynamics in that it covers more input types, such as multi-touch and touch movement. On the other hand, the key-up and key-down events in keystroke dynamics are similar to the touch-press-up and touch-press-down actions (e.g., single-touch) in touch dynamics. Owing to these characteristics, touch dynamics has received increasing attention in the literature. In this chapter, we present a review of recent advances relating to touch dynamics and provide insights into its future trends in the big data era.
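To make the notion of touch dynamics concrete, the sketch below derives a few commonly cited features (touch duration, movement distance, average pressure, speed) from a single stroke, i.e., the sequence of events between a touch-press-down and a touch-press-up. This is a minimal illustration, not the chapter's method; the `TouchEvent` record and feature names are assumptions for the example.

```python
from dataclasses import dataclass

# Hypothetical touch event record; fields are illustrative,
# not taken from the chapter under review.
@dataclass
class TouchEvent:
    t: float         # timestamp in seconds
    x: float         # screen x-coordinate (pixels)
    y: float         # screen y-coordinate (pixels)
    pressure: float  # normalized pressure in [0, 1]
    action: str      # "down", "move", or "up"

def extract_features(stroke):
    """Derive simple touch-dynamics features from one stroke
    (events from touch-press-down to touch-press-up)."""
    down, up = stroke[0], stroke[-1]
    duration = up.t - down.t                       # how long the finger stayed down
    dx, dy = up.x - down.x, up.y - down.y
    distance = (dx ** 2 + dy ** 2) ** 0.5          # straight-line movement length
    avg_pressure = sum(e.pressure for e in stroke) / len(stroke)
    speed = distance / duration if duration > 0 else 0.0
    return {"duration": duration, "distance": distance,
            "avg_pressure": avg_pressure, "speed": speed}

# Example: a short rightward swipe made of three events
stroke = [TouchEvent(0.00, 100, 300, 0.40, "down"),
          TouchEvent(0.05, 150, 300, 0.55, "move"),
          TouchEvent(0.10, 200, 300, 0.35, "up")]
features = extract_features(stroke)
```

Feature vectors such as this one would then be fed to a classifier that compares them against the legitimate user's enrolled profile.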
Original language: English
Title of host publication: Biometric Security and Privacy, Signal Processing for Security Technologies
Publication date: 2016
ISBN (Print): 978-3-319-47300-0
Publication status: Published - 2016
Series: Biometric Security and Privacy


  • Engineering
  • Signal, Image and Speech Processing
  • Biometrics
  • Big Data/Analytics
  • User Interfaces and Human Computer Interaction
  • Security Science and Technology
  • Systems and Data Security
  • User authentication
  • Smartphones
  • Touch dynamics
  • Big data
  • Challenges and future trends


