Perspective  |  Open Access  |  20 Feb 2024

Coupled mechanics in skin-interfaced electronics via computer vision methods

Soft Sci 2024;4:12.
10.20517/ss.2023.50 |  © The Author(s) 2024.


Recent advancements in materials and mechanics have paved the way for transforming rigid circuits into flexible electronics. Their ability to laminate onto the skin has led to the development of skin-interfaced electronics, including mechano-acoustic sensors and haptic systems. However, the challenges of the coupled mechanics between the skin and skin-interfaced electronics call for further understanding of biomechanics, bioelectronics, and their interactions. This perspective article highlights the emerging trend of employing computer vision methods to optimize the next generation of skin-interfaced electronics by characterizing associated biomechanics and vice versa. The cyclic research process involves the development of soft electronics, the identification of coupled mechanics, and their quantification using computer vision methods. The article describes state-of-the-art computer vision techniques in the context of skin-interfaced electronics and their potential applications in other forms of soft electronics.


Soft electronics, skin-interfaced electronics, mechano-acoustic sensors, haptic interfaces, computer vision


Advanced materials and associated mechanics have opened pathways for transforming conventional, rigid, planar integrated circuits into stretchable, flexible, soft electronics[1]. Theoretical studies, supported by mechanical modeling, such as finite element analysis (FEA), have revealed key features and characteristics of these unusual material structures, including the optimization of three-dimensional (3D) buckled electronics[2]. The unique properties of these materials, which allow for conformal contact with the body, have led to numerous pioneering works on new classes of soft devices in biomedical applications known as bioelectronics. Skin-interfaced electronics, a specific class of soft electronics, have established the foundations for continuous clinical-grade monitoring and are well investigated and summarized in terms of materials selection, design, fabrication, and system integration[3-5]. Nevertheless, the interaction between biosystems and soft electronics often results in non-trivial coupled mechanics, which requires further effort to characterize the biomechanics, bioelectronics, and their interactions.

This article underscores the latest advances in computer vision techniques and their impact on advancing soft-electronic systems, intending to refine the next generation of skin-interfaced electronics through a thorough characterization of associated biomechanics and, conversely, how these biomechanics influence electronic design. The process is iterative, encompassing the development of soft electronics, the identification of coupled mechanics, and their quantification using computer vision methods, as depicted in Figure 1. We delineate (i) pioneering computer vision techniques employed in skin-interfaced electronics; (ii) the interaction of mechanics in mechano-acoustic (MA) sensors; and (iii) the interconnected mechanics in haptic systems. Final remarks outline expected advancements in computer vision techniques and their projected applications across diverse areas within the soft electronics field.


Figure 1. Schematic showing the cyclic research process: identification of coupled mechanics[8] (Copyright © 2022, Nature Publishing Group), quantification via computer vision methods[10] (Copyright © 2022, National Academy of Sciences), and development of soft electronics[10] (Copyright © 2022, National Academy of Sciences).


Computer vision methods, also referred to as optical measurement systems, are employed across various research areas within continuum mechanics, including studies on fluid flows[6], solid deformations[7], and wave phenomena[8]. These approaches provide non-contact, non-intrusive measurements with high spatial and temporal resolutions. Recently, they have been instrumental in offering robust mechanical insights for soft electronic devices, particularly in biomedical and biomechanical applications, revealing essential coupled mechanics between biological systems and soft electronic devices. The most representative computer vision techniques applied in soft electronics are summarized in Table 1. When employing one or a combination of these techniques, several factors must be considered, namely, applications, key outputs, the need for fiducial points, the suitable frame of reference, dimensionality, processing time, and resolution.

Table 1

Current state-of-the-art computer vision techniques used in soft electronics

| Category | Applications | Key outputs | Fiducial points | Frame of reference | Dimensionality | Processing time | Resolution* | E.g., in soft electronics | Ref. |
|---|---|---|---|---|---|---|---|---|---|
| Particle image velocimetry | Fluid flow | Velocity, vorticity | Yes | Eulerian | 2D | Long | High | Respiratory sensors | [10] |
| Particle tracking velocimetry | Fluid flow, object tracking | Velocity, trajectory | Yes | Lagrangian | 2D, 3D (volumetric) | Intermediate (2D), long (3D) | High (2D), medium (3D) | Drug delivery systems | [23] |
| Digital image correlation | Solid deformation | Deformation, strain | Yes | Eulerian, Lagrangian | 2D, 3D (stereoscopic) | Short (2D), intermediate (3D) | High (2D), high (3D) | Haptic systems, strain sensors | [25] |
| Eulerian video magnification | Visual enhancement | Spatiotemporally amplified video | No | N/A | 2D | Short | Qualitative | Cardiac sensors | [18] |
| Structure from motion | 3D geometry estimation | 3D volumetric image | No | N/A | 3D (volumetric) | Intermediate | Qualitative | 3D buckled electronics | [19] |
| Area tracking | Solid deformation | Deformation | No | Lagrangian | 2D | Intermediate | Low to medium | Organoid sensors | [20] |
| Markerless pose estimation | Object tracking | Trajectory | No | Lagrangian | 2D, 3D (volumetric) | Intermediate (2D), long (3D) | Low (2D, 3D) | Biomechanical sensors | [17] |

Particle image velocimetry (PIV) is an advanced optical measurement technique that allows for the detailed analysis of the flow velocity field by tracking the collective motion of tracer particles within a fluid[9]. The method observes these particles from an Eulerian reference frame. Essential to the PIV setup is the introduction of seeding particles into the flow, chosen for their ability to follow the fluid's motion with minimal impact, a property quantified by the Stokes number. A typical PIV system includes one or more high-resolution cameras synchronized with a laser illumination source. This synchronization is critical for capturing the scattered light from particles at precise intervals, particularly in a dual-pulse arrangement where two images are taken in rapid succession. These images are then divided into smaller interrogation regions. Within each subregion, the displacement of particle groups is determined by spatial cross-correlation, providing a vector map of flow velocity and patterns essential for resolving complex turbulent flows. The technique is particularly useful for correlating cardiac or respiratory flows to sensors[10].
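The core PIV processing step described above, locating the cross-correlation peak within an interrogation region, can be sketched in a few lines. This is a minimal illustration on synthetic data using a single window and circular correlation, not the processing pipeline of the cited studies:

```python
import numpy as np

def piv_window_displacement(win_a, win_b):
    """Estimate the mean particle displacement between two interrogation
    windows via FFT-based spatial cross-correlation (correlation-peak search)."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # Circular cross-correlation computed in the Fourier domain
    corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    corr = np.fft.fftshift(corr)
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    center = np.array(corr.shape) // 2
    dy, dx = peak - center
    return int(dx), int(dy)

# Synthetic check: a random "particle" field shifted by 3 px in x, 1 px in y
rng = np.random.default_rng(0)
frame_a = rng.random((64, 64))
frame_b = np.roll(frame_a, shift=(1, 3), axis=(0, 1))
print(piv_window_displacement(frame_a, frame_b))  # -> (3, 1)
```

In practice, this estimate is refined with sub-pixel peak fitting and repeated over a grid of overlapping windows to build the full velocity field.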

Digital image correlation (DIC) is related to PIV in terms of the processing scheme, but it is used to quantify deformations on solid surfaces[11]. Instead of seeding particles, DIC utilizes a random speckle pattern and specializes in characterizing spatial derivatives such as deformation and strain. The technique does not require a high-power illumination source or high-speed imaging, making it suitable for standard applications with relatively fast processing times. DIC can be easily combined with 3D reconstruction techniques to achieve 3D stereoscopic deformations on a curved surface, such as the neck and wrist, where the associated deformations can be highly three-dimensional[12]. Applications in soft electronics include validating haptic and strain-related sensors[8,13].
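Because DIC specializes in spatial derivatives of displacement, the strain-recovery step can be sketched as follows, assuming the displacement fields u and v have already been obtained by speckle subset matching; the field and grid here are synthetic:

```python
import numpy as np

def small_strain_fields(u, v, dx=1.0):
    """Small-strain components from DIC displacement fields u(x, y), v(x, y)
    sampled on a regular grid with spacing dx, via central differences."""
    du_dy, du_dx = np.gradient(u, dx)   # np.gradient returns axis-0 then axis-1
    dv_dy, dv_dx = np.gradient(v, dx)
    exx = du_dx                         # normal strain along x
    eyy = dv_dy                         # normal strain along y
    exy = 0.5 * (du_dy + dv_dx)         # tensor shear strain
    return exx, eyy, exy

# Synthetic check: uniform 1% uniaxial stretch along x
y, x = np.mgrid[0:32, 0:32].astype(float)
u = 0.01 * x                  # displacement grows linearly with position
v = np.zeros_like(u)
exx, eyy, exy = small_strain_fields(u, v)
print(round(float(exx.mean()), 4), round(float(eyy.mean()), 4))  # -> 0.01 0.0
```

A uniform 1% stretch along x recovers exx = 0.01 everywhere, while the other components vanish, a quick sanity check before applying the same operation to measured fields.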

Particle tracking velocimetry (PTV) is particularly useful for characterizing fluid flows and structural motions by tracking particles and fiducial points in the Lagrangian frame of reference[14]. Unlike PIV and DIC, PTV tracks individual particles in a global coordinate system, allowing for the estimation of particle trajectories. With a multicamera setup, PTV can measure in 3D domains. It offers advantages for quantifying flows in drug delivery systems[15] or validating motion-tracking devices[16]. A relatively recent technique, markerless pose estimation (MPE), tracks a limited number of objects without fiducial points based on transfer learning with deep neural networks. The technique is ideal for monitoring animal behaviors, such as tracking multiple body parts of a mouse during an odor-guided navigation task or a fruit fly behaving in a 3D chamber[17].
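The Lagrangian linking step at the heart of PTV can be illustrated with a simple greedy nearest-neighbor scheme; production PTV codes use predictive, multi-frame matching, so the snippet below is only a toy sketch with hypothetical centroid data:

```python
import numpy as np

def link_nearest(p0, p1, max_disp=5.0):
    """Greedily link particle centroids between two frames by ascending
    pairwise distance; returns sorted (index_in_p0, index_in_p1) matches."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    dist = np.linalg.norm(p0[:, None, :] - p1[None, :, :], axis=2)
    used_i, used_j, matches = set(), set(), []
    for flat in np.argsort(dist, axis=None):
        i, j = np.unravel_index(flat, dist.shape)
        # Skip already-claimed particles and implausibly large jumps
        if i in used_i or j in used_j or dist[i, j] > max_disp:
            continue
        matches.append((int(i), int(j)))
        used_i.add(i)
        used_j.add(j)
    return sorted(matches)

frame0 = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
frame1 = [(1.0, 0.5), (10.5, 0.2), (0.2, 11.0)]  # same particles, slightly moved
print(link_nearest(frame0, frame1))  # -> [(0, 0), (1, 1), (2, 2)]
```

Chaining such matches across consecutive frames yields the particle trajectories from which Lagrangian velocities are differentiated.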

Eulerian Video Magnification (EVM)[18], Structure from Motion (SfM)[19], and area tracking[20] represent advanced computer vision techniques that have substantial implications for soft electronics. These methods are particularly advantageous because they circumvent the need for fiducial points and can support approaches for various soft electronic concepts.
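The essence of EVM, selectively amplifying a temporal frequency band of each pixel's intensity signal, can be sketched for a single pixel time series. The published method operates on spatially filtered video pyramids, so this one-dimensional version with a synthetic 1.2 Hz "pulse" component is only illustrative:

```python
import numpy as np

def magnify_band(signal, fs, f_lo, f_hi, alpha):
    """Amplify the temporal frequency band [f_lo, f_hi] of one pixel's
    intensity time series by a factor (1 + alpha), leaving the rest intact."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    spec[band] *= (1.0 + alpha)          # boost only the selected band
    return np.fft.irfft(spec, n=len(signal))

# Synthetic pixel: constant baseline plus a subtle 1.2 Hz pulsatile component
fs = 30.0                                # 30 fps video
t = np.arange(300) / fs                  # 10 s of samples
pixel = 0.5 + 0.001 * np.sin(2 * np.pi * 1.2 * t)
out = magnify_band(pixel, fs, f_lo=0.8, f_hi=2.0, alpha=50.0)
gain = (out - out.mean()).std() / (pixel - pixel.mean()).std()
print(round(float(gain), 1))  # -> 51.0
```

The fluctuation in the chosen band grows by the prescribed factor while the baseline is untouched, which is how imperceptible cardiac or actuator motions become visible in the magnified video.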

The above-mentioned computer vision methods, alone or in combination, could be implemented in skin-interfaced electronics according to their functionalities and applications. For instance, PIV could be used to correlate respiratory flow with signals from the MA sensor [Figure 2A]. PTV would be useful for quantifying biomarkers requiring high temporal resolution, including vibrations of the neck during speech and respiratory activities, to address design and placement strategies of a wearable sensor [Figure 2B]. DIC can quantify mechanics with high spatial resolution, such as strains on the skin induced by haptic actuators or deformations on the neck during swallowing [Figure 2C]. EVM can enhance subtle motions, whether generated by a weak haptic actuator [Supplementary Video 1] or associated with pulse wave propagation, for developing a wearable pulse oximeter. Area tracking and MPE could be implemented to measure the deformation of organs or implantable devices where adding fiducial markers is difficult. The following sections discuss two sets of examples of computer vision methods used in skin-interfaced electronics: one involves measuring various biomarkers to validate a series of MA sensors, and the other involves quantifying the mechanics induced by various vibrotactile actuators in haptic devices.


Figure 2. Schematic showing computer vision experimental setups for optimizing design and manufacturing strategies for skin-interfaced electronics: (A) flow measurements during respiratory activities using PIV[10]. Copyright © 2021, National Academy of Sciences; (B) biomechanics on the neck during cardiopulmonary activities using 3D PTV[24]. Copyright © 2021, American Association for the Advancement of Science; (C) biomechanics on the neck during swallowing activities using 3D DIC[25]. Copyright © 2022, Nature Publishing Group. PIV: Particle image velocimetry; PTV: particle tracking velocimetry; 3D: three-dimensional; DIC: digital image correlation.


Skin-interfaced sensor technologies offer a vast range of multimodal, clinical- and consumer-grade, continuous monitoring of physiological biomarkers with high accuracy and immunity to external noise in hospital and in-home settings. The MA device, a thin, soft sensor with a high-bandwidth accelerometer conformally coupled to the skin, has demonstrated its effectiveness in providing precise measurements of MA signals from subtle vibrations of the skin (~10⁻³ m/s²) to large motions of the entire body (~10 m/s²)[21]. When interfaced at unique anatomical locations, such as the suprasternal notch (SN) at the base of the neck, this technology offers a rich blend of MA information related to various classes of underlying body and physiological processes[22]. In conjunction with the coupled biomechanics through computer vision methods, advanced versions of the MA system have been developed to monitor bio-signals tailored for infectious diseases [Figure 3A1-3][23], cardiopulmonary activities during significant body motions [Figure 3B1-3][24], swallowing patterns for dysphagia patients [Figure 3C1-3][25], and speech sounds [Figure 3D1-3][10].


Figure 3. Examples of Coupled Mechanics in Mechano-acoustic Sensors. (A1-3) COVID-19 sensor correlated with droplet generation during respiratory activities using PTV[23]. Copyright © 2021, National Academy of Sciences; (B1-3) dual sensor focusing on the temporal variations of the neck for cardiopulmonary applications using 3D-PTV[24]. Copyright © 2021, American Association for the Advancement of Science; (C1-3) swallowing sensor focusing on the spatial variations of the neck for dysphagia patients using 3D DIC[25]. Copyright © 2022, Nature Publishing Group; (D1-3) speech sensor investigating the flow physics of plosive speech using PIV[10]. Copyright © 2022, National Academy of Sciences. PTV: Particle tracking velocimetry; 3D: three-dimensional; DIC: digital image correlation; PIV: particle image velocimetry.

The automated wireless version of the MA device [Figure 3A1] allows for capturing key respiratory symptoms of the infectious disease COVID-19[23]. PTV has been instrumental in correlating the timing and intensity of respiratory activities measured by MA sensors with the total droplet production and droplet dynamics observed in PTV experiments. In Figure 3A2, PTV measurements during coughing illustrate detected particles as gray circular symbols and the grid-interpolated horizontal velocity of droplets as a colored contour. Figure 3A3 provides a sequence from the MA sensor marked by the automated algorithm (top) and the total number of droplets produced, determined through PTV analysis in sync with the intensity of MA signals (bottom), during coughing.

A later version of an MA sensor[24] was developed to address data corruption resulting from motion artifacts [Figure 3B1], particularly associated with the mechanical characteristics of cardiopulmonary processes. PTV, combined with the 3D reconstruction technique SfM, has enabled the reconstruction of biomechanics in the neck with high temporal accuracy in three dimensions. Furthermore, 3D-PTV has been used to identify and validate the design strategy for canceling signals through the differential operation of time-synchronized dual accelerometers between SN and the sternal manubrium (SM). Additionally, 3D displacement and vector contour fields during cardiac activity based on Delaunay triangulation are illustrated in Figure 3B2. Figure 3B3 shows a time series of z-axis displacement at SN and SM over several cardiac cycles during a breath hold. Peak displacements at the SN are significantly larger than those at the SM, while both capture displacements associated with body motion, enabling efficient subtraction.
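The differential operation can be illustrated with synthetic accelerometer traces; the amplitudes and frequencies below are hypothetical, chosen only to show how a motion artifact shared by both channels cancels upon subtraction:

```python
import numpy as np

fs = 500.0                                      # hypothetical sampling rate
t = np.arange(0, 4, 1 / fs)
motion = 2.0 * np.sin(2 * np.pi * 0.5 * t)      # body-motion artifact, seen by both channels
cardiac = 0.2 * np.sin(2 * np.pi * 1.1 * t)     # cardiac vibration (~66 bpm), strongest at SN
sn = motion + cardiac                           # time-synchronized channel at the SN
sm = motion + 0.1 * cardiac                     # channel at the SM: same artifact, weak cardiac
diff = sn - sm                                  # differential, artifact-canceled signal
print(round(float(np.corrcoef(diff, cardiac)[0, 1]), 2))  # -> 1.0
```

Because the shared motion component subtracts out exactly while the cardiac component does not, the differential trace correlates perfectly with the cardiac signal in this idealized sketch.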

A variant of the modified MA sensor [Figure 3C1][25] was introduced to assist therapeutic treatments for patients with dysphagia by tracking swallows. In this context, 3D-DIC has effectively quantified rapid and overlapping biomechanics in the neck induced by the swallowing mechanism in the oral cavity, pharynx, and esophagus, which can vary widely with age, gender, and other factors[26]. Additionally, 3D-DIC provides displacement fields and their temporal derivatives during swallowing motions with high accuracy in the spatiotemporal domain. Figure 3C2 shows a representative 3D displacement map at the beginning of the esophageal phase, indicating that the most dominant signature of the swallowing mechanism occurs at the laryngeal prominence (LP), followed by the SN. Figure 3C3 demonstrates the displacement and velocity profiles at the LP and SN after a differential operation with the signals at the SM, further elucidating the fundamental swallowing mechanics. The results justify the development of an elongated version of the device that allows positioning on the LP and SN, or on the SN and SM, for monitoring swallows across a broad range of individuals.

Lastly, a dual-in-plane (DiP) MA device [Figure 3D1] was designed to detect plosive speech patterns that generate a significant number of aerosols and droplets with intense turbulent background flows[10]. PIV was used to correlate the background flow physics with MA signals. Figure 3D2 illustrates the formation of a vortex ring from plosive sounds with the velocity vector field and vorticity color contour. Figure 3D3 demonstrates the overall correlation between MA signals, droplet travel distance induced by background flows, and associated biomechanics in the neck with respect to sound power. The studies suggest that the sternocleidomastoid (SCM), besides the SN, is an ideal location for capturing plosive speech sounds and decoupling them from other bio-signals. MA signals from these locations, captured by the DiP sensor, enhance machine learning-based classification of plosive sounds across various languages.


On the other side of the spectrum, skin-integrated haptic technologies offer the capability to replicate sensations in virtual reality (VR) and augmented reality (AR) settings. Similar to MA sensor technologies, haptic interfaces need to laminate softly onto curved skin surfaces to deliver sensory information through programmable patterns of localized mechanical vibrations in the spatiotemporal domain[27]. Vibrations of the skin and the operating mechanisms of the actuator are coupled in complex ways: (i) the viscoelastic properties of the skin alter the characteristics of vibrotactile actuators by adding mechanical impedance; and (ii) the vibration direction and boundary conditions of vibrotactile actuators affect the dynamics of wave propagation in the skin, and vice versa. Therefore, a quantitative description of these mechanical effects is essential for designing such systems and understanding the fundamental aspects of the resulting tactile sensations.

In recent studies on skin-integrated haptic systems, the investigation of the coupled mechanics between the skin and haptic actuators using computer vision methods has opened up the possibility of further optimizing device designs and gaining a fundamental understanding of our sensory perceptions[8,28]. Complex patterns of the skin deformation induced by an array of eccentric rotating mass (ERM) actuators in a wireless haptic interface were quantified to investigate crosstalk in the system and the performance of the actuators in terms of the activation of mechanoreceptors such as Meissner and Pacinian corpuscles [Figure 4A1-3][8]. Strain distributions on skin phantoms induced by three representative vibrotactile actuators, including ERMs, linear resonant actuators (LRAs), and vibrotactile linear actuators (e.g., tactors), were explored and correlated with relevant perception studies to provide a fundamental understanding of sensory perceptions [Figure 4B1-3][28].


Figure 4. Examples of Coupled Mechanics in Haptic Systems. (A1-3) Haptic Interfaces across Large Areas of the skin optimized by the investigation of deformation and wave propagation of the actuator using the combination of 3D-DIC and PTV[8]. Copyright©2022, Nature Publishing Group; (B1-3) Strain analyses of representative types of vibrotactile actuators on the phantom skin using 3D-DIC and Triangular Cosserat Point Element method[28]. Copyright©2023, Elsevier. 3D: Three-dimensional; DIC: digital image correlation; PTV: particle tracking velocimetry.

A wireless, lightweight, flexible haptic interface [Figure 4A1] was developed to deliver spatiotemporal patterns of touch across large areas of the skin, controlled through smart devices in real-time[8]. PTV and 3D-DIC were used to measure the wave propagation and 3D deformations resulting from structural interactions between the actuators and the surrounding skin. The 3D colored contour map in Figure 4A2 illustrates the deformation in the out-of-plane direction at a representative instant during the actuator’s vibration using 3D-DIC. Figure 4A3 displays the lateral component of wave propagation along the skin induced by the operation of an actuator using PTV. The results reveal that the mechanism of an ERM actuator shares some characteristics of Euler’s disk, and its induced wave propagation along the skin exhibits Rayleigh waves. These mechanical findings further validate the design strategy of devices, showing negligible crosstalk and delivering a strong, power-efficient perception.

Recently, the coupled mechanics of three representative types of vibrotactile actuators (ERM, LRA, and tactor) were investigated and compared to evaluate their performance in delivering sensory perceptions using 3D-DIC and the Triangular Cosserat Point Elements (TCPE) method for strain estimation[28]. ERM actuators use a direct current (DC) motor, typically rotating laterally, with an off-center mass connected to the output shaft within a closed metal housing, whereas LRA and tactor actuators operate on alternating current (AC) based on voice coils and magnets suspended on springs, typically vibrating vertically [Figure 4B1]. Colored contours in Figure 4B2 and graphs in Figure 4B3 display strain distributions and centerline profiles induced by an ERM actuator and a tactor at different depths of the skin phantom, h1 and h2, with various contact areas L/L0, indicating their distinctive characteristics. The strain fields induced by the ERM actuator exhibit high magnitudes around the actuator's border near the surface of the phantom skin, with substantial in-plane components. In contrast, the peak strain induced by the tactor occurs beneath the actuator's center at depth, oriented out of the plane of the skin. A combination of various types of actuators, along with studies of the associated coupled mechanics, may be necessary to achieve robust and sophisticated haptic sensations.


In conclusion, we have summarized the recent developments in computer vision methods and their applications in skin-interfaced electronics. These methods are not limited to skin-interfaced electronics but extend to other forms of soft or 3D electronics, including 3D passive microflying systems[29,30], programmable surfaces[23,31], and implantable devices[13,15]. To fully harness the potential of these technologies, a comprehensive grasp of both the manufacturing intricacies of flexible electronics and the data processing schemes of computer vision methods is critical. Despite recent progress, several challenges persist in fully integrating computer vision technologies into soft electronics. For instance, volumetric measurements, such as 3D PTV or digital volume correlation (DVC), still lack sufficient data resolution or require a large number of cameras, creating a bottleneck in quantification. Another challenge arises in the experimental setup, which requires a controlled optical environment, especially for simultaneous measurements using multiple techniques. Such a controlled environment may not be attainable for validating and quantifying the associated mechanics of soft electronic devices in situations where optical access is limited, such as in vivo testing. However, these limitations point to future opportunities. For example, image processing algorithms, such as "Shake-The-Box"[32] and deep learning for PIV/PTV[33] and DIC[34], could be integrated to improve data resolution. Instead of relying on existing computer vision techniques, new methods tailored specifically for optimizing soft electronics could be developed. For instance, by combining infrared imaging techniques, thermal-mechanical behaviors associated with wearable devices could be investigated to address possible safety hazards[35].
Future research could involve developing (i) a novel haptic system targeted for multisensory engagement based on the measured coupled mechanics using PTV and DIC; (ii) a multimodal skin-integrated sensor for cardiac monitoring validated by EVM; (iii) a wearable motion sensor for orthopedic applications, calibrated with MPE; and (iv) soft tunable electronic camera systems embedded with computer vision processing capabilities. This transdisciplinary research, combining soft electronic technologies and computer vision measurements with a fundamental understanding of continuum mechanics, is key to realizing the next generation of soft electronic systems.


Authors’ contributions

Conceived the ideas, and wrote the manuscript: Kim JT, Chamorro LP

Availability of data and materials

Not applicable.

Financial support and sponsorship

The work was supported by the Querrey-Simpson Institute for Bioelectronics at Northwestern University and the Department of Mechanical Engineering at Pohang University of Science and Technology as part of the start-up package of Kim JT.

Conflicts of interest

Both authors declared that there are no conflicts of interest.

Ethical approval and consent to participate

Not applicable.

Consent for publication

Not applicable.



Supplementary Materials


1. Rogers JA, Someya T, Huang Y. Materials and mechanics for stretchable electronics. Science 2010;327:1603-7.

2. Fu H, Nan K, Bai W, et al. Morphable 3D mesostructures and microelectronic devices by multistable buckling mechanics. Nat Mater 2018;17:268-76.

3. Zhu Y, Li J, Kim J, et al. Skin-interfaced electronics: a promising and intelligent paradigm for personalized healthcare. Biomaterials 2023;296:122075.

4. Li J, Zhao J, Rogers JA. Materials and designs for power supply systems in skin-interfaced electronics. Acc Chem Res 2019;52:53-62.

5. Xu C, Yang Y, Gao W. Skin-interfaced sensors in digital medicine: from materials to applications. Matter 2020;2:1414-45.

6. Kim JT, Kim Y, Kang S, Nam J, Lee C, Chamorro LP. Effect of the aspect ratio on the dynamics of air bubbles within Rayleigh-Bénard convection. Phys Fluids 2021;33:095104.

7. Anderson PSL, Crofts SB, Kim JT, Chamorro LP. Taking a stab at quantifying the energetics of biological puncture. Integr Comp Biol 2019;59:1586-96.

8. Jung YH, Yoo JY, Vázquez-Guardado A, et al. A wireless haptic interface for programmable patterns of touch across large areas of the skin. Nat Electron 2022;5:374-85.

9. Adrian RJ. Particle-imaging techniques for experimental fluid mechanics. Annu Rev Fluid Mech 1991;23:261-304.

10. Kim JT, Ouyang W, Hwang H, et al. Dynamics of plosive consonants via imaging, computations, and soft electronics. Proc Natl Acad Sci U S A 2022;119:e2214164119.

11. Pan B. Recent progress in digital image correlation. Exp Mech 2011;51:1223-35.

12. Solav D, Moerman KM, Jaeger AM, Genovese K, Herr HM. MultiDIC: an open-source toolbox for multi-view 3D digital image correlation. IEEE Access 2018;6:30520-35.

13. Yang Q, Hu Z, Seo MH, et al. High-speed, scanned laser structuring of multi-layered eco/bioresorbable materials for advanced electronic systems. Nat Commun 2022;13:6518.

14. Kim JT, Kim D, Liberzon A, Chamorro LP. Three-dimensional particle tracking velocimetry for turbulence applications: case of a jet flow. J Vis Exp 2016:53745.

15. Wu Y, Wu M, Vázquez-Guardado A, et al. Wireless multi-lateral optofluidic microsystems for real-time programmable optogenetics and photopharmacology. Nat Commun 2022;13:5571.

16. Jeong H, Kwak SS, Sohn S, et al. Miniaturized wireless, skin-integrated sensor networks for quantifying full-body movement behaviors and vital signs in infants. Proc Natl Acad Sci U S A 2021;118:e2104925118.

17. Mathis A, Mamidanna P, Cury KM, et al. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat Neurosci 2018;21:1281-9.

18. Wu HY, Rubinstein M, Shih E, Guttag J, Durand F, Freeman W. Eulerian video magnification for revealing subtle changes in the world. ACM Trans Graph 2012;31:1-8.

19. Schönberger JL, Frahm JM. Structure-from-motion revisited. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2016. pp. 4104-13.

20. Seo Y, Ishizu T, Enomoto Y, Sugimori H, Aonuma K. Endocardial surface area tracking for assessment of regional LV wall deformation with 3D speckle tracking imaging. JACC Cardiovasc Imaging 2011;4:358-65.

21. Lee K, Ni X, Lee JY, et al. Mechano-acoustic sensing of physiological processes and body motions via a soft wireless device placed at the suprasternal notch. Nat Biomed Eng 2020;4:148-58.

22. Jeong H, Rogers JA, Xu S. Continuous on-body sensing for the COVID-19 pandemic: gaps and opportunities. Sci Adv 2020;6:eabd4794.

23. Ni X, Ouyang W, Jeong H, et al. Automated, multiparametric monitoring of respiratory biomarkers and vital signs in clinical and home settings for COVID-19 patients. Proc Natl Acad Sci U S A 2021;118:e2026610118.

24. Jeong H, Lee JY, Lee K, et al. Differential cardiopulmonary monitoring system for artifact-canceled physiological tracking of athletes, workers, and COVID-19 patients. Sci Adv 2021;7:eabg3092.

25. Kang YJ, Arafa HM, Yoo JY, et al. Soft skin-interfaced mechano-acoustic sensors for real-time monitoring and patient feedback on respiratory and swallowing biomechanics. NPJ Digit Med 2022;5:147.

26. Hiss SG, Treole K, Stuart A. Effects of age, gender, bolus volume, and trial on swallowing apnea duration and swallow/respiratory phase relationships of normal adults. Dysphagia 2001;16:128-35.

27. Yu X, Xie Z, Yu Y, et al. Skin-integrated wireless haptic interfaces for virtual and augmented reality. Nature 2019;575:473-9.

28. Kim JT, Shin HS, Yoo JY, et al. Mechanics of vibrotactile sensors for applications in skin-interfaced haptic systems. Extreme Mech Lett 2023;58:101940.

29. Kim BH, Li K, Kim JT, et al. Three-dimensional electronic microfliers inspired by wind-dispersed seeds. Nature 2021;597:503-10.

30. Yoon HJ, Lee G, Kim JT, et al. Biodegradable, three-dimensional colorimetric fliers for environmental monitoring. Sci Adv 2022;8:eade3201.

31. Ni X, Luan H, Kim JT, et al. Soft shape-programmable surfaces by fast electromagnetic actuation of liquid metal networks. Nat Commun 2022;13:5576.

32. Schanz D, Gesemann S, Schröder A. Shake-the-box: lagrangian particle tracking at high particle image densities. Exp Fluids 2016;57:70.

33. Wang H, Liu Y, Wang S. Dense velocity reconstruction from particle image velocimetry/particle tracking velocimetry using a physics-informed neural network. Phys Fluids 2022;34:017116.

34. Boukhtache S, Abdelouahab K, Berry F, Blaysat B, Grédiac M, Sur F. When deep learning meets digital image correlation. Opt Lasers Eng 2021;136:106308.

35. Liu C, Kim JT, Yang DS, et al. Multifunctional materials strategies for enhanced safety of wireless, skin-interfaced bioelectronic devices (Adv. Funct. Mater. 34/2023). Adv Funct Mater 2023;33:2370203.


© The Author(s) 2024. Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, sharing, adaptation, distribution and reproduction in any medium or format, for any purpose, even commercially, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
