Research Article  |  Open Access  |  20 Nov 2025

The Touch-Code Glove: a multimodal mapping interface with triboelectric-digital encoding for intuitive robot training

Soft Sci. 2025, 5, 59.
10.20517/ss.2025.68 |  © The Author(s) 2025.

Abstract

Current human-robot interaction (HRI) systems for training embodied intelligent robots often suffer from limited motion dimensionality and unintuitive control. This work presents the Touch-Code Glove, a multimodal HRI interface integrating functional materials, structural intelligence, and deep-learning decoding. A triboelectric digital interface is embedded into the Wrist-pad via a mosaic-patterned array of polyamide/polytetrafluoroethylene-doped silicone rubber films, generating polarity-dependent digital signal pairs upon contact. A co-electrode layout enables 16 touch points with minimal wiring, allowing multiplexed, programmable tactile input via sliding or multi-point gestures. Coupled triboelectric signals are accurately decoded using a convolutional neural network and long short-term memory model, achieving over 98% recognition accuracy. Complementarily, a double-network conductive hydrogel composed of sodium alginate, polyacrylamide, and sodium chloride is integrated into the Finger-fibers and the Wrist-pad to provide strain-sensing capabilities with excellent stretchability, high linearity, low hysteresis, and long-term stability. The system incorporates three concurrent sub-mapping strategies: gesture-driven control, wrist posture-based movement, and touch path-guided input, which together enable real-time control of robotic hands and arms without requiring professional training. This triboelectric-hydrogel hybrid interface offers a materials-centric solution for intelligent, wearable, and accessible HRI, paving the way for next-generation multimodal robotic control systems in assistive and industrial applications.

Keywords

Human-robot interface, multimodal sensing, triboelectric encoding, motion mapping, flexible electronics, embodied intelligent robots

INTRODUCTION

Embodied intelligent robots are rapidly transitioning from controlled laboratory settings to real-world deployment[1,2], offering transformative potential across diverse domains[3] such as home service[4,5], hazardous environment operations[6,7], industrial collaboration[8,9], and medical assistance [Figure 1A][10,11]. A central ambition for these robots is to achieve human-level dexterity, enabling them to physically interact with and manipulate their surroundings with the same level of finesse and adaptability as a person[12-14]. To this end, enabling robots to learn directly from human demonstrations has emerged as a powerful and intuitive alternative to traditional programming, allowing them to acquire a diverse range of skills more efficiently. Achieving this capability requires seamless and intuitive human-robot interaction (HRI) interfaces that support natural, human-like demonstrations[15,16].

Figure 1. Proposed Touch-Code Glove based on multimodal sensing and triboelectric digital encoding strategy for training embodied intelligent robots. (A) Schematic illustration of the applications of embodied intelligent robots; (B) Schematic illustration of the three proposed mapping strategies of the Touch-Code Glove; (C) Structure of the Touch-Code Glove; (D) Finger bending perception strategy of the Touch-Code Glove; (E) Wrist bending perception strategy of the Touch-Code Glove; (F) Finger touch perception strategy of the Touch-Code Glove. PA: Polyamide; PTFE: polytetrafluoroethylene.

Various HRI interface solutions have been explored, including data finger cots[17,18], wearable exoskeletons[19,20], and joysticks[21]. However, these approaches often suffer from drawbacks such as limited sensing range, excessive bulk, or lack of intuitive control. In comparison, glove-based systems offer a more balanced solution by combining wearability with precise motion mapping, making them especially well-suited for translating natural hand-arm movements into robotic actions. Yet despite advances in wearable sensing, a fundamental trade-off persists between information dimensionality and interactive intuitiveness. On the one hand, most commercial and academic data gloves focus solely on finger flexion detection, which may suffice for controlling robotic hands but fails to capture arm movements, rendering them inadequate for coordinating complex “motion-manipulation” tasks[22-24]. On the other hand, attempts to increase dimensionality often rely on calibration-sensitive inertial measurement units (IMUs)[25] or external vision systems[26] to track wrist or arm posture. These solutions sacrifice wearability, increase system complexity, and limit portability. More critically, current glove interfaces rarely offer an intuitive, programmable input channel for issuing discrete commands, similar to how users interact with smartphones through simple touch gestures. This gap leaves robots with insufficient access to human intent, especially for triggering task-level instructions such as “initiate cleaning” or “switch mode”.

To overcome the limitations of current HRI interfaces in posture tracking and command input, there is a pressing need for a complementary multimodal solution that can continuously capture high-dimensional limb movements (including finger and wrist postures), intuitively receive programmable discrete commands, and remain lightweight, self-contained, and independent of external systems. Flexible electronics, especially electronic skins (E-skins), provide a promising foundation for achieving this solution[27-31]. Thanks to their inherent softness, stretchability, low weight, and design versatility, E-skins are highly suitable for wearable platforms[32-35]. Their diverse sensing mechanisms support the detection of both static and dynamic human motion signals, enabling rich behavioral perception[36,37] and precise motion mapping[38,39]. Beyond this, the integration of pad-based tactile sensors introduces a crucial advancement in expanding the functionality of glove-based interfaces. These sensors can convert touch signals from finger taps into programmable commands, offering an intuitive input modality similar to how users interact with touchscreens. Tactile pads based on triboelectric nanogenerator (TENG) sensing mechanisms are particularly advantageous due to their customizable electrode configurations, reduced wiring complexity, and robust sensing performance under mechanical deformation[40,41]. By serving as discrete control inputs that complement continuous motion tracking, pad-based components enhance the expressiveness, interactivity, and usability of wearable HRI interfaces, paving the way toward more intelligent and autonomous robotic control systems[42-46].

In response to this challenge, we present a highly integrated and multimodal HRI glove, which we refer to as the “Touch-Code Glove”. This glove is not simply a collection of sensors; it represents the outcome of synergistic innovations in materials, structural design, and signal encoding. First, to achieve high-dimensional yet lightweight posture tracking, ionically conductive hydrogel-based sensors were seamlessly integrated into the “Finger-fibers” and the “Wrist-pad”. This unified sensing mechanism allows for the concurrent and continuous capture of both finger bending and wrist flexion/extension, enabling intuitive, linked control of a dexterous hand and robotic arm. Second, we embedded a flexible triboelectric sensing array into the Wrist-pad to enable discrete, programmable user input. Third, and most crucially, we developed a triboelectric-digital encoding strategy. By tailoring the polarity of the triboelectric materials, physical touch events are directly transduced at the hardware level into “1” and “-1” digital signals. This strategy not only simplifies signal processing but also transforms complex touch gestures (e.g., sliding, multi-point contact) into a programmable information matrix. Furthermore, an artificial intelligence (AI)-powered algorithm based on a convolutional neural network-long short-term memory (CNN-LSTM) model is employed to decode complex, coupled signals from multi-point touches, effectively converting the signal coupling phenomenon from a challenge into a source of higher-order information. The result is a powerful and intuitive interface that allows an untrained operator to seamlessly control three distinct modalities with a single hand: manipulating a dexterous hand through finger posture, adjusting the robotic arm’s elevation using wrist motion, and planning its path or initiating tasks through touch gestures on the Wrist-pad. We demonstrate the system’s efficacy through a complex pick-transport-and-place task, validating its potential to significantly lower the barrier for robot training. This work presents a new paradigm for intelligent and low-barrier HRI interfaces, paving the way for more capable and accessible embodied robots in our daily lives.

EXPERIMENTAL

Materials

Sodium alginate (SA), acrylamide (AM), sodium chloride (NaCl), N, N’-methylene-bisacrylamide (MBAA), n-hexane, benzophenone (BP), and ethanol were purchased from Shanghai Aladdin Biochemical Technology Co., Ltd. Irgacure 2959 was purchased from Shanghai Huayueye Technology Co., Ltd. Silicone rubber was purchased from Boutique Co., Ltd. Dowsil 734 was purchased from Shanghai Songjin Co., Ltd. Polyamide (PA) and polytetrafluoroethylene (PTFE) particulates were purchased from Kexinda Polymer Materials Co., Ltd.

Modification of silicone rubber’s electronegativity

PA (diameter 200 μm) and PTFE particulates (diameter 0.1 μm) were each mixed with silicone rubber prepolymer at a mass ratio of 2:10 and stirred for 2 min. Subsequently, 0.5 mL of n-hexane was added to the mixture, which was then dispersed using a JY92-IIDN ultrasonic cell disruptor at 15% power for 2 min.

Preparation of hydrogel prepolymer solution

(i) SA particles were added to the NaCl solution. The mixture was stirred at room temperature using a magnetic stirrer (SN-MSY-2HD, SUNNE Co., Ltd.) for 8 h to fully swell the SA in the NaCl solution; (ii) Manual stirring continued at 50 °C for 0.5 h to dissolve the SA completely; (iii) A transparent and uniformly stable NaCl/SA composite solution was obtained and allowed to sit overnight at room temperature; (iv) The AM monomer was added and stirred for 2 h; (v) The solution was then cooled in an ice-water bath; (vi) Crosslinker MBAA and Irgacure 2959 were added, stirred for 10 min, and then centrifuged for 10 min to obtain a uniformly transparent solution.

Fabrication of Finger-fiber

(i) After evacuating the mixed solutions for 5 min, the silicone rubber solution was drawn using a pipette and dropped into the mold; (ii) The solution was evacuated again for 2 min, and then heated at 80 °C for 20 min; (iii) The products were demolded to obtain the flexible substrate; (iv)-(vi) Using silicone rubber as the prepolymer, steps (i)-(iii) were repeated to fabricate the flexible encapsulation; (vii) Surface hydrophilicity treatment was performed on the flexible substrate: the flexible substrate was placed in an oxygen plasma cleaner (PDC-MG, Mingheng Co., Ltd.) with the channel side facing upward and treated at 250 W for 20 min; (viii) Surface hydrogen bonding treatment was performed on the flexible substrate and flexible encapsulation: both parts were immersed in BP solution (10 wt% in ethanol) for 2 min; (ix) The treated parts were then dried with nitrogen gas; (x) The hydrogel prepolymer solution was dropped onto the flexible substrate and covered with the flexible encapsulation; (xi) A UV lamp (power 20 W, Gaodeng Co., Ltd.) with a wavelength of 365 nm was used to irradiate the assembly for 10 min; (xii) The Finger-fiber was obtained. The detailed schematic of the fabrication process is shown in Supplementary Figure 1.

Fabrication of Wrist-pad

(i)-(viii) followed a process similar to the fabrication of the Finger-fiber, with the main difference being the replacement of the flexible encapsulation with a triboelectric film (T-film). The T-film was fabricated by dropping PA-silicone rubber solution and PTFE-silicone rubber solution into designated areas of the mold, followed by vacuum degassing and thermal curing; (ix) The top and bottom pads were aligned; (x) A support layer was applied along the edges to fix the top and bottom pads together; (xi) An anti-static silicone prepolymer layer was coated along the outermost edge and air-dried to cure, resulting in the formation of the Wrist-pad. The detailed schematic of the fabrication process is shown in Supplementary Figure 2.

Assembly of the Touch-Code Glove

The Finger-fiber is placed on the glove surface, with its signal wires routed through reserved wire routing holes and drawn out from the inside. Silicone adhesive is used to fix the Finger-fiber to the glove, ensuring mechanical stability and flexibility. The Wrist-pad is arranged on the wrist surface, with its signal wires partly routed through reserved holes and partly drawn directly from the glove surface. All wires are finally bundled with protective tubing for reliable data acquisition. The Wrist-pad is also fixed to the glove using silicone adhesive. This integrated design ensures robust structural integration, stable signal transmission, and comfortable wearability during continuous HRI. The detailed schematic of the assembly process is shown in Supplementary Figure 3.

RESULTS AND DISCUSSION

A multimodal human-robot interface architecture

The Touch-Code Glove achieves the mapping from the human hand and arm to the structure of a robotic arm integrated with a dexterous hand by employing three sub-mapping strategies, as shown in Figure 1B. Mapping I maps the finger bending posture data collected by the Touch-Code Glove to the dexterous hand, enabling it to replicate human hand gestures. Mapping II maps the wrist bending posture data to the joints of the robotic arm, allowing the arm to move up and down in response to the human wrist’s motion. Mapping III encodes the finger touch path data collected by the Touch-Code Glove into programmable commands, which are then mapped to the robotic arm’s joints, enabling the control of joint rotation based on the touch path of the human finger. The finger part of the Touch-Code Glove is equipped with a Finger-fiber, which is responsible for detecting finger bending motions, while the wrist part is equipped with a Wrist-pad, which senses wrist bending movements and the touch path of the fingers [Figure 1C]. A photograph of the Touch-Code Glove is provided in Supplementary Figure 4A.

The capability to track fine finger motions is provided by five custom-fabricated Finger-fibers, one for each digit. Each Finger-fiber is a soft, composite structure containing a highly conductive, double-network hydrogel as its core sensing element [Supplementary Figure 4B]. These Finger-fibers function as robust strain sensors that translate the mechanical bending of finger joints into precise electrical resistance changes [Figure 1D].

The second key component is the multifunctional Wrist-pad, which is responsible for both wrist posture tracking and touch-based command input. This pad uniquely integrates two distinct sensing modalities. The embedded double-network hydrogel functions as a strain sensor for wrist motion, producing positive resistance changes under flexion (ΔR/R0 > 0) and negative changes under extension (ΔR/R0 < 0), thereby generating polarity-specific responses [Figure 1E and Supplementary Figure 5]. Its internal surface is laminated with a triboelectric tactile array, which serves as a programmable interface for gesture-level control. As illustrated in Figure 1F, this array employs a novel digital encoding scheme to translate touch interactions into discrete commands. The structural schematic and layered presentation of the Wrist-pad, as well as its physical photograph, are detailed in Supplementary Figure 4C and D.

Triboelectric-digital encoding for tactile commands

The programmable interaction capabilities of the Touch-Code Glove are primarily enabled by the triboelectric tactile system embedded within its Wrist-pad. This system’s design is centered on our triboelectric-digital encoding strategy. To achieve this, T-films composed of PA- and PTFE-doped silicone rubbers are arranged to form sixteen TENG-based touch points [Figure 2A]. Upon mechanical contact, the touch point converts tactile input into electrical signals via the combined effects of electrostatic induction and the triboelectric effect, as illustrated in Figure 1F(i). The deliberately engineered opposing polarities of these materials cause a touch event to generate a distinct, waveform-encoded signal: a peak-valley shape for “1” and a valley-peak shape for “-1” [Figure 1F(ii) and 2B]. This hardware-level digitization method provides a robust foundation for interpreting user intent, which is then validated through extensive performance characterization [Figure 2C] and path tracking demonstrations [Figure 2D].

Figure 2. Triboelectric-digital encoding strategy of Touch-Code Glove finger touch interactions. (A) Naming convention for touch points; (B) (i) The open-circuit voltage characteristics of the touch point under different contact pressures; (ii) The open-circuit voltage waveform of the touch point under a contact pressure of 2 kPa; (iii) An enlarged view of the open-circuit voltage waveforms of the touch point under a contact pressure of 2 kPa; (iv)-(vi) SEM images of silicone rubber, PTFE-silicone rubber, and PA-silicone rubber; (C) (i) Sensitivity of the touch point; (ii) The open-circuit voltage generated by different materials in contact with the touch point surface; (iii) The open-circuit voltage at 0° to 60° bending angle of the touch point; (D) Schematic illustration of the touch path recognition capability of the Wrist-pad. SEM: Scanning electron microscopy; PTFE: polytetrafluoroethylene; PA: polyamide.

The Wrist-pad adopts a 4 × 4 matrix layout, as illustrated in Figure 2A, Supplementary Figure 4C and D. Sixteen sensing nodes (P1-P16) are formed by the intersections of four top-pad T-films connected to vertical channels (C1-C4) and four bottom-pad T-films connected to horizontal channels (C5-C8). Each T-film is paired with a Touch-hydrogel to serve as a co-electrode for triboelectric signal transmission. This cross-point architecture was deliberately chosen for its high wiring efficiency, allowing all sixteen points to be independently addressed with only eight signal lines, which is critical for integration into a compact, wearable platform.
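As a minimal illustration of this addressing scheme, the sketch below maps a node index to its channel pair, assuming the row-major P1-P16 numbering of Figure 2A; the function name is ours, introduced only for illustration.

```python
def point_to_channels(p: int) -> tuple[str, str]:
    """Map touch point P1..P16 to its (column, row) channel pair,
    assuming row-major numbering: P1-P4 sit on row channel C5, etc."""
    col = (p - 1) % 4       # 0..3 -> vertical channels C1..C4
    row = (p - 1) // 4      # 0..3 -> horizontal channels C5..C8
    return f"C{col + 1}", f"C{row + 5}"

# Consistency check: P14 and P15 (used in Figure 3B) share co-channel C8.
assert point_to_channels(14) == ("C2", "C8")
assert point_to_channels(15) == ("C3", "C8")
```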

To enable digital encoding at the hardware level, the T-films of the Touch-Code Glove are fabricated from silicone rubber composites doped with either tribopositive PA or tribonegative PTFE particles. The top-pad incorporates PA-silicone rubber, while the bottom-pad uses PTFE-silicone rubber, thereby establishing opposing triboelectric polarities at each cross-point node (touch point). Upon contact and release, this asymmetric configuration generates a pair of mirror-symmetric voltage signals: PA-silicone rubber consistently produces a valley-peak waveform, whereas PTFE-silicone rubber yields a peak-valley waveform, as illustrated in Figure 2B(ii). To evaluate the signal characteristics of the touch point, tests were performed under contact pressures ranging from 0.5 to 7 kPa [Figure 2B(i)]. With increasing pressure, both PA-silicone and PTFE-silicone rubbers showed rising absolute values in peak and valley voltages. The valley of PA-silicone rubber closely matched the peak of PTFE-silicone rubber, and vice versa, indicating minimal charge loss and efficient surface charge transfer. This pressure-dependent voltage enhancement is attributed to increased contact area from the deformation of hemispherical tactile structures. Above 5 kPa, the response plateaued, suggesting saturation of the contact interface. At 2 kPa, the voltage waveforms display clear polarity inversion between the two materials: PA-silicone generates valley-peak signals, while PTFE-silicone produces peak-valley signals, with nearly equal absolute magnitudes. This mirror symmetry confirms reliable charge transfer with minimal signal degradation. Based on these stable waveform patterns, a peak-valley signal is defined as digital “1”, a valley-peak signal as “-1”, and the absence of signal as “0”, forming a robust mechanical-to-digital encoding scheme. Representative waveforms for “1” and “-1” are shown in Figure 2B(iii). The surface morphologies of silicone rubber, PTFE-silicone rubber, and PA-silicone rubber are shown in the scanning electron microscopy (SEM) images in Figure 2B(iv)-(vi).
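The encoding rule itself is simple enough to state in a few lines of signal processing. The sketch below is a minimal threshold-based classifier for a single, uncoupled waveform; the 0.2 V noise floor is an illustrative assumption, not a value from the paper.

```python
import numpy as np

def encode_waveform(v: np.ndarray, noise_floor: float = 0.2) -> int:
    """Classify one open-circuit voltage trace: peak-valley -> 1,
    valley-peak -> -1, no significant signal -> 0.
    noise_floor is an assumed threshold, not a reported value."""
    if np.max(np.abs(v)) < noise_floor:
        return 0                                  # digital "0": no touch
    # A peak-valley waveform means the maximum precedes the minimum in time.
    return 1 if np.argmax(v) < np.argmin(v) else -1
```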

The touch sensing performance of the touch point was evaluated under varying pressures and contact materials. The PA-silicone rubber peak voltage was used for calibration, showing a sensitivity of 0.29 V/kPa from 0 to 6 kPa, which dropped to 0.01 V/kPa beyond 6 kPa [Figure 2C(i)]. Tests with human skin, wood, and plastic rubber under 1 kPa revealed consistent open-circuit voltages [Figure 2C(ii)], confirming that the bilayer co-electrode structure ensures stable signal output regardless of contact material. To assess the influence of bending on touch sensing performance, the touch point was tested at bending angles from 0° to 60° under an applied pressure of 1 kPa. The open-circuit voltage remained relatively stable across this range [Figure 2C(iii)]. As the bending angle increased, a slight upward trend was observed in the peak voltage signal of the PA-silicone rubber. This is attributed to the increased resistance of the W-hydrogel element within the Wrist-pad during bending, which leads to a higher open-circuit voltage. However, this minor increase does not affect the tactile sensing performance, thanks to the triboelectric digital encoding strategy.

The Wrist-pad’s digital encoding strategy enables accurate recognition of both dynamic and multi-point touch gestures, as demonstrated in Figure 2D. Sequential swipes across the matrix, including linear movements from P8 to P5 [Figure 2D(i)], diagonal paths from P13 to P4 [Figure 2D(ii)], and curved trajectories from P10 to P11 through P6 and P7 [Figure 2D(iii)], are recorded as distinct temporal sequences of 1 × 8 matrix digital codes. For example, a single finger slides from P8 to P5, sequentially activating P8 (0,0,0,-1,0,1,0,0), P7 (0,0,1,0,0,-1,0,0), P6 (0,-1,0,0,0,1,0,0), and P5 (1,0,0,0,0,-1,0,0), generating distinct open-circuit voltage signals aligned with each touch point. The system also supports parallel touch detection. In Figure 2D(iv), simultaneous contacts on multiple nodes produce composite codes that represent the superimposed response from multiple touch inputs. These capabilities enable a rich, gesture-level control vocabulary comparable to keyboard shortcuts and form the foundation for advanced HRI interfaces. However, as multi-point interactions become denser, signal coupling arises and presents a significant challenge.
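Reconstructing a swipe from the recorded codes then reduces to locating the nonzero column and row entries of each 1 × 8 vector. The sketch below decodes the P8 → P5 example above using the exact codes quoted in the text; it handles clean single-touch codes only, while coupled multi-touch codes are left to the learned decoder discussed below.

```python
def decode_point(code) -> int:
    """Decode a single-touch 1x8 code (entries C1..C8) into a point P1..P16."""
    col = next(i for i in range(0, 4) if code[i] != 0)   # C1-C4 -> column
    row = next(i for i in range(4, 8) if code[i] != 0)   # C5-C8 -> row
    return (row - 4) * 4 + col + 1

swipe = [(0, 0, 0, -1, 0, 1, 0, 0),   # P8
         (0, 0, 1, 0, 0, -1, 0, 0),   # P7
         (0, -1, 0, 0, 0, 1, 0, 0),   # P6
         (1, 0, 0, 0, 0, -1, 0, 0)]   # P5
assert [decode_point(c) for c in swipe] == [8, 7, 6, 5]
```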

Electrostatic simulations using COMSOL Multiphysics were conducted to investigate electrostatic induction and potential distribution in coupled touch configurations under varying separation distances. Results confirm localized charge behavior and stable zero-potential zones between triboelectric surfaces [Figure 3A]. For detailed simulation procedures and analysis, see Supplementary Text 1 and Supplementary Figure 6. Regarding the COMSOL simulation and parametric analysis of geometric effects on triboelectric performance, see Supplementary Text 2 and Supplementary Figures 7-9, which demonstrate that optimized structural parameters enhance overall device performance. The signal coupling phenomenon, detailed in Figure 3B and C, results in complex, often unpredictable, composite waveforms. Such waveform ambiguity poses a significant challenge to conventional threshold-based detection methods and threatens to corrupt the integrity of our digital encoding scheme. Figure 3B(i) illustrates the triboelectric signal coupling mechanism enabled by the Wrist-pad architecture. When a trainer’s finger simultaneously contacts two neighboring sensing nodes, such as P14 and P15 located at the intersections of co-channel C8 with columns C2 and C3, respectively, distinct triboelectric signals are generated due to the interaction between the corresponding triboelectric material pairs. These signals, captured independently from C2 and C3, are defined as “simple signals” and exhibit opposite polarity (“±1”). The co-channel design of C8 allows for the integration of the inverse signals of these simple signals into a “coupled signal”, which encapsulates higher-order touch information. In Figure 3B(ii), simultaneous touches at P2 and P10 generate two “1” signals on C5 and C7, which combine into a complex waveform “n” on C2.

Figure 3. Coupled triboelectric signal encoding and recognition strategy of the Wrist-pad. (A) Simulated potential distribution of two touch points; (B) (i) Schematic illustration of the Wrist-pad signal coupling mechanism; (ii)-(v) Examples of coupled signal generation under different touch conditions; (C) (i) and (ii) Waveform complexity increases as more simple signals are involved in the coupling process, resulting in unpredictable coupled waveforms; (D) (i) Collection of 20 sample signals for each of the four coupling categories (1-point to 4-points); (ii) Confusion matrix of the CNN-LSTM classification model trained to recognize different coupling cases. CNN-LSTM: Convolutional neural network-long short-term memory; PA: polyamide; PTFE: polytetrafluoroethylene.

Similarly, Figure 3B(iii) shows two “-1” signals from P1 and P9 combining into “n” on C1. In Figure 3B(iv), touches at P1 (“-1”) and P5 (“1”) produce opposite signals on C5 and C6, which partially cancel out, resulting in a weak or unclear signal on C1. Figure 3B(v) presents an asynchronous case, where P2 and P10 are touched sequentially. Despite the time gap, both simple signals are still captured and successfully combined into a valid coupled signal. These cases demonstrate that the coupling process may either enhance or weaken waveform complexity depending on signal phase and polarity. Overall, the Wrist-pad’s coupling design supports both synchronous and asynchronous signal integration, enabling robust and accurate touch decoding. Figure 3C illustrates how coupled signals are generated by integrating multiple simple signals with different waveform patterns. In Figure 3C(i), the coupled signal consists of two peak-valley (“1”) signals and one valley-peak (“-1”) signal. Figure 3C(ii) shows a more complex case with two “1” and two “-1” signals. As more signals participate, the resulting waveform becomes increasingly complex and unpredictable. This highlights the limitations of direct pattern-based interpretation and underscores the need for data-driven classification methods to reliably distinguish signal types.
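To make this superposition picture concrete, the toy sketch below forms the inverted sum that a shared co-channel would carry when two events overlap; the waveform shape is a synthetic stand-in for the measured traces in Figure 3B and C, and the event timings are arbitrary.

```python
import numpy as np

def simple_event(t: np.ndarray, t0: float) -> np.ndarray:
    """Synthetic peak-valley ("1") event starting at time t0 (toy shape only)."""
    u = np.clip(t - t0, 0.0, None)
    return np.sin(2 * np.pi * 4 * u) * np.exp(-8.0 * u)

t = np.linspace(0.0, 1.0, 500)
# A co-channel integrates the inverse of the simple signals it collects:
# same-polarity events compound into a complex composite waveform, while
# opposite-polarity events partially cancel, as in Figure 3B(iv).
coupled_same = -(simple_event(t, 0.10) + simple_event(t, 0.18))
coupled_opposite = -(simple_event(t, 0.10) - simple_event(t, 0.18))
```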

To overcome this challenge, we transformed it into an opportunity. Instead of attempting to suppress the coupled signals, we developed an AI-powered strategy to decode the rich information embedded within them. We employed a CNN-LSTM network, a model adept at learning spatio-temporal features from complex time-series data. As demonstrated by the high-accuracy confusion matrix in Figure 3D, the network can robustly classify coupled signals from up to four simultaneous touch points with an average accuracy exceeding 98%. This result validates our approach: by embracing the complexity of signal coupling and leveraging deep learning, we can dramatically enhance the information capacity of the tactile system. The construction method of the CNN-LSTM network is detailed in Supplementary Text 3, Supplementary Figure 10, and Supplementary Table 1. Additional validation of the CNN-LSTM through dataset scaling and cross-user experiments is presented in Supplementary Text 4, Supplementary Tables 2 and 3, Supplementary Figures 11 and 12, confirming its scalability and robustness across diverse users. Furthermore, the conflict-resolution strategy for multi-point coupling in the Wrist-pad is described in Supplementary Text 5 and Supplementary Figure 13, which further ensures reliable decoding and prevents false triggers.
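For readers who want a concrete starting point, the sketch below shows a CNN-LSTM of the general kind described, written in PyTorch; the layer widths, kernel sizes, and four-way output are illustrative assumptions, and the authors' exact architecture is specified in Supplementary Text 3.

```python
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    """CNN front-end for local waveform features, LSTM for temporal context,
    classifying a coupled trace into 1- to 4-point touch classes (a sketch)."""
    def __init__(self, n_classes: int = 4):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, samples), one voltage trace per co-channel
        feats = self.cnn(x).transpose(1, 2)      # (batch, time, features)
        _, (h_n, _) = self.lstm(feats)
        return self.head(h_n[-1])                # class logits

logits = CNNLSTM()(torch.randn(8, 1, 256))       # e.g., 8 traces of 256 samples
```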

Hydrogel-based sensing for kinematic posture tracking

To enable high-fidelity mapping of human posture, the Finger-fibers and the resistive element of the Wrist-pad must provide reliable and responsive strain sensing. Our design leverages a custom-developed double-network hydrogel, composed of SA, polyacrylamide (PAM), and NaCl. The crosslinked double-network architecture provides exceptional mechanical robustness and flexibility, while the mobile Na+ and Cl- ions ensure high ionic conductivity. This combination results in a superior strain-sensing material. The sensing mechanism is based on the principle that physical deformation alters the hydrogel’s geometry, leading to a measurable change in ionic resistance. This section characterizes the performance of this mechanism in detail [Figure 4].
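A minimal sketch of this mechanism, assuming an ideally incompressible (volume-conserving) conductor of resistivity $\rho$, length $L$, and cross-sectional area $A$:

$$R = \rho\frac{L}{A}, \qquad AL = A_0 L_0 \;\Rightarrow\; \frac{R}{R_0} = \left(\frac{L}{L_0}\right)^{2} = (1+\varepsilon)^{2}, \qquad \frac{\Delta R}{R_0} = 2\varepsilon + \varepsilon^{2}$$

so the relative resistance change grows nearly linearly ($\approx 2\varepsilon$) at small strains. Real hydrogels only approximate this ideal picture, but it captures why elongation produces the monotonic, near-linear resistance responses reported below.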

Figure 4. Strain sensing performance of the Touch-Code Glove. (A) (i) Relative resistance change rate of the Wrist-hydrogel under wrist flexion and extension; (ii) Bidirectional linear relationship between bending angle and relative resistance change rate; (iii) Hysteresis behavior of the Wrist-pad within a 90° bending range; (iv) SEM image showing the double-network microstructure of the Wrist-hydrogel; (v) Evaluation of signal stability under simultaneous wrist bending and tactile events; (vi) Repeatability test of wrist bending sensing over 10,000 s; (vii) Stepwise signal response under discrete wrist bending angles; (B) (i) Relative resistance change rate of the Finger-fiber under finger bending; (ii) Linear correlation between finger bending angle and relative resistance change rate; (iii) Hysteresis behavior of the Finger-fiber during cyclic deformation; (iv) Dynamic response characteristics during bending and recovery; (v) Multi-finger gesture recognition using the Finger-fibers, showing distinct signal patterns from the thumb, index, middle, ring, and little fingers. SEM: Scanning electron microscopy; DH: dissipated hysteresis.

The Wrist-pad is capable of effectively distinguishing wrist extension and wrist flexion by detecting variations of relative resistance change rate (ΔRWrist/RWrist_0) in response to bending angle (θ) changes, as shown in Figure 4A(i). The ΔRWrist/RWrist_0 of the Wrist-hydrogel increases during wrist extension and decreases during wrist flexion, corresponding to opposite signal trends. The excellent linearity ensures a direct, predictable mapping between the user’s wrist angle and the robot arm’s movement, which is a key requirement for intuitive control. The linearity of the Wrist-pad was validated by measuring the ΔRWrist/RWrist_0 across wrist bending angles from 0° to 90°. The results showed clear linear trends for both extension and flexion movements, with directional sensitivities of 9.90 and -9.83, respectively. The high coefficients of determination (R2 = 0.99 for extension and R2 = 0.98 for flexion) confirm the strong linear correlation between wrist angle and sensor output [Figure 4A(ii)]. The minimal hysteresis guarantees positional accuracy and prevents signal drift during prolonged operation, enhancing the system’s reliability. As shown in Figure 4A(iii), the Wrist-pad demonstrates a low bidirectional hysteresis of only 1.16% within a 90° bending range. This stability is attributed to the double-network hydrogel, which enhances mechanical integrity under strain [Figure 4A(iv)]. To demonstrate the effective decoupling between touch input and wrist bending, which is essential for accurate signal interpretation during dynamic real-world operation, the multimodal sensing reliability of the Wrist-pad was further validated. Touch tests at 1 kPa were performed while bending the wrist to 20°, 40°, and 60° in both directions. Despite the simultaneous deformation, the resistance signals remained stable [Figure 4A(v)]. In addition, the repeatability of the bending sensing was assessed under dynamic loading over 10,000 s, during which the ΔRWrist/RWrist_0 remained stable with negligible offset [Figure 4A(vi)]. Finally, the stepwise signal response in Figure 4A(vii) highlights the Wrist-pad’s capacity to distinguish fine variations in wrist bending angle with high resolution and low signal crosstalk between wrist extension and wrist flexion motions. The analysis of resistance changes in the hydrogel during deformation is provided in Supplementary Text 6. The Wrist-pad exhibits an ultralow limit of detection of 0.03°, ensuring reliable recognition of subtle wrist posture variations [Supplementary Text 7 and Supplementary Figure 14].
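Inverting these fits yields the angle estimate directly. The sketch below is a minimal version under the reported slopes; the function name and the unit conventions (taken to follow Figure 4A(ii)) are our assumptions.

```python
S_EXT, S_FLEX = 9.90, -9.83   # directional sensitivities from Figure 4A(ii)

def wrist_angle(dr_over_r0: float) -> float:
    """Estimate a signed wrist angle from a measured dR_wrist/R_wrist_0:
    positive = extension, negative = flexion (units follow the linear fits)."""
    if dr_over_r0 >= 0:
        return dr_over_r0 / S_EXT       # extension branch (R2 = 0.99)
    return dr_over_r0 / (-S_FLEX)       # flexion branch (R2 = 0.98)
```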

The Finger-fiber integrated in the Touch-Code Glove enables accurate detection of finger bending by monitoring variations in relative resistance change rate (ΔRFinger/RFinger_0) in response to finger bending angle (β) [Figure 4B(i)]. To evaluate the strain sensing performance of the Finger-fiber for precise finger posture tracking, relative resistance change rates were measured during finger bending from 0° to 90°. As shown in Figure 4B(ii), the ΔRFinger/RFinger_0 increases monotonically with bending angle, exhibiting a strong linear correlation. The Finger-fiber exhibited a sensitivity of 71.45 and a high R2 of 0.99, confirming its accuracy and reliability throughout the entire bending range. Minimal hysteresis ensures consistent positional output and long-term signal stability, thereby improving the overall reliability of the system. Figure 4B(iii) illustrates the hysteresis performance of the Finger-fiber under cyclic bending. The device exhibits minimal hysteresis (DH = 1.43%) during bending and releasing, indicating stable performance. As a hydrogel-based HRI system, the Touch-Code Glove was extensively tested under varying temperature and humidity conditions to evaluate long-term stability and environmental durability, with detailed results provided in Supplementary Texts 8 and 9 and Supplementary Figures 15-20, confirming consistent performance under both standard and harsh environments.
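The dissipated hysteresis (DH) values quoted here are, in the usual convention for strain sensors, the normalized area enclosed between the loading and unloading branches of the response curve; the expression below is that standard definition, offered for orientation rather than as a restatement of the authors' exact protocol:

$$\mathrm{DH} = \frac{A_{\text{load}} - A_{\text{unload}}}{A_{\text{load}}} \times 100\%$$

where $A_{\text{load}}$ and $A_{\text{unload}}$ are the areas under the loading and unloading curves of ΔR/R0 versus bending angle.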

The sub-second response time is critical for real-time robotic grasping, eliminating perceptible lag between the user’s action and the robot’s reaction. The dynamic response behavior is shown in Figure 4B(iv), where a sharp resistance change occurs within 0.39 s during bending and recovers within 0.47 s upon release, reflecting a fast response. To demonstrate the Finger-fiber’s practical applicability, Figure 4B(v) shows a sequence of nine distinct hand gestures performed while wearing the Touch-Code Glove. Each finger, including the thumb, index, middle, ring, and little fingers, generates distinguishable resistance signals, enabling precise multi-finger gesture recognition. This validates the Finger-fiber’s capability for high-resolution, real-time finger motion detection. The comparison between the Touch-Code Glove and recent advances in multimodal HRI and flexible triboelectric interfaces is provided in Supplementary Table 4.

Validation of decoupled multimodal mapping

Having validated the high performance of the individual tactile and strain sensing modalities, we now present the unified framework that integrates these data streams into intuitive robotic commands [Figure 5A]. The system employs three concurrent mapping strategies: Mapping I (Dexterous grasping) translates the decoded signals from the five Finger-fibers into corresponding motions of the dexterous hand’s joints. Mapping II (Arm elevation) utilizes the resistive signal from the Wrist-pad to control the vertical movement of the robotic arm. Mapping III (Path commands) leverages the digitally encoded signals from the Wrist-pad’s triboelectric array to guide the arm’s path or trigger specific tasks. The overall motion mapping framework is supported by integrated modules for signal acquisition, processing, recognition, and digitization, enabling high-fidelity transfer of human motions to robotic systems for intelligent operation across various scenarios, including industrial automation, household assistance, and HRI.
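A minimal dispatch loop illustrating how the three concurrent streams could be combined is sketched below; every read_* function and robot-side setter is a hypothetical placeholder (not an API from the paper), and finger_angle is an assumed inverse of the Finger-fiber calibration, analogous to wrist_angle above.

```python
def control_step(robot):
    """One hypothetical update combining the three mappings (all names
    below are placeholders standing in for the acquisition/robot APIs)."""
    # Mapping I: five Finger-fiber dR/R0 readings -> dexterous-hand joints
    robot.hand.set_joints([finger_angle(x) for x in read_finger_fibers()])

    # Mapping II: Wrist-pad strain reading -> robotic-arm elevation
    robot.arm.set_elevation(wrist_angle(read_wrist_strain()))

    # Mapping III: buffered 1x8 touch codes -> path or task commands
    codes = read_touch_codes()
    if codes:
        robot.arm.follow_path([decode_point(c) for c in codes])
```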

Figure 5. Validation of the decoupled multimodal mapping strategies. (A) Schematic of the unified mapping framework; (B) Validation of Mapping I (Finger-to-Hand), showing gesture mimicry and the confusion matrix for gesture recognition; (C) Validation of Mapping II (Wrist-to-Arm), demonstrating real-time correspondence between wrist motion and arm elevation; (D) Validation of Mapping III (Touch-to-Path), showcasing robotic path following guided by touch inputs.

To demonstrate the practical feasibility of this mapping method, we individually validated each mapping strategy. First, a human trainer wearing the Touch-Code Glove performed predefined gestures, which were sensed by the Finger-fibers, converted into feature-rich signals, and transmitted to the robotic system to drive the dexterous hand in mimicking the corresponding gestures [Figure 5B(i)]. Due to variations in hand size and gesture habits among individuals, the signals generated from the same gesture can differ. To address this signal variability and ensure robust recognition, we propose a one-dimensional convolutional neural network (1D-CNN) for accurate gesture recognition using the Touch-Code Glove [Supplementary Text 10, Supplementary Figure 21 and Supplementary Table 5]. The high classification accuracy demonstrated by the confusion matrix confirms the model’s efficacy in decoding user intent from the raw sensor data [Figure 5B(ii)]. Furthermore, to illustrate the real-time motion mapping between the human and the robotic arm systems, Figure 5C presents the mapping information between wrist flexion/extension and the movement of the robotic arm. During wrist extension, the robotic arm rises, and during wrist flexion, it lowers. The measured ΔRWrist/RWrist_0 signal reflects a clear bidirectional trend, consistent with the wrist posture and robotic output, validating the effectiveness of the Touch-Code Glove’s motion mapping strategy in real-time robotic control. To validate the feasibility of Mapping III, we conducted a series of demonstrations in which the user guided the robotic arm by executing touch interactions on the triboelectric array embedded within the Wrist-pad [Figure 5D]. Upon contact, distinct digital codes were generated based on the triboelectric polarity contrast between electrode layers, with each activated node encoded as a “1”, “-1”, or “0”. These codes were arranged sequentially to form directional command strings that determined the motion path of the robotic arm in real time [Figure 5D(i)]. As shown in Figure 5D(ii), the robotic arm successfully followed the touch path outlined by the user, transitioning smoothly across multiple spatial locations. The composite code sequences reflected the spatial and temporal characteristics of the touch path, enabling intuitive, gesture-free control over robotic motion. This confirms the efficacy of Mapping III in translating discrete tactile interactions into structured commands for intelligent task execution. Since HRI requires multi-user generalizability, both the CNN-LSTM and 1D-CNN models were systematically evaluated, with detailed results provided in Supplementary Text 11, Supplementary Figures 22 and 23, Supplementary Tables 6 and 7, confirming reliable recognition performance across diverse users. More instructions, explanations, and the logic of command editing are provided in Supplementary Text 12 and Supplementary Figures 24 and 25. The construction of all signal acquisition circuits, including the anti-electromagnetic interference and anti-static protection methods, is detailed in Supplementary Text 13, Supplementary Figures 26 and 27. All the mapping demonstrations are detailed in Supplementary Movie 1.
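As with the CNN-LSTM above, a minimal PyTorch sketch conveys the shape of such a 1D-CNN; the five input channels match the Finger-fibers and the nine output classes match the gesture set of Figure 4B(v), while the depths and kernel sizes are our assumptions (the authors' model is specified in Supplementary Text 10).

```python
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    """Sketch of a 1D-CNN over the five Finger-fiber channels."""
    def __init__(self, n_gestures: int = 9):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(5, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(64, n_gestures),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)    # x: (batch, 5, samples) -> gesture logits

logits = GestureCNN()(torch.randn(4, 5, 200))   # four windows of 200 samples
```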

Synergistic control for complex robotic manipulation

To demonstrate the holistic capabilities of the Touch-Code Glove in a realistic scenario, we performed a complex task sequence involving object grasping, transport, and task execution. The demonstration vividly showcases the seamless integration and intuitive switching between the three mapping modalities [Figure 6A]. For instance, Mapping I was employed for the delicate grasping of the cleaning tool and the fruits. Mapping II controlled the vertical positioning of the robotic arm, as seen when lowering it to engage with the tabletop or lifting it during transport. Mapping III functioned as the channel for high-level commands, guiding the arm to perform cleaning motions on the surface or directing its movement toward the location of the fruit. Figure 6B(i)-(iv) illustrates a seamless human-robot collaboration enabled by the Touch-Code Glove through its three-layered motion mapping strategies. In Figure 6B(i), the trainer performs a natural finger grasping gesture while wearing the Touch-Code Glove, triggering Mapping I to control the dexterous hand to firmly grasp a cleaning tool. In Figure 6B(ii), a downward wrist flexion gesture activates Mapping II, lowering the robotic arm joint to bring the tool into close contact with the tabletop surface. Then, in Figure 6B(iii), the trainer draws a predefined touch path on the Wrist-pad using their fingertip. This gesture is encoded via Mapping III, commanding the robotic arm to rotate its joints accordingly, thereby guiding the dexterous hand to perform a surface-cleaning motion. Finally, in Figure 6B(iv), the trainer bends the wrist upward, once again invoking Mapping II to raise the robotic arm and return it to its initial position. Figure 6B(v) displays the synchronized multimodal sensing signals collected throughout this interaction. The distinct ΔR/R0 signal patterns from the thumb, index, middle, ring, little finger, and wrist demonstrate the glove’s ability to clearly resolve each sequential action with high temporal and spatial resolution, validating the robustness and responsiveness of the system during dynamic human-robot collaboration.

Figure 6. Intuitive HRI demonstration enabled by the Touch-Code Glove. (A) Application scenario showing a human trainer wearing the Touch-Code Glove to intuitively control an embodied intelligent robot for household tasks such as room cleaning and object grasping via three mapping strategies (Mapping I-III); (B) Multimodal sensor signal outputs during a continuous HRI sequence. (i) Finger grasping (Mapping I), (ii) wrist flexion (Mapping II), (iii) touch path gesture (Mapping III), and (iv) wrist extension (Mapping II) are sequentially performed to generate distinct signal patterns for gesture classification and task coordination; (C) Sequential snapshots of robotic task execution guided by Touch-Code Glove-based gestures. HRI: Human-robot interaction.

Critically, the system supports the synergistic combination of these modalities. This is exemplified in the initial grasping action, where the user simultaneously closes the hand (Mapping I) and flexes the wrist downward (Mapping II) to grasp the orange [Figure 6C(i) and (ii)]. Later, all three mappings are engaged to lift and position the object, demonstrating seamless coordination between gesture-based intent and robotic execution [Figure 6C(iii)-(vi)]. This scenario continues with a second grasping task involving an apple [Figure 6C(vii)-(ix)], during which the trainer repeatedly employs combinations of Mapping I (for grasping and release), Mapping II (for vertical motion), and Mapping III (for spatial repositioning). This sequence validates the Touch-Code Glove’s ability to facilitate natural, real-time, multi-step robot control, reinforcing its applicability for dexterous and intuitive manipulation tasks in domestic and service-oriented robotics. The corresponding robotic manipulation video is provided in Supplementary Movie 2. To further demonstrate robustness beyond predefined task sequences, grasping tests under dynamic and unpredictable scenarios were performed, including moving objects and objects of different materials, as provided in Supplementary Movie 3. To ensure suitability for long-term real-world use, the ergonomics and comfort of the Touch-Code Glove were systematically evaluated, with detailed results provided in Supplementary Text 14 and Supplementary Figures 28 and 29, confirming comfortable wearability and providing a basis for future material optimization.

CONCLUSIONS

This research developed the Touch-Code Glove, an integrated multimodal interface providing intuitive human-to-robot motion mapping. Utilizing strain-sensitive hydrogels and triboelectric tactile films, the glove robustly captures and digitizes finger, wrist, and touch interactions. A novel triboelectric-digital encoding method efficiently translates complex tactile inputs into programmable robot commands, supported by CNN-LSTM algorithms for accurate interpretation of multi-touch gestures. Experimental demonstrations confirmed the glove’s effectiveness for real-time robotic control across three distinct modalities, greatly reducing the training barrier for robotic manipulation tasks. Future developments will focus on enhancing the durability and ergonomics of the materials, optimizing sensor fusion algorithms, and expanding the glove’s applicability to diverse robotic environments.

DECLARATIONS

Authors’ contributions

Made substantial contributions to conception and design of the study: Sun, Y.; Liu, H.

Planned and performed the experiments: Sun, Y.; Yang, R.; Zhou, Z.; Ji, T.

Wrote the control programs and algorithms: Li, D.; Lu, B.

Provided chemical materials and experimental equipment: Sun, L.

Analyzed the data and drafted the manuscript: Sun, Y.

Edited the manuscript: Sun, Y.; Liu, H.

Availability of data and materials

The authors declare that the primary data supporting the findings of this study are available within the paper and its Supplementary Materials. Additional data are available from the corresponding authors upon reasonable request.

Financial support and sponsorship

This work is supported by the National Natural Science Foundation of China (NSFC, 62401385), the Natural Science Foundation of Jiangsu Province (BK20240803), and the National Key Research and Development Program of China (2023YFB4705200).

Conflicts of interest

All authors declared that there are no conflicts of interest.

Ethical approval and consent to participate

This study complied with local regulations and ethical standards, involving only non-invasive monitoring of body movement. Without sensitive personal data, commercial interests, or participant harm, it was exempt from formal ethical review. All signal tests involving human participants were performed with the consent of the participant (an author of the study).

Consent for publication

Not applicable.

Copyright

© The Author(s) 2025.

Supplementary Materials

REFERENCES

1. Bartolozzi, C.; Indiveri, G.; Donati, E. Embodied neuromorphic intelligence. Nat. Commun. 2022, 13, 1024.

2. Zhao, Z.; Wu, Q.; Wang, J.; Zhang, B.; Zhong, C.; Zhilenkov, A. A. Exploring embodied intelligence in soft robotics: a review. Biomimetics 2024, 9, 248.

3. Mon-Williams, R.; Li, G.; Long, R.; Du, W.; Lucas, C. G. Embodied large language models enable robots to complete complex tasks in unpredictable environments. Nat. Mach. Intell. 2025, 7, 592-601.

4. Khan, A. T.; Li, S.; Cao, X. Human guided cooperative robotic agents in smart home using beetle antennae search. Sci. China. Inf. Sci. 2022, 65, 122204.

5. Eirale, A.; Martini, M.; Tagliavini, L.; Gandini, D.; Chiaberge, M.; Quaglia, G. Marvin: an innovative omni-directional robotic assistant for domestic environments. Sensors 2022, 22, 5261.

6. Gentile, C.; Lunghi, G.; Buonocore, L. R.; et al. Manipulation tasks in hazardous environments using a teleoperated robot: a case study at CERN. Sensors 2023, 23, 1979.

7. Szczurek, K. A.; Prades, R. M.; Matheson, E.; Rodriguez-Nogueira, J.; Di Castro, M. Multimodal multi-user mixed reality human–robot interface for remote operations in hazardous environments. IEEE. Access. 2023, 11, 17305-33.

8. Soori, M.; Dastres, R.; Arezoo, B.; Karimi Ghaleh Jough, F. Intelligent robotic systems in Industry 4.0: a review. J. Adv. Manuf. Sci. Technol. 2024, 4, 2024007.

9. Arents, J.; Greitans, M. Smart industrial robot control trends, challenges and opportunities within manufacturing. Appl. Sci. 2022, 12, 937.

10. Hou, C.; Gao, H.; Yang, X.; et al. A piezoresistive-based 3-axial MEMS tactile sensor and integrated surgical forceps for gastrointestinal endoscopic minimally invasive surgery. Microsyst. Nanoeng. 2024, 10, 141.

11. Hou, C.; Wang, K.; Wang, F.; et al. A highly integrated 3D MEMS force sensing module with variable sensitivity for robotic-assisted minimally invasive surgery. Adv. Funct. Mater. 2023, 33, 2302812.

12. Mengaldo, G.; Renda, F.; Brunton, S. L.; et al. A concise guide to modelling the physics of embodied intelligence in soft robotics. Nat. Rev. Phys. 2022, 4, 595-610.

13. Liu, W.; Duo, Y.; Liu, J.; et al. Touchless interactive teaching of soft robots through flexible bimodal sensory interfaces. Nat. Commun. 2022, 13, 5030.

14. Fan, H.; Liu, X.; Fuh, J. Y. H.; Lu, W. F.; Li, B. Embodied intelligence in manufacturing: leveraging large language models for autonomous industrial robotics. J. Intell. Manuf. 2025, 36, 1141-57.

15. Sun, Z.; Zhu, M.; Lee, C. Progress in the triboelectric human–machine interfaces (HMIs)-moving from smart gloves to AI/haptic enabled HMI in the 5G/IoT era. Nanoenergy. Adv. 2021, 1, 81-120.

16. Wang, T.; Zheng, P.; Li, S.; Wang, L. Multimodal human–robot interaction for human-centric smart manufacturing: a survey. Adv. Intell. Syst. 2024, 6, 2300359.

17. Chen, S.; Pang, Y.; Cao, Y.; Tan, X.; Cao, C. Soft robotic manipulation system capable of stiffness variation and dexterous operation for safe human–machine interactions. Adv. Mater. Technol. 2021, 6, 2100084.

18. Sun, T.; Yao, C.; Liu, Z.; et al. Machine learning-coupled vertical graphene triboelectric pressure sensors array as artificial tactile receptor for finger action recognition. Nano. Energy. 2024, 123, 109395.

19. Zhu, M.; Sun, Z.; Chen, T.; Lee, C. Low cost exoskeleton manipulator using bidirectional triboelectric sensors enhanced multiple degree of freedom sensory system. Nat. Commun. 2021, 12, 2692.

20. Fang, P.; Zhu, M.; Zeng, Z.; et al. A multi-module sensing and Bi-directional HMI integrating interaction, recognition, and feedback for intelligent robots. Adv. Funct. Mater. 2024, 34, 2310254.

21. Hou, C.; Geng, J.; Yang, Z.; et al. A delta-parallel-inspired human machine interface by using self-powered triboelectric nanogenerator toward 3D and VR/AR manipulations. Adv. Mater. Technol. 2021, 6, 2000912.

22. He, T.; Sun, Z.; Shi, Q.; et al. Self-powered glove-based intuitive interface for diversified control applications in real/cyber space. Nano. Energy. 2019, 58, 641-51.

23. Hang, C. Z.; Zhao, X. F.; Xi, S. Y.; et al. Highly stretchable and self-healing strain sensors for motion detection in wireless human-machine interface. Nano. Energy. 2020, 76, 105064.

24. Zhu, M.; Sun, Z.; Zhang, Z.; et al. Haptic-feedback smart glove as a creative human-machine interface (HMI) for virtual/augmented reality applications. Sci. Adv. 2020, 6, eaaz8693.

25. Yang, B.; Cheng, J.; Qu, X.; et al. Triboelectric-inertial sensing glove enhanced by charge-retained strategy for human-machine interaction. Adv. Sci. 2025, 12, e2408689.

26. Zhao, Z.; Li, W.; Li, Y.; et al. Embedding high-resolution touch across robotic hands enables adaptive human-like grasping. Nat. Mach. Intell. 2025, 7, 889-900.

27. Lee, G. H.; Lee, Y. R.; Kim, H.; et al. Rapid meniscus-guided printing of stable semi-solid-state liquid metal microgranular-particle for soft electronics. Nat. Commun. 2022, 13, 2643.

28. Yin, L.; Kim, K. N.; Lv, J.; et al. A self-sustainable wearable multi-modular E-textile bioenergy microgrid system. Nat. Commun. 2021, 12, 1542.

29. Jan, A. A.; Kim, S.; Kim, S. A skin-wearable and self-powered laminated pressure sensor based on triboelectric nanogenerator for monitoring human motion. Soft. Sci. 2024, 4, 10.

30. Zhang, W.; Qin, X.; Li, G.; et al. Self-powered triboelectric-responsive microneedles with controllable release of optogenetically engineered extracellular vesicles for intervertebral disc degeneration repair. Nat. Commun. 2024, 15, 5736.

31. Jin, G.; Sun, Y.; Geng, J.; et al. Bioinspired soft caterpillar robot with ultra-stretchable bionic sensors based on functional liquid metal. Nano. Energy. 2021, 84, 105896.

32. Sun, Z.; Zhu, M.; Shan, X.; Lee, C. Augmented tactile-perception and haptic-feedback rings as human-machine interfaces aiming for immersive interactions. Nat. Commun. 2022, 13, 5224.

33. Wu, P.; Yiu, C. K.; Huang, X.; et al. Liquid metal-based strain-sensing glove for human-machine interaction. Soft. Sci. 2023, 3, 35.

34. Li, Z.; Li, Z.; Tang, W.; et al. Crossmodal sensory neurons based on high-performance flexible memristors for human-machine in-sensor computing system. Nat. Commun. 2024, 15, 7275.

35. Liao, X.; Song, W.; Zhang, X.; et al. A bioinspired analogous nerve towards artificial intelligence. Nat. Commun. 2020, 11, 268.

36. Wang, W.; Jiang, Y.; Zhong, D.; et al. Neuromorphic sensorimotor loop embodied by monolithically integrated, low-voltage, soft e-skin. Science 2023, 380, 735-42.

37. Liu, F.; Deswal, S.; Christou, A.; Sandamirskaya, Y.; Kaboli, M.; Dahiya, R. Neuro-inspired electronic skin for robots. Sci. Robot. 2022, 7, eabl7344.

38. Niu, H.; Li, H.; Gao, S.; et al. Perception-to-cognition tactile sensing based on artificial-intelligence-motivated human full-skin bionic electronic skin. Adv. Mater. 2022, 34, e2202622.

39. Xu, J.; Sun, X.; Sun, B.; et al. Stretchable, adhesive, and bioinspired visual electronic skin with strain/temperature/pressure multimodal non-interference sensing. ACS. Appl. Mater. Interfaces. 2023, 15, 33774-83.

40. Guo, X.; Sun, Z.; Zhu, Y.; Lee, C. Zero-biased bionic fingertip E-skin with multimodal tactile perception and artificial intelligence for augmented touch awareness. Adv. Mater. 2024, 36, e2406778.

41. Li, S.; Chen, X.; Li, X.; et al. Bioinspired robot skin with mechanically gated electron channels for sliding tactile perception. Sci. Adv. 2022, 8, eade0720.

42. Li, S.; Liu, S.; Wang, L.; Zhu, R. Skin-inspired quadruple tactile sensors integrated on a robot hand enable object recognition. Sci. Robot. 2020, 5, eabc8134.

43. Chen, T.; Shi, Q.; Zhu, M.; et al. Triboelectric self-powered wearable flexible patch as 3D motion control interface for robotic manipulator. ACS. Nano. 2018, 12, 11561-71.

44. Shao, B.; Lu, M. H.; Wu, T. C.; et al. Large-area, untethered, metamorphic, and omnidirectionally stretchable multiplexing self-powered triboelectric skins. Nat. Commun. 2024, 15, 1238.

45. Xie, X.; Wang, Q.; Zhao, C.; et al. Neuromorphic computing-assisted triboelectric capacitive-coupled tactile sensor array for wireless mixed reality interaction. ACS. Nano. 2024, 18, 17041-52.

46. Sun, Y.; Chen, T.; Li, D.; et al. Stretchable, multiplexed, and bimodal sensing electronic armor for colonoscopic continuum robot enhanced by triboelectric artificial synapse. Adv. Mater. 2025, 37, e2502203.


About This Article

© The Author(s) 2025. Open Access This article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, sharing, adaptation, distribution and reproduction in any medium or format, for any purpose, even commercially, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
