A Deep-Learned E-Skin Decodes Complex Human Motion
A deep-learning powered electronic skin sensor with a single strain gauge can capture human motion from a distance. Placed on the wrist, the single strain sensor decodes complex five-finger motions in real time through a virtual 3D hand that mirrors the original motions. A deep neural network boosted by rapid situation learning (RSL) ensures stable operation regardless of the sensor's position on the surface of the skin.

Conventional approaches require dense sensor networks covering the entire curvilinear surface of the target area. Unlike conventional wafer-based fabrication, this laser fabrication provides a new sensing paradigm for motion tracking.

The research team, led by Professor Sungho Jo from the School of Computing, collaborated with Professor Seunghwan Ko from Seoul National University to design a new measuring system that extracts signals corresponding to multiple finger motions by generating cracks in metal nanoparticle films using laser technology. The sensor patch is attached to a user's wrist to detect the movement of the fingers.

The research started from the idea that monitoring a single area would be more efficient for identifying movements than affixing sensors to every joint and muscle. For this targeting strategy to work, the system must accurately capture the signals from different areas at the point where they all converge, and then decouple the information entangled in the converged signals. To maximize usability and mobility, the research team used a single-channel sensor to generate the signals corresponding to complex hand motions.

The rapid situation learning (RSL) system collects data from arbitrary parts of the wrist and automatically trains the model, demonstrated in real time with a virtual 3D hand that mirrors the original motions. To enhance the sensitivity of the sensor, the researchers used laser-induced nanoscale cracking.
This sensory system can track the motion of the entire body with a small sensor network and facilitate the indirect remote measurement of human motions, which is applicable to wearable VR/AR systems.

The research team said they focused on two tasks while developing the sensor. First, they encoded the sensor signal patterns into a latent space encapsulating temporal sensor behavior; they then mapped the latent vectors to finger motion metric spaces.

Professor Jo said, "Our system is expandable to other body parts. We already confirmed that the sensor is also capable of extracting gait motions from the pelvis. This technology is expected to provide a turning point in health monitoring, motion tracking, and soft robotics."

This study was featured in Nature Communications.

Publication: Kim, K. K., et al. (2020) "A deep-learned skin sensor decoding the epicentral human motions." Nature Communications 11, 2149. https://doi.org/10.1038/s41467-020-16040-y

Link to download the full-text paper: https://www.nature.com/articles/s41467-020-16040-y.pdf

Profile: Professor Sungho Jo
email@example.com
http://nmail.kaist.ac.kr
Neuro-Machine Augmented Intelligence Lab
School of Computing
College of Engineering
KAIST
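The two-stage decoding described above (encode the one-channel wrist signal into a latent vector, then map that vector to finger motions) can be sketched as follows. This is a minimal illustration of the pipeline's shape, not the paper's architecture: the window size, latent dimension, tanh layers, and randomly initialized weights are all assumptions standing in for the trained deep network.

```python
import numpy as np

# Hypothetical two-stage decoder mirroring the described pipeline:
# (1) encode a window of single-channel strain-sensor samples into a
#     latent vector capturing temporal behavior,
# (2) map the latent vector to five finger-angle estimates.
# Layer sizes and weights are illustrative assumptions only.

rng = np.random.default_rng(0)

WINDOW = 64   # samples per sensor window (assumed)
LATENT = 8    # latent dimension (assumed)
FINGERS = 5   # one flexion angle per finger

# Randomly initialized weights stand in for trained parameters.
W_enc = rng.standard_normal((LATENT, WINDOW)) * 0.1
W_dec = rng.standard_normal((FINGERS, LATENT)) * 0.1

def encode(signal_window: np.ndarray) -> np.ndarray:
    """Compress temporal sensor behavior into a latent vector."""
    return np.tanh(W_enc @ signal_window)

def decode(latent: np.ndarray) -> np.ndarray:
    """Map a latent vector to five finger-angle estimates (degrees)."""
    return 90.0 * np.tanh(W_dec @ latent)  # bounded to +/-90 degrees

window = rng.standard_normal(WINDOW)  # stand-in sensor window
angles = decode(encode(window))
print(angles.shape)  # (5,) -> one angle per finger
```

In a real system, the two weight matrices would be replaced by the trained deep network, and the RSL step would retrain them on signals collected from wherever the patch happens to sit on the wrist.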
Wearable Strain Sensor Using Light Transmittance Helps Measure Physical Signals Better
KAIST researchers have developed a novel wearable strain sensor based on the modulation of the optical transmittance of a carbon nanotube (CNT)-embedded elastomer. The sensor is capable of sensitive, stable, and continuous measurement of physical signals. This technology, featured in the March 4th issue of ACS Applied Materials & Interfaces as a front cover article, shows great potential for the detection of subtle human motions and the real-time monitoring of body postures for healthcare applications.

A wearable strain sensor must offer high sensitivity, flexibility, and stretchability at low cost. Those used for health monitoring should also deliver solid long-term performance and be environmentally stable. Various stretchable strain sensors based on piezo-resistive and capacitive principles have been developed to meet these requirements.

Conventional piezo-resistive strain sensors using functional nanomaterials, with CNTs as the most common example, have shown high sensitivity and great sensing performance. However, they suffer from poor long-term stability and linearity, as well as considerable signal hysteresis. Piezo-capacitive strain sensors, offering better stability, lower hysteresis, and higher stretchability, have been suggested as an alternative. But because they exhibit limited sensitivity and strong electromagnetic interference from conductive objects in the surrounding environment, these conventional stretchable strain sensors still face unresolved limitations.

A KAIST research team led by Professor Inkyu Park from the Department of Mechanical Engineering suggested that an optical-type stretchable strain sensor could be a good alternative, because it has high stability and is less affected by environmental disturbances.
The team then introduced an optical wearable strain sensor based on the light transmittance changes of a CNT-embedded elastomer, which addresses the low sensitivity of conventional optical stretchable strain sensors.

To achieve a large dynamic range, Professor Park and his researchers chose Ecoflex as the elastomeric substrate for its mechanical durability, flexibility, and attachability to human skin; the resulting optical wearable strain sensor shows a wide dynamic range of 0 to 400%. The researchers also propagated microcracks under tensile strain within a film of multi-walled CNTs embedded in the Ecoflex substrate, changing the optical transmittance of the film. This enabled a wearable strain sensor with a sensitivity 10 times higher than conventional optical stretchable strain sensors.

The proposed sensor also passed a durability test with excellent results: its response after 13,000 cycles of loading remained stable, without any noticeable drift. This suggests that the sensor can be used repeatedly over a long time and in various environmental conditions without degradation.

Using the developed sensor, the research team measured finger bending motion and used it for robot control. They also developed a three-axis sensor array for body posture monitoring. The sensor was able to monitor human motions with small strains, such as the pulse near the carotid artery and muscle movement around the mouth during pronunciation.

Professor Park said, "In this study, our group developed a new wearable strain sensor platform that overcomes many limitations of previously developed resistive, capacitive, and optical-type stretchable strain sensors.
Our sensor could be widely used in a variety of fields including soft robotics, wearable electronics, electronic skin, healthcare, and even entertainment."

This work was supported by the National Research Foundation (NRF) of Korea.

Publication: Jimin Gu, Donguk Kwon, Junseong Ahn, and Inkyu Park. (2020) "Wearable Strain Sensors Using Light Transmittance Change of Carbon Nanotube-Embedded Elastomers with Microcracks." ACS Applied Materials & Interfaces, Volume 12, Issue 9. Available online at https://doi.org/10.1021/acsami.9b18069

Profile: Professor Inkyu Park
firstname.lastname@example.org
http://mintlab1.kaist.ac.kr
Micro/Nano Transducers Laboratory (MINT Lab)
Department of Mechanical Engineering (ME)
Korea Advanced Institute of Science and Technology (KAIST)

Profile: Jimin Gu, Ph.D. Candidate
email@example.com
http://mintlab1.kaist.ac.kr
MINT Lab
Department of Mechanical Engineering, KAIST
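The sensing principle described in this article (tensile strain opens microcracks in the CNT film, changing how much light passes through, so strain can be read back from a transmittance measurement) can be sketched with a simple calibration model. The exponential form and its coefficients below are assumptions for illustration, not values from the paper; a real device would use an experimentally measured calibration curve.

```python
import numpy as np

# Illustrative model of the sensing principle: strain opens microcracks
# in a CNT-embedded elastomer, letting more light through, and strain is
# recovered by inverting the calibration curve. The exponential form,
# the constant K, and the rest transmittance t0 are assumed values.

K = 0.35   # assumed calibration constant (per unit strain)

def transmittance(strain: float, t0: float = 0.2) -> float:
    """Model: optical transmittance grows as cracks open with strain."""
    return min(1.0, t0 * np.exp(K * strain))

def strain_from_transmittance(t: float, t0: float = 0.2) -> float:
    """Invert the calibration curve to recover strain from a reading."""
    return np.log(t / t0) / K

eps = 1.5                      # 150% strain, inside the reported 0-400% range
t = transmittance(eps)         # simulated sensor reading
print(round(strain_from_transmittance(t), 3))  # recovers ~1.5
```

The round trip (strain to transmittance and back) is exact in this toy model; in practice, hysteresis and noise would make the inversion approximate, which is why the article emphasizes the sensor's stability over 13,000 loading cycles.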
KAIST, 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Republic of Korea
Copyright (C) 2020, Korea Advanced Institute of Science and Technology. All Rights Reserved.