Electrical Engineering
KAIST Secures Core Technology for Ultra-High-Resolution Image Sensors
A joint research team from Korea and the United States has developed next-generation, high-resolution image sensor technology with higher power efficiency and a smaller size than existing sensors. Notably, the team has secured foundational technology for ultra-high-resolution shortwave infrared (SWIR) image sensors, an area currently dominated by Sony, paving the way for future market entry.

KAIST (represented by President Kwang Hyung Lee) announced on the 20th of November that a research team led by Professor Sanghyeon Kim from the School of Electrical Engineering, in collaboration with Inha University and Yale University in the U.S., has developed an ultra-thin broadband photodiode (PD), marking a significant breakthrough in high-performance image sensor technology.

This research drastically improves the trade-off between absorption layer thickness and quantum efficiency found in conventional photodiode technology. Specifically, it achieved a quantum efficiency of over 70% even with an absorption layer thinner than one micrometer (μm), reducing the layer's thickness by approximately 70% compared to existing technologies. A thinner absorption layer simplifies pixel processing, allowing for higher resolution, and supports smoother carrier diffusion, which is advantageous for photocarrier collection while also reducing cost. However, a fundamental problem with thinner absorption layers is reduced absorption of long-wavelength light.

< Figure 1. Schematic diagram of the InGaAs photodiode image sensor integrated on the guided-mode resonance (GMR) structure proposed in this study (left), a photograph of the fabricated wafer, and a scanning electron microscope (SEM) image of the periodic patterns (right) >

The research team introduced a guided-mode resonance (GMR) structure* that enables high-efficiency light absorption across a wide spectral range from 400 nanometers (nm) to 1,700 nm. This wavelength range covers not only visible light but also the SWIR region, making it valuable for various industrial applications.

*Guided-Mode Resonance (GMR) Structure: A concept from electromagnetics describing the phenomenon in which a wave, such as light, resonates at a specific wavelength, forming a strong electric/magnetic field. Since energy is maximized under these conditions, the effect has long been used to increase antenna or radar efficiency.

The improved performance in the SWIR region is expected to play a significant role in developing next-generation image sensors with ever-higher resolutions. The GMR structure, in particular, holds potential for further enhancing resolution and other performance metrics through hybrid integration and monolithic 3D integration with complementary metal-oxide-semiconductor (CMOS)-based readout integrated circuits (ROICs).

< Figure 2. Benchmark of state-of-the-art InGaAs-based SWIR pixels with simulated EQE curves as a function of total absorption layer (TAL) thickness. Performance is maintained while the absorption layer thickness is cut by 50% to 70%, from 2.1 micrometers or more to 1 micrometer or less. >

The research team has significantly enhanced international competitiveness in low-power devices and ultra-high-resolution imaging technology, opening up applications in digital cameras, security systems, medical and industrial image sensors, as well as future ultra-high-resolution sensors for autonomous driving, aerospace, and satellite observation.
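As a rough illustration of the thickness-efficiency trade-off described above, the single-pass absorption of a photodiode layer follows the Beer-Lambert relation A = 1 - exp(-α·d). The sketch below uses an assumed, textbook-order absorption coefficient for InGaAs near 1,550 nm; it is not data from the paper, but it shows why a sub-micrometer layer struggles to reach 70% efficiency without a light-trapping structure such as a GMR grating.

```python
import numpy as np

# Illustrative only: single-pass Beer-Lambert absorption, A = 1 - exp(-alpha * d).
# The absorption coefficient below is a rough textbook-order value for InGaAs
# near 1550 nm; it is an assumption, not a number from the paper.
alpha = 0.7e4            # absorption coefficient in cm^-1 (assumed)
alpha_um = alpha * 1e-4  # convert to um^-1

for d_um in [0.5, 1.0, 2.1]:  # absorption layer thicknesses in micrometers
    absorption = 1 - np.exp(-alpha_um * d_um)
    print(f"{d_um:4.1f} um layer -> single-pass absorption ~ {absorption:.0%}")
```

In this toy model a 1 μm layer absorbs only about half of the incident light in a single pass, which is why resonant structures that recirculate light inside the layer are needed to push quantum efficiency above 70%.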
Professor Sanghyeon Kim, the lead researcher, commented, “This research demonstrates that significantly higher performance than existing technologies can be achieved even with ultra-thin absorption layers.”

< Figure 3. Top optical microscope image and cross-sectional scanning electron microscope image of the InGaAs photodiode image sensor fabricated on the GMR structure (left). Improved quantum efficiency of the ultra-thin image sensor (red) fabricated with the technology proposed in this study (right) >

The results of this research were published on the 15th of November in the prestigious international journal Light: Science & Applications (JCR top 2.9%, IF=20.6), with Professor Dae-Myung Geum of Inha University (formerly a KAIST postdoctoral researcher) and Dr. Jinha Lim (currently a postdoctoral researcher at Yale University) as co-first authors. (Paper title: “Highly-efficient (>70%) and Wide-spectral (400 nm -1700 nm) sub-micron-thick InGaAs photodiodes for future high-resolution image sensors”) This study was supported by the National Research Foundation of Korea.
2024.11.22
KAIST Researchers Introduce a New and Improved Next-Generation Perovskite Solar Cell
- KAIST-Yonsei University researchers developed an innovative dipole technology to maximize near-infrared photon harvesting efficiency
- Overcomes the shortcoming of existing perovskite solar cells, which cannot utilize approximately 52% of total solar energy
- Next-generation solar cell technology with high efficiency and high stability that absorbs near-infrared light beyond the visible range using a perovskite-dipole-organic semiconductor hybrid structure

< Photo. (From left) Professor Jung-Yong Lee, Ph.D. candidate Min-Ho Lee, and Master's candidate Min Seok Kim of the School of Electrical Engineering >

A Korean research team has developed an innovative technology that overcomes the inability of existing perovskite solar cells to utilize approximately 52% of total solar energy, maximizing near-infrared light capture while greatly improving power conversion efficiency. The work significantly raises the prospects for commercializing next-generation solar cells and is expected to contribute important technological advances to the global solar cell market.

The research team of Professor Jung-Yong Lee of the School of Electrical Engineering at KAIST (President Kwang-Hyung Lee) and Professor Woojae Kim of the Department of Chemistry at Yonsei University announced on October 31st that they had developed a high-efficiency, high-stability organic-inorganic hybrid solar cell fabrication technology that maximizes near-infrared light capture beyond the visible range.

The team proposed and refined a next-generation hybrid device structure in which organic photo-semiconductors complement perovskite materials, which are limited to visible light absorption, extending the absorption range into the near-infrared. They also identified the electronic structure problem that typically arises in this structure and demonstrated a high-performance solar cell device that dramatically resolves it by introducing a dipole layer*.

*Dipole layer: A thin material layer that controls the energy levels within a device to facilitate charge transport and forms an interfacial potential difference that improves device performance.

Existing lead-based perovskite solar cells have absorption spectra limited to the visible region, at wavelengths of 850 nanometers (nm) or less, which prevents them from utilizing approximately 52% of the total solar energy. To solve this problem, the research team designed a hybrid device that combines an organic bulk heterojunction (BHJ) with perovskite and implemented a solar cell that absorbs light up to the near-infrared region. In particular, by introducing a sub-nanometer dipole interfacial layer, they alleviated the energy barrier between the perovskite and the organic BHJ, suppressed charge accumulation, maximized the near-infrared contribution, and improved the current density (JSC) by 4.9 mA/cm².

The key achievement of this study is that the power conversion efficiency (PCE) of the hybrid device increased significantly, from 20.4% to 24.0%. The study also achieved a high internal quantum efficiency (IQE) compared with previous work, reaching 78% in the near-infrared region.

< Figure. Illustration of the mechanism by which the perovskite/organic hybrid device structure and dipole interfacial layers (DILs) improve the electronic structure and charge transfer capability.
The proposed dipole interfacial layer forms a strong interfacial dipole, effectively reducing the energy barrier between the perovskite and the organic bulk heterojunction (BHJ) and suppressing hole accumulation. This improves near-infrared photon harvesting and charge transfer, raising the solar cell's power conversion efficiency to 24.0%. The device also achieves excellent stability, maintaining its performance for 1,200 hours even in an extremely humid environment. >

In addition, the device showed high stability, maintaining more than 80% of its initial efficiency through more than 800 hours of maximum-power-point tracking even under extreme humidity conditions.

Professor Jung-Yong Lee said, “Through this study, we have effectively solved the charge accumulation and energy band mismatch problems faced by existing perovskite/organic hybrid solar cells. Maximizing near-infrared light capture while significantly improving power conversion efficiency will be a new breakthrough that resolves the mechanical and chemical stability problems of existing perovskites and overcomes their optical limitations.”

This study, in which KAIST School of Electrical Engineering Ph.D. candidate Min-Ho Lee and Master's candidate Min Seok Kim participated as co-first authors, was published in the September 30th online edition of the international academic journal Advanced Materials. (Paper title: “Suppressing Hole Accumulation Through Sub-Nanometer Dipole Interfaces in Hybrid Perovskite/Organic Solar Cells for Boosting Near-Infrared Photon Harvesting”) This study was conducted with the support of the National Research Foundation of Korea.
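As a back-of-the-envelope illustration of how an added near-infrared current contribution moves the efficiency, the sketch below evaluates the standard relation PCE = JSC × VOC × FF / Pin. The open-circuit voltage and fill factor are assumed, generic perovskite-like values, not figures from the paper.

```python
# Illustrative arithmetic only: how an added near-infrared current contribution
# could move power conversion efficiency (PCE). Voc and FF below are assumed,
# generic perovskite-like values, not figures from the paper.
P_in = 100.0   # AM1.5G input power, mW/cm^2

def pce(jsc_mA_cm2, voc_V, ff):
    """PCE (%) = Jsc * Voc * FF / P_in * 100."""
    return jsc_mA_cm2 * voc_V * ff / P_in * 100

jsc_base = 24.0   # assumed visible-only short-circuit current, mA/cm^2
jsc_nir  = 4.9    # near-infrared contribution reported in the article, mA/cm^2
voc, ff  = 0.85, 0.80  # assumed open-circuit voltage (V) and fill factor

print(f"baseline PCE : {pce(jsc_base, voc, ff):.1f} %")
print(f"with NIR gain: {pce(jsc_base + jsc_nir, voc, ff):.1f} %")
```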
2024.10.31
KAIST Proposes AI Training Method that will Drastically Shorten Time for Complex Quantum Mechanical Calculations
- Professor Yong-Hoon Kim's team from the School of Electrical Engineering succeeded for the first time in accelerating quantum mechanical electronic structure calculations using a convolutional neural network (CNN) model
- Presenting an AI learning principle for quantum mechanical 3D chemical bonding information, the work is expected to accelerate the computer-assisted design of next-generation materials and devices

The close relationship between AI and high-performance scientific computing can be seen in the fact that both the 2024 Nobel Prizes in Physics and Chemistry were awarded to scientists for AI-related research contributions in their respective fields. KAIST researchers have succeeded in dramatically reducing the computation time of highly sophisticated quantum mechanical computer simulations by predicting atomic-level chemical bonding information distributed in 3D space using a novel AI approach.

KAIST (President Kwang-Hyung Lee) announced on the 30th of October that Professor Yong-Hoon Kim's team from the School of Electrical Engineering developed a 3D computer vision artificial neural network-based computation methodology that bypasses the complex algorithms traditionally required for atomic-level quantum mechanical calculations performed on supercomputers to derive the properties of materials.

< Figure 1. Various methodologies are utilized in the simulation of materials and devices, such as quantum mechanical calculations at the nanometer (nm) scale, classical mechanical force fields at the scale of tens to hundreds of nanometers, continuum dynamics calculations at the macroscopic scale, and mixed calculations that combine simulations at different scales. These simulations already play a key role in a wide range of basic research and application development fields in combination with informatics techniques. Recently, there have been active efforts to introduce machine learning techniques to radically accelerate simulations, but research on applying machine learning to the quantum mechanical electronic structure calculations that underpin larger-scale simulations remains insufficient. >

Quantum mechanical density functional theory (DFT)* calculations on supercomputers have become an essential, standard tool across a wide range of research and development fields, including advanced materials and drug design, as they allow fast and accurate prediction of material properties.

*Density functional theory (DFT): A representative theory of ab initio (first-principles) calculations, which compute quantum mechanical properties from the atomic level up.

However, practical DFT calculations require generating a 3D electron density and solving the quantum mechanical equations through a complex self-consistent field (SCF)* process that must be repeated tens to hundreds of times. This restricts their application to systems of only a few hundred to a few thousand atoms.

*Self-consistent field (SCF): A scientific computing method widely used to solve complex many-body problems that must be described by a set of interconnected simultaneous differential equations.

Professor Yong-Hoon Kim's research team asked whether recent advances in AI techniques could be used to bypass the SCF process. The result is the DeepSCF model, which accelerates calculations by learning chemical bonding information distributed in 3D space using neural network algorithms from the field of computer vision.

< Figure 2.
The DeepSCF methodology developed in this study rapidly accelerates DFT calculations by replacing the self-consistent field process (orange box), which traditional quantum mechanical electronic structure calculations must perform repeatedly, with an artificial neural network technique (green box). The self-consistent field process predicts the 3D electron density, constructs the corresponding potential, and then solves the quantum mechanical Kohn-Sham equations, repeating tens to hundreds of times. The core idea of the DeepSCF methodology is that the residual electron density (δρ), the difference between the electron density (ρ) and the sum of the electron densities of the constituent atoms (ρ0), encodes the chemical bonding information, so the self-consistent field process can be replaced with a 3D convolutional neural network model. >

The research team focused on the fact that, according to density functional theory, the electron density contains all quantum mechanical information about the electrons, and that the residual electron density, the difference between the total electron density and the sum of the electron densities of the constituent atoms, contains the chemical bonding information. They used this residual density as the target for machine learning. They then adopted a dataset of organic molecules with various chemical bonding characteristics and applied random rotations and deformations to the molecules' atomic structures to further enhance the model's accuracy and generalization capability. Ultimately, the team demonstrated the validity and efficiency of the DeepSCF methodology on large, complex systems.

< Figure 3. An example of applying the DeepSCF methodology to a carbon nanotube-based DNA sequence analysis device model (top left). In addition to classical mechanical interatomic forces (bottom right), the residual electron density (top right) and quantum mechanical electronic structure properties such as the electronic density of states (DOS) (bottom left), which carry chemical bonding information, are rapidly predicted with an accuracy matching standard DFT calculations that perform the SCF process. >

Professor Yong-Hoon Kim, who supervised the research, explained that his team had found a way to map quantum mechanical chemical bonding information in 3D space onto artificial neural networks. He noted, “Since quantum mechanical electronic structure calculations underpin materials simulations across all scales, this research establishes a foundational principle for accelerating materials calculations using artificial intelligence.”

Ryong-Gyu Lee, a PhD candidate in the School of Electrical Engineering, served as the first author of the research, which was published online on October 24 in npj Computational Materials, a prestigious journal in the field of materials computation. (Paper title: “Convolutional network learning of self-consistent electron density via grid-projected atomic fingerprints”) This research was conducted with support from the KAIST High-Risk Research Program for Graduate Students and the National Research Foundation of Korea's Mid-career Researcher Support Program.
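A minimal sketch of the core idea, under assumptions: a 3D convolutional network maps grid-projected atomic fingerprints to the residual electron density δρ = ρ - ρ0. The channel counts, depth, and grid size below are illustrative placeholders, not the architecture used in the paper.

```python
import torch
import torch.nn as nn

# Minimal sketch of the DeepSCF idea described above: learn the residual
# electron density (delta_rho = rho - rho_0) on a 3D grid from grid-projected
# atomic fingerprints. Channel counts, depth, and grid size are assumptions
# for illustration, not the architecture used in the paper.
class ResidualDensityCNN(nn.Module):
    def __init__(self, in_channels=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(32, 1, kernel_size=3, padding=1),  # 1 output channel: delta_rho
        )

    def forward(self, fingerprints):  # shape: (batch, channels, nx, ny, nz)
        return self.net(fingerprints)

model = ResidualDensityCNN()
fingerprints = torch.randn(2, 8, 32, 32, 32)  # dummy atomic-fingerprint grids
delta_rho = model(fingerprints)               # predicted residual density
# Total density for the downstream DFT step: rho = rho_0 + delta_rho,
# where rho_0 is the sum of free-atom densities (computed elsewhere).
print(delta_rho.shape)  # torch.Size([2, 1, 32, 32, 32])
```

The appeal of this formulation is that the network only has to learn the bonding-induced correction on top of the cheap free-atom density, rather than the full electron density itself.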
2024.10.30
KAIST Develops Technology for the Precise Diagnosis of Electric Vehicle Batteries Using Small Currents
Accurately diagnosing the state of electric vehicle (EV) batteries is essential for their efficient management and safe use. KAIST researchers have developed a new technology that can diagnose and monitor the state of batteries with high precision using only small amounts of current, which is expected to maximize batteries' long-term stability and efficiency.

KAIST (represented by President Kwang Hyung Lee) announced on the 17th of October that a research team led by Professors Kyeongha Kwon and Sang-Gug Lee from the School of Electrical Engineering had developed electrochemical impedance spectroscopy (EIS) technology that can be used to improve the stability and performance of high-capacity batteries in electric vehicles.

EIS is a powerful tool that measures the magnitude and variation of a battery's impedance*, allowing evaluation of battery efficiency and losses. It is considered an important tool for assessing the state of charge (SOC) and state of health (SOH) of batteries. It can also be used to identify thermal characteristics and chemical/physical changes, predict battery life, and determine the causes of failures.

*Battery impedance: A measure of the resistance to current flow within the battery, used to assess battery performance and condition.

However, traditional EIS equipment is expensive and complex, making it difficult to install, operate, and maintain. Moreover, due to sensitivity and precision limitations, applying current disturbances of several amperes (A) to a battery can cause significant electrical stress, increasing the risk of battery failure or fire and making such equipment difficult to use in practice.

< Figure 1. Flow chart for the diagnosis and prevention of unexpected combustion using electrochemical impedance spectroscopy (EIS) on electric vehicle batteries >

To address this, the KAIST research team developed and validated a low-current EIS system for diagnosing the condition and health of high-capacity EV batteries. The system precisely measures battery impedance with low current disturbances (10 mA), minimizing thermal effects and safety issues during measurement. It also minimizes bulky and costly components, making it easy to integrate into vehicles. The system proved effective in identifying the electrochemical properties of batteries under various operating conditions, including different temperatures and SOC levels.

Professor Kyeongha Kwon (the corresponding author) explained, “This system can be easily integrated into the battery management system (BMS) of electric vehicles and has demonstrated high measurement accuracy while significantly reducing cost and complexity compared to traditional high-current EIS methods. It can contribute to battery diagnosis and performance improvements not only for electric vehicles but also for energy storage systems (ESS).”

This research, in which Young-Nam Lee, a doctoral student in the School of Electrical Engineering at KAIST, participated as the first author, was published in the prestigious international journal IEEE Transactions on Industrial Electronics (top 2% in the field; IF 7.5) on September 5th. (Paper title: “Small-Perturbation Electrochemical Impedance Spectroscopy System With High Accuracy for High-Capacity Batteries in Electric Vehicles”, Link: https://ieeexplore.ieee.org/document/10666864)

< Figure 2. Impedance measurement results of high-capacity batteries for electric vehicles.
ZEW (commercial EIS equipment; MP10, Wonatech) versus ZMEAS (proposed system) >

This research was supported by the Basic Research Program of the National Research Foundation of Korea, the Next-Generation Intelligent Semiconductor Technology Development Program of the Korea Evaluation Institute of Industrial Technology, and the AI Semiconductor Graduate Program of the Institute of Information & Communications Technology Planning & Evaluation.
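A minimal sketch of the measurement principle, assuming a single-frequency lock-in scheme: a small sinusoidal current (10 mA, as in the article) is injected and the complex impedance Z(f) = V(f)/I(f) is recovered by correlating both signals with the excitation. The battery response below is synthetic.

```python
import numpy as np

# Minimal sketch of single-frequency EIS: inject a small sinusoidal current
# perturbation and recover the complex impedance Z(f) = V(f) / I(f) by
# correlating with the excitation. The battery response here is synthetic.
fs, f0, T = 10_000, 10.0, 2.0             # sample rate (Hz), test freq (Hz), duration (s)
t = np.arange(0, T, 1 / fs)

i_amp = 0.010                             # 10 mA perturbation amplitude
i_t = i_amp * np.sin(2 * np.pi * f0 * t)  # injected current

# Synthetic battery response: |Z| = 50 mOhm with a -15 degree phase shift.
z_true = 0.050 * np.exp(-1j * np.deg2rad(15))
v_t = i_amp * abs(z_true) * np.sin(2 * np.pi * f0 * t + np.angle(z_true))
v_t += 1e-5 * np.random.randn(t.size)     # measurement noise

# Lock-in style demodulation at f0 (T spans an integer number of periods).
ref = np.exp(-1j * 2 * np.pi * f0 * t)
V = 2 * np.mean(v_t * ref)                # complex voltage phasor
I = 2 * np.mean(i_t * ref)                # complex current phasor
Z = V / I
print(f"|Z| = {abs(Z)*1e3:.1f} mOhm, phase = {np.degrees(np.angle(Z)):.1f} deg")
```

Sweeping f0 over a range of frequencies and repeating this demodulation yields the impedance spectrum from which SOC- and SOH-related features are extracted.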
2024.10.17
KAIST Researchers Develop a Novel Ultra-Low Power Memory for Neuromorphic Computing
A team of Korean researchers has developed a new memory device that can replace existing memory or implement neuromorphic computing for next-generation artificial intelligence hardware, thanks to its low processing cost and ultra-low power consumption.

KAIST (President Kwang-Hyung Lee) announced on April 4th that Professor Shinhyun Choi's research team in the School of Electrical Engineering has developed a next-generation phase change memory* device featuring ultra-low power consumption that can replace DRAM and NAND flash memory.

☞ Phase change memory: A memory device that stores and/or processes information by using heat to switch the crystalline state of a material between amorphous and crystalline, thereby changing its resistance.

Existing phase change memory suffers from an expensive fabrication process for making highly scaled devices and from the substantial power required for operation. To solve these problems, Professor Choi's research team developed an ultra-low power phase change memory device by electrically forming a very small, nanometer (nm)-scale phase-changeable filament without expensive fabrication processes. This development has the groundbreaking advantage of not only a very low processing cost but also ultra-low power operation.

DRAM, one of the most widely used memories, is very fast but volatile: data disappears when the power is turned off. NAND flash memory, a storage device, has relatively slow read/write speeds but is non-volatile, preserving data even when the power is cut off. Phase change memory combines the advantages of both DRAM and NAND flash memory, offering high speed and non-volatile operation. For this reason, phase change memory is highlighted as a next-generation memory that could replace existing memories and is actively researched both as a memory technology and as a neuromorphic computing technology that mimics the human brain.

However, conventional phase change memory devices require substantial power to operate, making it difficult to build practical large-capacity memory products or realize neuromorphic computing systems. To maximize the thermal efficiency of device operation, previous research focused on reducing power consumption by shrinking the physical size of the device using state-of-the-art lithography, but these efforts faced practical limits: the improvement in power consumption was minimal, while the cost and difficulty of fabrication increased with each step.

To solve the power consumption problem of phase change memory, Professor Shinhyun Choi's research team devised a method to electrically form the phase change material in an extremely small area, successfully implementing an ultra-low-power phase change memory device that consumes 15 times less power than a conventional device fabricated with expensive lithography tools.

< Figure 1. Illustrations of the ultra-low power phase change memory device developed in this study, and a comparison of the power consumption of the newly developed device with that of conventional phase change memory devices. >
Professor Shinhyun Choi expressed strong confidence about how this research will unfold in this new field, saying, "The phase change memory device we have developed is significant as it offers a novel approach to solving the lingering problems of producing a memory device, with greatly improved manufacturing cost and energy efficiency. We expect the results of our study to become a foundation of future electronics engineering, enabling various applications including high-density three-dimensional vertical memory and neuromorphic computing systems, as it opens up the possibility of choosing from a variety of materials." He added, "I would like to thank the National Research Foundation of Korea and the National NanoFab Center for supporting this research."

This study, in which See-On Park, a student in the MS-PhD integrated program, and Seokman Hong, a doctoral student of the School of Electrical Engineering at KAIST, participated as first authors, was published on April 4 in the April issue of the renowned international academic journal Nature. (Paper title: “Phase-Change Memory via a Phase-Changeable Self-Confined Nano-Filament”)

This research was conducted with support from the Next-Generation Intelligent Semiconductor Technology Development Project, the PIM AI Semiconductor Core Technology Development (Device) Project, and the Excellent Emerging Research Program of the National Research Foundation of Korea, as well as the Semiconductor Process-based Nanomedical Devices Development Project of the National NanoFab Center.
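A toy calculation, with assumed pulse parameters that are not from the paper, showing how a 15-fold power reduction translates into programming energy per RESET pulse:

```python
# Illustrative arithmetic only: why confining the phase change to a
# nanometer-scale filament cuts programming energy. The pulse parameters
# below are assumed, generic values, not measurements from the paper.
def reset_energy(i_amp_A, v_V, t_pulse_s):
    """Energy of one RESET pulse, E = I * V * t."""
    return i_amp_A * v_V * t_pulse_s

conventional = reset_energy(200e-6, 3.0, 50e-9)  # assumed conventional cell
confined     = conventional / 15                 # ~15x lower power reported

print(f"conventional RESET energy  ~ {conventional*1e12:.0f} pJ")
print(f"nano-filament RESET energy ~ {confined*1e12:.1f} pJ")
```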
2024.04.04
KAIST Develops Healthcare Device Tracking Chronic Diabetic Wounds
A KAIST research team has developed an effective wireless system that monitors the wound healing process by tracking the spatiotemporal temperature changes and heat transfer characteristics of damaged areas such as diabetic wounds.

On the 5th of March, KAIST (represented by President Kwang Hyung Lee) announced that a research team led by Professor Kyeongha Kwon from KAIST's School of Electrical Engineering, in collaboration with Professor Hanjun Ryu of Chung-Ang University, developed digital healthcare technology that tracks the wound healing process in real time, allowing appropriate treatments to be administered.

< Figure 1. Schematic illustrations and diagrams of real-time wound monitoring systems >

The skin serves as a barrier protecting the body from harmful substances, so damage to the skin can pose severe health risks to patients in need of intensive care. Diabetic patients in particular easily develop chronic wounds due to complications in normal blood circulation and the wound healing process. In the United States alone, hundreds of billions of dollars in medical costs stem from regenerating skin from such wounds. While various methods exist to promote wound healing, personalized management is essential and depends on the condition of each patient's wounds.

Accordingly, the research team tracked the heating response within the wound by utilizing the temperature difference between the damaged area and the surrounding healthy skin. They then measured heat transfer characteristics to observe moisture changes near the skin surface, ultimately establishing a basis for understanding how scar tissue forms. The team conducted experiments on diabetic mouse models, which exhibit delayed wound healing under pathological conditions, and demonstrated that the collected data accurately tracks the wound healing process and the formation of scar tissue.

To minimize the tissue damage that can occur when a tracking device is removed after healing, the system integrates biodegradable sensor modules capable of natural decomposition within the body. These biodegradable modules disintegrate in the body after use, reducing the risk of additional discomfort or tissue damage upon device removal. Furthermore, because no removal is needed, the device could one day be used for monitoring inside the wound area itself.

Professor Kyeongha Kwon, who led the research, anticipates that continuous monitoring of wound temperature and heat transfer characteristics will enable medical professionals to assess the status of diabetic patients' wounds more accurately and provide appropriate treatment. He further predicted that biodegradable sensors, which decompose safely after the wound heals without needing removal, will make live monitoring possible not only in hospitals but also at home.

The research team plans to integrate antimicrobial materials into the device, aiming to expand its capabilities toward observing and preventing inflammatory responses, bacterial infections, and other complications. The goal is a multi-purpose wound monitoring platform capable of real-time antimicrobial monitoring in hospitals or homes by detecting the changes in temperature and heat transfer characteristics that indicate infection.

< Image 1.
Image of the bioresorbable temperature sensor >

The results of this study were published on February 19th in the international journal Advanced Healthcare Materials and selected as the inside back cover article, under the title "Materials and Device Designs for Wireless Monitoring of Temperature and Thermal Transport Properties of Wound Beds during Healing." This research was conducted with support from the Basic Research Program, the Regional Innovation Center Program, and the BK21 Program.
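A minimal sketch of the monitoring principle, on synthetic data: compare the wound-site temperature against adjacent healthy skin and flag when the difference decays toward baseline. The thresholds and the trace below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Minimal sketch of the monitoring idea: compare wound-site temperature with
# adjacent healthy skin over time. The synthetic trace assumes an elevated
# wound temperature that converges to baseline as healing progresses; the
# 0.5 C threshold is an illustrative assumption, not a value from the study.
days = np.arange(0, 15)
t_healthy = 33.0 + 0.1 * np.random.randn(days.size)   # deg C, reference skin
t_wound = t_healthy + 1.5 * np.exp(-days / 5.0)       # elevated early, then converging

delta_t = t_wound - t_healthy
for d, dt in zip(days, delta_t):
    status = "inflamed/healing" if dt > 0.5 else "near baseline"
    print(f"day {d:2d}: wound - healthy = {dt:+.2f} C -> {status}")
```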
2024.03.11
The World’s First Hacking-preventing Cryptographic Semiconductor Chip
With the dramatic increase in the amount of information exchanged between components and devices in the 5G/6G era, for applications such as the Internet of Things (IoT) and autonomous driving, hacking attacks are becoming more sophisticated. Consequently, enhanced security functions are essential for safely transmitting data between devices.

On February 29th, a KAIST research team led by Professors Yang-gyu Choi and Seung-tak Ryu from the School of Electrical Engineering announced the successful development of the world's first security cryptographic semiconductor. The team has developed the Cryptoristor, a cryptographic transistor based on FinFET technology and produced through a 100% silicon-compatible process, for the first time in the world. The Cryptoristor is a random number generator (RNG) with unparalleled characteristics, featuring a unique structure comprising a single transistor and a distinctive operating mechanism.

In all security environments, including artificial intelligence, the most crucial element is the RNG. In the most commonly used security chip, the Advanced Encryption Standard (AES), the RNG is a core component, occupying approximately 75% of the total chip area and more than 85% of its energy consumption. Hence, there is an urgent need for low-power, ultra-small RNGs suitable for mobile and IoT devices. Existing RNGs have limitations: they lack compatibility with silicon CMOS processes, and circuit-based RNGs occupy a large area. In contrast, the team's newly developed Cryptoristor, a cryptographic semiconductor based on a single-component structure, consumes less than one-thousandth of the power and occupies less than one-thousandth of the area of the chips currently in use.

Utilizing the inherent randomness of FinFETs fabricated on a silicon-on-insulator (SOI) substrate, which has an insulating layer formed beneath the silicon, the team developed an RNG that produces zeroes and ones unpredictably.

< Figure 1. Conceptual diagram of the security cryptographic transistor device >

Generally speaking, preventing hackers from predicting the encryption algorithm during data exchanges on mobile devices is pivotal. The method therefore ensures unpredictability by generating random sequences of zeroes and ones that change every time. Moreover, while the Cryptoristor-based RNG research is the world's first of its kind, with no international implementation cases, the device shares the same transistor structure as existing logic and memory components. This enables full production through rapid mass production processes using existing semiconductor facilities at low cost.

Seung-il Kim, a PhD student who led the research, explained the significance of the study: "As a cryptographic semiconductor, the ultra-small, low-power random number generator enhances security through its distinctive unpredictability, supporting safe hyperconnectivity with secure transmissions between chips and devices. Compared to previous research, it offers particular advantages in energy consumption, integration density, and cost, making it well suited to IoT device environments."

This research, with master's student Hyung-jin Yoo as co-author, was published in the online edition of Science Advances, a sister journal of Science, in February 2024. (Paper title: “Cryptographic transistor for true random number generator with low power consumption”)
This research received support from the Next-Generation Intelligent Semiconductor Technology Development Project and the Core Technology Development Project for the National Semiconductor Research Laboratory.
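As an illustration of the kind of statistical check a true random number generator must pass, the sketch below runs a NIST SP 800-22 style monobit (frequency) test. The bitstream here is simulated; evaluating the Cryptoristor would use bits captured from the actual device.

```python
import numpy as np
from math import erfc, sqrt

# Minimal sketch: a NIST SP 800-22 style monobit (frequency) test, the kind of
# basic statistical check a true random number generator output stream is
# expected to pass. The bitstream below is simulated, standing in for bits
# captured from a device such as the Cryptoristor.
bits = np.random.randint(0, 2, size=1_000_000)   # stand-in for device output
s = np.sum(2 * bits - 1)                         # map {0,1} -> {-1,+1} and sum
p_value = erfc(abs(s) / sqrt(2 * bits.size))

print(f"monobit p-value = {p_value:.3f} ->",
      "pass" if p_value >= 0.01 else "fail")
```

A p-value at or above 0.01 means the ones/zeroes balance is consistent with a uniform random source; a full evaluation would run the complete SP 800-22 battery.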
2024.03.07
KAIST to Begin Joint Research to Develop Next-Generation LiDAR System with Hyundai Motor Group
< (From left) Jong-Soo Lee, Executive Vice President at Hyundai Motor, and Sang-Yup Lee, Senior Vice President for Research at KAIST >

The Hyundai Motor Group-KAIST On-Chip LiDAR Joint Research Lab was opened at KAIST's main campus in Daejeon to develop LiDAR sensors for advanced autonomous vehicles. The joint research lab aims to develop high-performance, compact on-chip sensors and new signal detection technology, both essential in the increasingly competitive autonomous driving market.

On-chip sensors, which use semiconductor manufacturing technology to add various functions, can reduce the size of LiDAR systems compared to conventional approaches and secure price competitiveness through mass production in semiconductor fabrication processes.

The joint research lab will consist of about 30 researchers, including the Hyundai-Kia Institute of Advanced Technology Development research team and professors Sanghyeon Kim, Sangsik Kim, Wanyeong Jung, and Hamza Kurt from KAIST's School of Electrical Engineering, and will operate for four years, until 2028. KAIST will lead the specialized work of each research team, such as the development of silicon optoelectronic on-chip LiDAR components, the fabrication of high-speed, high-power integrated circuits to run the LiDAR systems, and the optimization and verification of the complete LiDAR systems.

Hyundai Motor and Kia, together with Hyundai NGV, a specialized industry-academia cooperation institution, will oversee the operation of the joint research lab and provide support such as monitoring technological trends, suggesting research directions, deriving core ideas, and recommending technologies and experts to enhance research capabilities.

A Hyundai Motor Group official said, "We believe that this cooperation between Hyundai Motor Company and Kia, the leaders in autonomous driving technology, and KAIST, the home of world-class technology, will hasten the achievement of fully autonomous driving." He added, "We will do our best to enable the lab to produce tangible results."

Professor Sanghyeon Kim said, "The LiDAR sensor, which serves as the eyes of a car, is a core technology for future autonomous vehicle development and one that automobile companies must internalize."
2024.02.27
KAIST Research Team Develops Sweat-Resistant Wearable Robot Sensor
New electromyography (EMG) sensor technology that enables stable, long-term control of wearable robots, unaffected by the wearer's sweat or dead skin, has recently gained attention. Wearable robots are used in a variety of rehabilitation treatments for the elderly and for patients recovering from stroke or trauma.

A joint research team led by Professor Jae-Woong Jeong from the KAIST School of Electrical Engineering (EE) and Professor Jung Kim from the KAIST Department of Mechanical Engineering (ME) announced on January 23rd that they had developed a stretchable, adhesive microneedle sensor that can electrically sense physiological signals at a high level without being affected by the state of the user's skin.

For wearable robots to recognize the intentions behind human movement in rehabilitation treatment, they require a wearable electrophysiological sensor that provides precise EMG measurements. However, existing sensors often show deteriorating signal quality over time and are greatly affected by the user's skin condition. Furthermore, their high mechanical stiffness causes noise, since the contact surface cannot follow the deformation of the skin. These shortcomings limit the reliable long-term control of wearable robots.

< Figure 1. Design and working concept of the Stretchable microNeedle Adhesive Patch (SNAP). (A) Schematic illustration showing the overall system configuration and application of SNAP. (B) Exploded-view schematic of a SNAP, consisting of stretchable serpentine interconnects, Au-coated Si microneedles, and an ECA made of an Ag flake-silicone composite. (C) Optical images showing the high mechanical compliance of SNAP. >

The newly developed technology is expected to allow long-term, high-quality EMG measurements, as it uses a stretchable, adhesive conducting substrate integrated with microneedle arrays that easily penetrate the stratum corneum without causing discomfort. Thanks to this performance, the sensor is anticipated to stably control wearable robots over long periods regardless of the wearer's changing skin conditions, without a preparation step to remove sweat and dead cells from the skin surface.

The research team created the stretchable, adhesive microneedle sensor by integrating microneedles into a soft silicone polymer substrate. The hard microneedles penetrate the stratum corneum, which has high electrical resistance, so the sensor can effectively lower the contact resistance with the skin and obtain high-quality electrophysiological signals regardless of skin contamination. At the same time, the soft, adhesive conducting substrate conforms to the skin as it stretches with the wearer's movement, providing a comfortable fit and minimizing motion-induced noise.

< Figure 2. Demonstration of the wireless Stretchable microNeedle Adhesive Patch (SNAP) system as a human-machine interface (HMI) for closed-loop control of an exoskeleton robot. (A) Illustration depicting the system architecture and control strategy of an exoskeleton robot. (B) Hardware configuration of the pneumatic back-support exoskeleton system. (C) Comparison of the root mean square (RMS) of electromyography (EMG) with and without robotic assistance, on pretreated and non-pretreated skin. >

To verify the usability of the new patch, the research team conducted a motion assistance experiment using a wearable robot.
They attached the microneedle patch to a user's leg, where it sensed the electrical signals generated by the muscle. The sensor then sent the detected intention to a wearable robot, allowing the robot to help the wearer lift a heavy object more easily.

Professor Jae-Woong Jeong, who led the research, said, “The stretchable, adhesive microneedle sensor we developed can stably detect EMG signals without being affected by the state of a user's skin. With it, we will be able to control wearable robots with higher precision and stability, which will aid the rehabilitation of patients who use robots.”

The results of this research, written by co-first authors Heesoo Kim and Juhyun Lee, both Ph.D. candidates in the KAIST School of EE, were published in Science Advances on January 17th under the title “Skin-preparation-free, stretchable microneedle adhesive patches for reliable electrophysiological sensing and exoskeleton robot control”. This research was supported by the Bio-signal Sensor Integrated Technology Development Project of the National Research Foundation of Korea, the Electronic Medicinal Technology Development Project, and the Step 4 BK21 Project.
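A minimal sketch of the closed-loop control idea, on synthetic data: compute a moving RMS envelope of the EMG signal, as compared in Figure 2(C), and trigger robot assistance when muscle activation crosses a threshold. The sampling rate, window length, and threshold below are assumed values, not parameters from the study.

```python
import numpy as np

# Minimal sketch of EMG-driven assistance: a moving RMS envelope of the EMG
# signal gates the robot's assist command. Sampling rate, window length, and
# threshold are assumed values for illustration.
fs = 1000                                   # Hz, assumed EMG sampling rate
t = np.arange(0, 3, 1 / fs)
emg = 0.05 * np.random.randn(t.size)        # baseline noise
emg[fs:2*fs] += 0.4 * np.random.randn(fs)   # simulated lifting burst at 1-2 s

win = int(0.1 * fs)                         # 100 ms RMS window
rms = np.sqrt(np.convolve(emg**2, np.ones(win) / win, mode="same"))

assist_on = rms > 0.15                      # activation threshold (assumed)
print(f"assistance active for {assist_on.mean() * 100:.0f}% of the trial")
```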
2024.01.30
KAIST and Hyundai Motors Collaborate to Develop Ultra-Fast Hydrogen Leak Detection within 0.6 Seconds
Recently, as the adoption of eco-friendly hydrogen cars increases, the importance of hydrogen sensors is also rising. In particular, detecting hydrogen leaks within one second remains a challenging task, so the development of the world's first hydrogen sensor to meet the performance standards of the U.S. Department of Energy has become a hot topic.

A KAIST team led by Dr. Min-Seung Jo from Professor Jun-Bo Yoon's group in the School of Electrical Engineering, working with the Electromagnetic Energy Materials Research Team at Hyundai Motor Company's Basic Materials Research Center and Professor Min-Ho Seo of Pusan National University, has achieved all of the targeted performance indicators, meeting globally recognized standards. On January 10th, the research group announced the world's first hydrogen sensor with a response time of less than 0.6 seconds.

To secure faster and more stable hydrogen detection than existing commercial hydrogen sensors, the KAIST team began developing a next-generation hydrogen sensor in 2021 together with Hyundai Motor Company, and succeeded after two years of development.

< Figure 1. (Left) Conceptual drawing of the structure of the coplanar heater-integrated hydrogen sensor. The Pd nanowire is stably suspended in air even at a thickness of 20 nm. (Right) Graph of the hydrogen sensor operating within 0.6 seconds for hydrogen at concentrations of 0.1 to 4% >

Existing hydrogen sensor research has mainly focused on sensing materials, such as catalytic treatments or alloys of palladium (Pd), a material widely used in hydrogen sensors. Although these studies showed excellent results on certain performance indicators, they did not meet all of the desired indicators, and commercialization was limited by the difficulty of batch fabrication. To overcome this, the research team developed a sensor that satisfies all of the performance indicators by combining independent micro/nano structure design and process technology based on pure palladium. In addition, with future mass production in mind, the team used pure metal materials, which carry fewer material restrictions than synthetic materials, and developed a next-generation hydrogen sensor that can be mass-produced with a semiconductor batch process.

The developed device is a differential coplanar device in which the heater and the sensing material sit side by side on the same plane, overcoming the uneven temperature distribution of existing gas sensors, in which the heater, insulating layer, and sensing material are stacked vertically. The palladium nanomaterial used for sensing is fully suspended and exposed to air from beneath, maximizing the reaction area with the gas to ensure a fast response. Because the palladium operates at a uniform temperature across its entire area, the team could precisely control the temperature-sensitive sensing behavior, securing fast operation, a wide detectable concentration range, and insensitivity to temperature and humidity.

< Figure 2.
Electron microscopy of the coplanar heater-integrated hydrogen sensor. (Left) Photo of the entire device. (Top right) Pd nanowire suspended in air. (Bottom right) Cross-section of the Pd nanowire >

The research team packaged the fabricated device with a Bluetooth module to create an integrated module that wirelessly detects hydrogen leaks within one second, and then verified its performance. Unlike existing high-performance optical hydrogen sensors, this sensor is highly portable and can be used in a variety of settings where hydrogen energy is employed.

Dr. Min-Seung Jo, who led the research, said, “The results of this research are of significant value: the sensor not only operates at high speed, exceeding the performance limits of existing hydrogen sensors, but also secures the reliability and stability necessary for actual use, and can be deployed in various places such as automobiles, hydrogen charging stations, and homes.” He added, regarding future plans, “Through the commercialization of this hydrogen sensor technology, I would like to contribute to advancing the safe and eco-friendly use of hydrogen energy.”

< Figure 3. (Left) Real-time hydrogen detection results from the coplanar heater-integrated hydrogen sensor, integrated and packaged with wireless communication and a mobile phone app. (Middle) LED blinking cycle controlled according to the hydrogen concentration level. (Right) Performance confirmation of detection within 1 second in a real-time hydrogen leak demo >

The research team is currently working with Hyundai Motor Company to manufacture the device at wafer scale and then mount it on a vehicle module to further verify its detection performance and durability. This research, with Dr. Min-Seung Jo as first author, has three patent applications filed in the U.S. and Korea and was published in the renowned international academic journal ACS Nano (Impact Factor: 18.087). (Paper title: “Ultrafast (∼0.6 s), Robust, and Highly Linear Hydrogen Detection up to 10% Using Fully Suspended Pure Pd Nanowire”, https://pubs.acs.org/doi/10.1021/acsnano.3c06806?fig=fig1&ref=pdf ) The research was supported by the National Research Foundation of Korea's Nano and Materials Technology Development Project, along with support and joint development from Hyundai Motor Company's Basic Materials Research Center.
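A minimal sketch, on a synthetic trace, of how a response time such as the reported 0.6 seconds is typically extracted: find the time at which the sensor signal reaches 90% of its full step (t90). The time constant and amplitude below are assumptions for illustration, not device data.

```python
import numpy as np

# Minimal sketch: estimate a sensor's response time (t90) from its resistance
# trace after hydrogen exposure. The exponential trace below is synthetic;
# a Pd nanowire's resistance rises as it absorbs hydrogen and forms PdHx.
fs = 100.0                                  # samples per second (assumed)
t = np.arange(0, 3, 1 / fs)
tau = 0.2                                   # assumed time constant, seconds
r = 1.0 + 0.05 * (1 - np.exp(-t / tau))     # normalized resistance step

r0, r_final = r[0], r[-1]
target = r0 + 0.9 * (r_final - r0)          # 90% of the full response
t90 = t[np.argmax(r >= target)]
print(f"t90 ~ {t90:.2f} s")                 # ~0.46 s for tau = 0.2 s
```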
2024.01.25
An Intravenous Needle that Irreversibly Softens at Body Temperature upon Insertion
- A joint research team at KAIST has developed an intravenous (IV) needle that softens upon insertion, minimizing the risk of damage to blood vessels and tissues.
- Once used, it remains soft even at room temperature, preventing accidental needlestick injuries and the unethical reuse of needles.
- A thin-film temperature sensor can be embedded in the needle, enabling real-time monitoring of the patient's core body temperature, or detection of unintended fluid leakage, during IV medication.

Intravenous (IV) injection is a method commonly used in patient treatment worldwide, as it induces rapid effects and allows treatment through the continuous administration of medication injected directly into a blood vessel. However, medical IV needles are made of hard materials such as stainless steel or plastic that do not mechanically match the body's soft biological tissues, and they can cause critical problems in healthcare settings, ranging from minor tissue damage at the injection site to serious inflammation. The structure and rigidity of hard IV devices also enable the unethical reuse of needles to cut injection costs, leading to the transmission of deadly blood-borne infections such as the human immunodeficiency virus (HIV) and hepatitis B/C viruses. Furthermore, unintended needlestick injuries occur frequently in medical settings worldwide and are viable sources of such infections, with IV needles the most susceptible medium for transmissible diseases. For these reasons, the World Health Organization (WHO) launched a policy on safe injection practices in 2015 to encourage the development and use of “smart” syringes with reuse-prevention features, after a tremendous increase in deadly infectious diseases worldwide due to medical-sharps-related issues.

KAIST announced on the 13th that Professor Jae-Woong Jeong and his research team in the School of Electrical Engineering succeeded in developing the Phase-Convertible, Adapting and non-REusable (P-CARE) needle, a variable-stiffness needle that can improve patient health and ensure the safety of medical staff, through convergent joint research with a team led by Professor Won-Il Jeong of the Graduate School of Medical Sciences.

The new technology is expected to let patients move without worrying about pain at the injection site, as it reduces the risk of damage to the blood vessel wall while they receive IV medication. This is possible because of the needle's stiffness-tunable behavior: upon insertion into the body, the increased temperature makes it soft and flexible, adapting to the movement of the thin-walled vein. It is also expected to prevent blood-borne infections caused by accidental needlestick injuries or the unethical reuse of syringes, since the deformed needle remains perpetually soft even after it is withdrawn from the injection site.

The results of this research, in which Karen-Christian Agno, a doctoral researcher of the School of Electrical Engineering at KAIST, and Dr. Keungmo Yang of the Graduate School of Medical Sciences participated as co-first authors, were published in Nature Biomedical Engineering on October 30. (Paper title: “A temperature-responsive intravenous needle that irreversibly softens on insertion”)

< Figure 1. Disposable variable-stiffness intravenous needle.
(a) Conceptual illustration of the key features of the P-CARE needle, whose mechanical properties change with body temperature. (b) Photograph of commonly used IV access devices and the P-CARE needle. (c) Performance of common IV access devices and the P-CARE needle >

“We've developed this special needle using advanced materials and micro/nano engineering techniques, and it can solve many global problems related to conventional medical needles used in healthcare worldwide,” said Jae-Woong Jeong, Ph.D., an associate professor of Electrical Engineering at KAIST and a lead senior author of the study.

The softening IV needle created by the research team consists of a hollow mechanical needle frame of gallium, a liquid metal, encapsulated within an ultra-soft silicone material. In its solid state, gallium is hard enough to puncture soft biological tissue. However, gallium melts when exposed to body temperature upon insertion, turning the needle as soft as the surrounding tissue and enabling stable drug delivery without damaging blood vessels. Once used, the needle remains soft even at room temperature due to the supercooling of gallium, fundamentally preventing needlestick accidents and reuse.

The biocompatibility of the softening IV needle was validated through in vivo studies in mice. Implanted needles caused significantly less inflammation than standard IV access devices of similar size made with metal needles or plastic catheters. The studies also confirmed that the new needle delivers medications as reliably as commercial injection needles.

< Photo 1. Photo of the P-CARE needle that softens at body temperature >

The researchers also showed that a customized ultra-thin temperature sensor can be integrated with the softening IV needle to measure the on-site temperature, further enhancing patient well-being. The single sensor-needle assembly can monitor core body temperature, or even detect fluid leakage at the site during indwelling use, eliminating the need for additional medical tools or procedures and providing patients with better care.

The researchers believe this transformative IV needle opens new opportunities for a wide range of applications, particularly in clinical settings, by inspiring the redesign of other medical needles and sharp tools to reduce muscle tissue injury during indwelling use. The softening IV needle may prove all the more valuable today: an estimated 16 billion medical injections are administered annually worldwide, yet not all needles are disposed of properly, according to a 2018 WHO report.

< Figure 2. Biocompatibility test of the P-CARE needle: images of H&E-stained histology (the area inside the dashed box on the left is shown expanded on the right), TUNEL staining (green), DAPI staining of nuclei (blue), and co-staining (TUNEL and DAPI) of muscle tissue from different organs >

< Figure 3. Conceptual images of the temperature-monitoring function of the P-CARE needle integrated with a temperature sensor.
(a) Schematic diagram of drug injection through an intravenous line into the abdomen of a laboratory mouse. (b) Change of body temperature upon drug injection. (c) Conceptual illustration of normal intravenous drug injection (top) and fluid leakage (bottom). (d) Comparison of body temperature during normal drug injection and fluid leakage: when fluid leaks due to incorrect insertion, a sudden drop in temperature is detected. >

This work was supported by grants from the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT.
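A minimal sketch of the leak-detection idea in Figure 3(d), on synthetic data: room-temperature fluid leaking into tissue produces a sudden local temperature drop that the embedded thin-film sensor can flag. The trace and threshold below are illustrative assumptions, not measurements from the study.

```python
import numpy as np

# Minimal sketch of the leak-detection idea: room-temperature fluid leaking
# into tissue produces a sudden local temperature drop at the needle's
# thin-film sensor. Data and threshold are illustrative assumptions.
fs = 1.0                                       # one reading per second (assumed)
t = np.arange(0, 120, 1 / fs)
temp = 36.5 + 0.05 * np.random.randn(t.size)   # deg C, normal infusion
temp[60:] -= 2.0                               # simulated leak at t = 60 s

drop = np.diff(temp)
if drop.min() < -1.0:                          # sharp-drop threshold (assumed)
    leak_idx = np.argmax(drop < -1.0) + 1      # first sample after the drop
    print(f"possible fluid leakage detected at t = {t[leak_idx]:.0f} s")
```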
2023.11.13
KAIST Research Team Develops World’s First Humanoid Pilot, PIBOT
In the spring of last year, the legendary fictional pilot “Maverick” flew his plane in the film “Top Gun: Maverick”, drawing crowds to theatres around the world. This year, the appearance of a humanoid pilot, PIBOT, has stolen the spotlight at KAIST.

< Photo 1. Humanoid pilot robot, PIBOT >

A KAIST research team has developed a humanoid robot that can understand manuals written in natural language and fly a plane on its own. The team also announced plans to commercialize the humanoid pilot.

< Photo 2. PIBOT on a flight simulator (view from above) >

The project was led by KAIST Professor David Hyunchul Shim and conducted as a joint research project with Professors Jaegul Choo, Kuk-Jin Yoon, and Min Jun Kim. The study was supported by Future Challenge Funding under the project title “Development of Human-like Pilot Robot based on Natural Language Processing”.

The team utilized AI and robotics technologies and demonstrated that the humanoid could sit in a real cockpit and operate the various pieces of equipment without any modification to the aircraft. This is a fundamental difference that distinguishes the technology from existing autopilot functions or unmanned aircraft.

< Photo 3. PIBOT operating a flight simulator (side view) >

The KAIST team's humanoid pilot is still under development, but it can already memorize Jeppesen charts from all around the world, which is impossible for human pilots, and fly without error. In particular, it can use recent ChatGPT technology to memorize the full Quick Reference Handbook (QRH) and respond immediately to various situations, as well as calculate safe routes in real time based on the aircraft's flight status, with emergency response times quicker than those of human pilots.

Furthermore, while existing robots usually carry out repeated motions in a fixed position, PIBOT can analyze the state of the cockpit as well as the situation outside the aircraft using an embedded camera. PIBOT can accurately operate the various switches in the cockpit and, using high-precision control technology, control its robotic arms and hands accurately even during harsh turbulence.

< Photo 4. PIBOT on board the KLA-100, Korea's first light aircraft >

The humanoid pilot is currently capable of carrying out all operations on a flight control simulator, from starting the aircraft to taxiing, takeoff and landing, cruising, and cycling. The research team plans to have the humanoid pilot fly a real light aircraft to verify its abilities.

Prof. Shim explained, “Humanoid pilot robots do not require the modification of existing aircraft and can be applied immediately to automated flight. They are therefore highly applicable and practical. We expect them to be applied to various other vehicles, such as cars and military trucks, since they can control a wide range of equipment. They will be particularly helpful in situations where military resources are severely depleted.”

This research was supported by Future Challenge Funding (total: 5.7 bn KRW) from the Agency for Defense Development. The project started in 2022 as a joint research project by Prof. David Hyunchul Shim (chief of research) from the KAIST School of Electrical Engineering (EE), Prof. Jaegul Choo from the Kim Jaechul Graduate School of AI at KAIST, Prof. Kuk-Jin Yoon from the KAIST Department of Mechanical Engineering, and Prof. Min Jun Kim from the KAIST School of EE.
The project is to be completed by 2026, and the researchers involved are also considering commercialization strategies for both military and civilian use.
2023.08.03