Secret to Drug Addiction Relapse Found: Brain's Addiction Circuit Identified
<(From left) Dr. Minju Jeong (UCSD), Prof. Byung Kook Lim (UCSD), Prof. Se-Bum Paik (KAIST)>
Drug addiction carries an extremely high risk of relapse, as cravings can be reignited by minor stimuli even long after one has stopped using. Previously, this phenomenon was attributed to a decline in the function of the prefrontal cortex (PFC), which regulates impulses. However, a joint international research team has recently revealed that the cause of addiction relapse is not a simple decline in brain function, but rather an imbalance in specific neural circuits.
KAIST announced on March 9th that a research team led by Prof. Se-Bum Paik from the Department of Brain and Cognitive Sciences and Prof. Byung Kook Lim from the University of California, San Diego (UCSD) has identified the core principle by which specific inhibitory neurons in the prefrontal cortex regulate cocaine-seeking behavior.
In particular, the research team focused on parvalbumin-positive (PV) inhibitory neurons, which regulate the balance of neural signals by suppressing the activity of other neurons in the brain. They confirmed that these cells act as a "brake gate" that controls excitatory signals in the brain and serve as a crucial factor in determining drug-seeking behavior that emerges after withdrawal.
The prefrontal cortex (PFC) of our brain can properly perform its "braking" function to suppress impulses when excitatory and inhibitory signals are in balance. To investigate how chronic drug exposure disrupts this balance, the research team conducted cocaine administration experiments on mice. During this process, they tracked when inhibitory neurons in the PFC were activated and how they sent signals to downstream brain regions.
The experimental results showed that parvalbumin (PV) cells, which account for about 60-70% of the inhibitory neurons in the PFC, were highly active when the mice attempted to seek cocaine. However, when "extinction training"—training to stop seeking the drug—was conducted, the activity of these cells significantly decreased. This demonstrates that the activity patterns of PV cells are not permanently fixed by addiction but can be readjusted through the extinction process.
<Figure 1. Experimental design illustrating cocaine self-administration and longitudinal tracking of prefrontal cortical neural activity during cocaine-seeking behavior>
The research team confirmed that artificially suppressing PV cell activity significantly reduced cocaine-seeking behavior in mice. Conversely, activating these cells caused the drug-seeking behavior to persist even after the extinction process. This effect was specifically observed in drug-addiction behavior and did not appear with general rewards like sugar water. Furthermore, this phenomenon was not observed in somatostatin (SOM) cells—another type of inhibitory neuron—indicating that PV cells selectively regulate drug addiction behavior.
<Figure 2. Comparison of single-neuron activity, population activity patterns, and behavioral modulation of prefrontal inhibitory neurons across different stages of cocaine-seeking behavior>
The team also identified the specific brain circuit through which these PV cells operate. Signals originating from the prefrontal cortex are transmitted to the reward circuit of the Ventral Tegmental Area (VTA), a key brain region related to reward. This pathway emerged as the central channel for regulating addiction behavior, determining whether or not to seek the drug again. In this process, PV neurons act as a "regulatory switch," controlling the flow of signals to influence dopamine signaling and deciding whether to maintain or suppress addictive behavior.
In short, the study revealed that addiction relapse is not due to an overall functional decline of the prefrontal cortex, but is determined by whether PV neurons regulate the neural pathway connecting the PFC to the reward circuit.
<Figure 3. Schematic illustrating the prefrontal–reward circuit mechanism that determines drug-seeking behavior>
Prof. Se-Bum Paik stated, "This research shows that drug addiction is a circuit-level problem arising from a collapse in the regulatory balance of specific neurons and downstream neural circuits. The discovery that parvalbumin (PV) cells act as a 'gate' for addictive behavior will provide a crucial lead for developing precision-targeted treatment strategies in the future."
This study was led by Dr. Minju Jeong (UCSD) as the first author, with Prof. Byung Kook Lim (UCSD) and Prof. Se-Bum Paik (KAIST) serving as co-corresponding authors. The findings were published online on February 26 in Neuron, a premier journal in the field of neuroscience.
Paper Title: Distinct Interneuronal Dynamics Selectively Gate Target-Specific Cortical Projections in Drug Seeking
DOI: 10.1016/j.neuron.2026.01.002
Full Author List: Minju Jeong, Seungdae Baek, Qingdi Wang, Li Yao, Eun Ji Lee, Arturo Marroquin Rivera, Joann Jocelynn Lee, Hyeonseok Jang, Dhananjay Bambah-Mukku, Christine Hyun-Seung Mun, Tyler Boesen, Sumit Nanda, Cheol Ryong Ku, Hong-wei Dong, Benoit Labonté, Se-Bum Paik, and Byung Kook Lim.
This research was conducted with the support of the Basic Research Program in Science and Engineering of the National Research Foundation of Korea.
KAIST’s Reliability-Aware AI Opens Path to Faster Cathode Design and Next-Generation Batteries
< (From front left) Professor Seungbum Hong, Professor EunAe Cho (From back left) Chaeyul Kang, Benediktus Madika, Jung Hyeon Moon, Taemin Park (Top) JooSung Shim >
The power that makes electric vehicles travel further and smartphones last longer comes from battery materials. Among them, the core material that directly determines the performance and lifespan of a battery is the cathode material. What if artificial intelligence could replace the numerous experiments required for battery material development? KAIST's research team has developed an artificial intelligence (AI) framework that predicts the particle size of cathode materials and reports the reliability of each prediction, even when experimental data is insufficient, opening the possibility of expansion to next-generation energy technologies such as all-solid-state batteries.
KAIST announced on January 26th that a research team led by Professor Seungbum Hong of the Department of Materials Science and Engineering, in joint research with Professor EunAe Cho's team, has developed a machine learning framework that accurately predicts the particle size of battery cathode materials even when experimental data is incomplete and provides the degree of reliability of the results.
The cathode material inside the battery is the core material that allows lithium-ion batteries to store and use energy. Currently, the most widely used cathode material for electric vehicle batteries is an NCM-based metal oxide mixed with nickel (Ni), cobalt (Co), and manganese (Mn), which greatly affects the battery's lifespan, charging speed, driving range, and safety.
The KAIST research team focused on the fact that the size of the very small primary particles that make up these cathode materials is a key factor in determining battery performance. This is because if the particles are too large, performance deteriorates, and conversely, if they are too small, stability problems may occur. Accordingly, the research team developed an AI-based technology that can accurately predict and control particle size.
< Battery performance prediction related (AI-generated image) >
In the past, to determine the particle size, numerous experiments had to be repeated while changing the sintering temperature, time, and material composition. However, in actual research fields, it was difficult to measure all conditions without omission, and experimental data were often missing, which limited the precise analysis of the relationship between process conditions and particle size.
To solve this problem, the research team designed an AI framework that supplements missing data and presents prediction results along with reliability. This framework is characterized by combining a technology (MatImpute) that supplements missing experimental data by considering chemical characteristics and a probabilistic machine learning model (NGBoost) that calculates prediction uncertainty.
This AI model does not stop at simply predicting particle size but also provides information on the extent to which the prediction can be trusted. This serves as an important criterion for deciding under what conditions to actually synthesize materials.
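The two-stage idea described above, chemistry-aware imputation followed by a probabilistic predictor, can be sketched in a toy form. To be clear, this is not the authors' MatImpute/NGBoost code: the sample data, field layout, and the leave-one-out ensemble standing in for a probabilistic model are all illustrative assumptions.

```python
# Toy sketch of the two-stage pipeline described above (NOT the authors'
# MatImpute/NGBoost code): (1) fill in a missing process value using the
# mean of chemically similar samples, (2) report a prediction together
# with an uncertainty estimate taken from an ensemble spread.
# All sample values below are illustrative, not from the paper.
from statistics import mean, stdev

# Each record: (composition family, sintering temp [C], time [h], size [um]);
# None marks a missing measurement to be imputed.
samples = [
    ("NCM811", 750.0, 10.0, 0.45),
    ("NCM811", 800.0, None, 0.62),   # missing sintering time
    ("NCM811", 850.0, 12.0, 0.90),
    ("NCM622", 900.0, 15.0, 1.10),
]

def impute(records):
    """Chemistry-aware imputation: replace a missing field with the mean
    of the same field over records sharing the composition family."""
    filled = []
    for fam, temp, time, size in records:
        if time is None:
            peers = [t for f, _, t, _ in records if f == fam and t is not None]
            time = mean(peers)
        filled.append((fam, temp, time, size))
    return filled

def predict_with_uncertainty(records, temp):
    """Ensemble of crude linear size~temperature fits (leave-one-out)
    whose spread stands in for a probabilistic model's predictive std."""
    preds = []
    for i in range(len(records)):
        subset = [r for j, r in enumerate(records) if j != i]
        t0 = mean(r[1] for r in subset)   # mean temperature of the subset
        s0 = mean(r[3] for r in subset)   # mean particle size of the subset
        slope = mean((r[3] - s0) / (r[1] - t0) for r in subset if r[1] != t0)
        preds.append(s0 + slope * (temp - t0))
    return mean(preds), stdev(preds)

data = impute(samples)
mu, sigma = predict_with_uncertainty(data, temp=820.0)
print(f"predicted size: {mu:.2f} +/- {sigma:.2f} um")
```

In the actual framework, NGBoost produces a full predictive distribution per sample rather than an ensemble spread, but the interface is the same: a point prediction paired with an uncertainty that tells the experimenter how much to trust it.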
After training on the expanded experimental dataset, the AI model achieved a high prediction accuracy of about 86.6%. The analysis also found that cathode particle size is affected more by process conditions, such as sintering temperature and time, than by material composition, which aligns well with existing experimental understanding.
To verify the reliability of the AI predictions, the research team newly synthesized four cathode material samples under manufacturing conditions not included in the existing data, while keeping the NCM811 metal composition (Ni 80% / Co 10% / Mn 10%) fixed. The particle sizes predicted by the AI closely matched the actual microscopy measurements, with most errors at 0.13 micrometers (μm) or less, far smaller than the thickness of a human hair. In particular, the experimental results fell within the prediction uncertainty range presented by the AI, confirming that not only the predicted values but also their stated reliability were valid.
< Distribution shift condition experiment verification using 4 types of samples >
This study is significant in that it opens a way to identify the synthesis conditions most likely to succeed without performing every experiment in battery research. This is expected to speed up the development of battery materials and significantly reduce unnecessary experiments and costs.
Professor Seungbum Hong said, "The key is that the AI presents not only the predicted value but also how much the result can be trusted," and added, "It will be of practical help in designing next-generation battery materials more quickly and efficiently."
In this study, Benediktus Madika, a doctoral student in the Department of Materials Science and Engineering, participated as the first author, and it was published on October 8, 2025, in 'Advanced Science', an internationally prestigious academic journal in the field of materials science and chemical engineering.
※ Paper Title: Uncertainty-Quantified Primary Particle Size Prediction in Li-Rich NCM Materials via Machine Learning and Chemistry-Aware Imputation
※ DOI: https://doi.org/10.1002/advs.202515694
Meanwhile, this research was carried out by researchers Benediktus Madika, Chaeyul Kang, JooSung Shim, Taemin Park, and Jung Hyeon Moon in the teams of Professor EunAe Cho and Professor Seungbum Hong, with support from the Ministry of Science and ICT (MSIT) and the National Research Foundation of Korea (NRF) Future Convergence Technology Pioneer (Strategic) program (Project No. RS-2023-00247245).
< Battery performance prediction (AI-generated image) >
Robot Valley Project: Activation of the Korean-Style Robot and AI Startup Ecosystem Fully Underway
< From left: Top Excellence Award winner Robolight (Pre-startup Founder Han-seol Choi), Top Excellence Award winner Coils (CEO Seong-ryeol Heo), Professor Jung Kim of KAIST, Grand Prize winner Noman (CEO Jung-wook Moon), Professor Kyoungchul Kong of KAIST, CEO Dae-hee Park of Daejeon Creative Economy Innovation Center, Excellence Award winner Gigaflops (CEO Min-tae Kim), Excellence Award winner BLUE APEX (Pre-startup Founder Na-hyeon Kwon) >
KAIST announced on December 10th that KAIST Holdings (CEO Hyeonmin Bae), a specialized technology commercialization investment institution, successfully held the '2025 KAIST Hu-Robotics Startup Cup' on the 9th at the main building of Daejeon Startup Park. This was held as part of the Robot Valley Project, aiming to discover and foster promising startup teams in the robotics field and establish a robot scale-up ecosystem based on a technology platform.
This competition was conducted as a core program of the Robot Valley Project (Deep-Tech Scale-up Valley Fostering Project), which is promoted by the Ministry of Science and ICT and supported by Daejeon Metropolitan City. The competition proceeded through a meet-up day with KAIST Mechanical Engineering researchers, robotics companies like Angel Robotics and Twinny, and startup experts such as Bluepoint, leading to the final round. Throughout this process, a support system for the scale-up of robot startups was established, linking technology verification, strengthening entrepreneurial capabilities, and investment linkage.
KAIST Holdings and the Deep-Tech Valley Project Group (hereinafter referred to as the Project Group) stated that this competition marks the beginning of 'establishing a Korean-style Robot and AI startup ecosystem.' Their goal through the Robot Valley Project is to create a Korean-style robot scale-up ecosystem centered around Daejeon and KAIST, and furthermore, to build a technology circulation structure utilizing verified technology platforms.
KAIST has produced successful scale-up cases in the robotics field, such as Rainbow Robotics and Angel Robotics. However, the recent robotics industry has seen a rapid rise in technological difficulty due to the convergence of mechanical engineering, AI, and control software, making it structurally difficult for early-stage founders to take on such challenges alone.
To solve this, the Project Group proposed the 'Scale-up Valley Construction Strategy,' which opens up the verified technologies of established senior companies to junior founders. This strategy focuses on supporting startups to concentrate on developing market-ready robot services and applications on top of verified technology platforms, rather than consuming excessive time on developing basic hardware like motors and controllers.
The Angel Robotics technology platform, presented as the core underlying technology of this strategy, consists of actuators, control modules, and core software. KAIST plans to gradually open up these foundational technologies for use by early-stage startup teams.
The Project Group emphasized that enabling startup teams to utilize such technology platforms from the initial stage is the core infrastructure for accelerating the Korean-style robot startup ecosystem.
A total of 21 teams participated in this competition, including pre-startup founders (Track A) and early-stage startups established within 3 years (Track B), all possessing human-centered robotics technology and convergence business models.
After fierce preliminaries, 8 teams advanced to the final round, and a total of 5 teams were finally selected: one Grand Prize winner, two Top Excellence Award winners, and two Excellence Award winners.
The Grand Prize was awarded to 'Noman' for proposing an integrated system for a strawberry farm work robot and a rotating vertical cultivation module.
The Top Excellence Award went to 'Robolight' and 'Coils.'
The Excellence Award was awarded to BLUE APEX and Gigaflops.
Professor Jung Kim, Head of the KAIST Mechanical Engineering Department and General Manager of the Robot Valley Project, said, "This competition has become the starting point for discovering future robot unicorns. For the next three years, we will continue to provide practical support for the growth of robot startups, and KAIST will play a leading role in building and expanding the deep-tech robot ecosystem centered in Daejeon."
< Group Photo of Award Winners >
Meanwhile, this competition was jointly hosted and organized by the Ministry of Science and ICT, Daejeon Metropolitan City, and the Research and Business Development Special Zone Foundation, as well as startup support organizations including KAIST, KAIST Holdings, Daejeon Technopark, and Daejeon Creative Economy Innovation Center.
Physics-Informed AI Excels at Large-Scale Discovery of New Materials!
<(From left) Ph.D. candidates Songho Lee, Donggeun Park, and Hyeonbin Moon, and Professor Seunghwa Ryu from the Department of Mechanical Engineering; (top) Professor Jae Hyuk Lim from Kyung Hee University and Dr. Wabi Demeke from KAIST>
One of the key steps in developing new materials is “property identification,” which has long relied on massive amounts of experimental data and expensive equipment, limiting research efficiency. A KAIST research team has introduced a new technique that combines “physical laws,” which govern deformation and interaction of materials and energy, with artificial intelligence. This approach allows for rapid exploration of new materials even under data-scarce conditions and provides a foundation for accelerating design and verification across multiple engineering fields, including materials, mechanics, energy, and electronics.
KAIST (President Kwang Hyung Lee) announced on the 2nd of October that Professor Seunghwa Ryu’s research group in the Department of Mechanical Engineering, in collaboration with Professor Jae Hyuk Lim’s group at Kyung Hee University (President Jinsang Kim) and Dr. Byungki Ryu at the Korea Electrotechnology Research Institute (President Namkyun Kim), proposed a new method that can accurately determine material properties with only limited data. The method uses Physics-Informed Machine Learning (PIML), which directly incorporates physical laws into the AI learning process.
<Schematic Diagram of a Physics-Based Machine Learning Methodology for Understanding Material Properties>
In the first study, the researchers focused on hyperelastic materials, such as rubber. They presented a Physics-Informed Neural Network (PINN) method that can identify both the deformation behavior and the properties of materials using only a small amount of data obtained from a single experiment. Whereas previous approaches required large, complex datasets, this research demonstrated that material characteristics can be reliably reproduced even when data is scarce, limited, or noisy.
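The training objective of a PINN of this kind can be written, in generic form rather than the paper's exact formulation, as a weighted sum of a data-fit term and a physics-residual term:

```latex
\mathcal{L}(\theta, \boldsymbol{\kappa})
= \underbrace{\frac{1}{N_d}\sum_{i=1}^{N_d}
    \left\| u_\theta(x_i) - u_i^{\mathrm{obs}} \right\|^2}_{\text{data fit}}
+ \lambda\,
  \underbrace{\frac{1}{N_p}\sum_{j=1}^{N_p}
    \left\| \nabla \cdot \boldsymbol{\sigma}\!\left(u_\theta(x_j);\,\boldsymbol{\kappa}\right) \right\|^2}_{\text{equilibrium residual}}
```

Here $u_\theta$ is the neural network approximating the deformation field, $\boldsymbol{\sigma}(\cdot;\boldsymbol{\kappa})$ is the stress given by the hyperelastic constitutive model with unknown material parameters $\boldsymbol{\kappa}$ (learned jointly with the network weights $\theta$), and $\lambda$ balances the two terms. Because the equilibrium residual constrains the solution everywhere, not just at measured points, far fewer observations are needed, which is what allows identification from a single experiment.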
In the second study, the team turned to thermoelectric materials—new materials that convert heat into electricity and electricity into heat. They proposed a PINN-based inverse inference technique that can estimate key indicators, such as thermal conductivity (how well heat is transferred) and the Seebeck coefficient (how efficiently electricity is generated), from just a few measurements.
Going further, the researchers introduced a Physics-Informed Neural Operator (PINO), an AI model that understands the physical laws of nature, and showed that it can generalize to previously unseen materials without requiring retraining.
In fact, after training the system on 20 materials, they tested it on 60 entirely new materials, and in all cases it predicted their properties with high accuracy. This breakthrough points to a future where large-scale, high-speed screening of countless candidate materials becomes possible.
This achievement goes beyond simply reducing the need for experiments. By intricately combining physical laws with AI, the researchers provided the first example of improving experimental efficiency while preserving reliability.
Professor Seunghwa Ryu, who led both studies, stated, “This is the first case of applying AI that understands physical laws to real material research. It enables reliable identification of material properties even when data availability is limited, and it is expected to expand into various engineering fields.”
The first paper, co-first-authored by KAIST Mechanical Engineering PhD candidates Hyeonbin Moon and Donggeun Park, was published on August 13 in Computer Methods in Applied Mechanics and Engineering.
※ Paper title: “Physics-informed neural network-based discovery of hyperelastic constitutive models from extremely scarce data”
※ DOI: https://doi.org/10.1016/j.cma.2025.118258
The second paper, co-first-authored by KAIST Mechanical Engineering PhD candidates Hyeonbin Moon and Songho Lee, and Dr. Wabi Demeke, was published on August 22 in npj Computational Materials.
※ Paper title: “Physics-informed neural operators for generalizable and label-free inference of temperature-dependent thermoelectric properties”
※ DOI: https://doi.org/10.1038/s41524-025-01769-1
Meanwhile, the first study was supported by the National Research Foundation of Korea and the Ministry of Science and ICT's INNOCore Program, as well as by a research project from the Ministry of Food and Drug Safety. The second study was carried out with support from the National Research Foundation of Korea and the Ministry of Science and ICT's INNOCore Program.
'Team Atlanta,' Joined by KAIST Professor Insu Yun's Research Team, Wins the DARPA AI Cyber Challenge in the US with a 5.5 Billion KRW Prize
<Photo1. Group Photo of Team Atlanta>
Team Atlanta, led by Professor Insu Yun of the Department of Electrical and Electronic Engineering at KAIST and Tae-soo Kim, an executive from Samsung Research, along with researchers from POSTECH and Georgia Tech, won the final championship at the AI Cyber Challenge (AIxCC) hosted by the Defense Advanced Research Projects Agency (DARPA). The final was held at the world's largest hacking conference, DEF CON 33, in Las Vegas on August 8 (local time).
With this achievement, the team won a prize of $4 million (approximately 5.5 billion KRW), demonstrating the excellence of their AI-based autonomous cyber defense technology on the global stage.
<Photo 2. Championship Commemoration: On the left and right are tournament officials. Starting from the second person on the left: Professor Tae-soo Kim (Samsung Research / Georgia Tech), Researcher Hyeong-seok Han (Samsung Research America), and Professor Insu Yun (KAIST)>
The AI Cyber Challenge is a two-year global competition co-hosted by DARPA and the Advanced Research Projects Agency for Health (ARPA-H). It challenges contestants to automatically analyze, detect, and fix software vulnerabilities using AI-based Cyber Reasoning Systems (CRS). The total prize money for the competition is $29.5 million, with the winning team receiving $4 million.
In the final, Team Atlanta scored a total of 392.76 points, a difference of over 170 points from the second-place team, Trail of Bits, securing a dominant victory. The CRS developed by Team Atlanta successfully and automatically detected various types of vulnerabilities and patched a significant number of them in real time.
Across the 7 finalist teams, an average of 77% of the 70 intentionally injected vulnerabilities were found, and 61% of those were patched. The teams also found 18 additional previously unknown vulnerabilities in real software, proving the potential of AI security technology.
All CRS technologies, including those of the winning team, will be provided as open-source and are expected to be used to strengthen the security of core infrastructure such as hospitals, water, and power systems.
<Photo3. Final Scoreboard: An overwhelming victory with over 170 points>
Professor Insu Yun of KAIST, a member of Team Atlanta, stated, "I am very happy to have achieved such a great result. This is a remarkable achievement that shows Korea's cyber security research has reached the highest level in the world, and it was meaningful to show the capabilities of Korean researchers on the world stage. I will continue to conduct research to protect the digital safety of the nation and global society through the fusion of AI and security technology."
KAIST President Kwang-hyung Lee stated, "This victory is another example that proves KAIST is a world-leading institution in the field of future cyber security and AI convergence. We will continue to provide full support to our researchers so they can compete and produce results on the world stage."
<Photo4. Results Announcement>
KAIST Presents a Breakthrough in Overcoming Drug Resistance in Cancer – Hope for Treating Intractable Diseases like Diabetes
<(From the left) Prof. Hyun Uk Kim, Ph.D. candidate Hae Deok Jung, Ph.D. candidate Jina Lim, and Prof. Yoosik Kim from the Department of Chemical and Biomolecular Engineering>
One of the biggest obstacles in cancer treatment is drug resistance in cancer cells. Conventional efforts have focused on identifying new drug targets to eliminate these resistant cells, but such approaches can often lead to even stronger resistance. Now, researchers at KAIST have developed a computational framework to predict key metabolic genes that can re-sensitize resistant cancer cells to treatment. This technique holds promise not only for a variety of cancer therapies but also for treating metabolic diseases such as diabetes.
On the 7th of July, KAIST (President Kwang Hyung Lee) announced that a research team led by Professors Hyun Uk Kim and Yoosik Kim from the Department of Chemical and Biomolecular Engineering had developed a computational framework that predicts metabolic gene targets to re-sensitize drug-resistant breast cancer cells. This was achieved using a metabolic network model capable of simulating human metabolism.
Focusing on metabolic alterations—key characteristics in the formation of drug resistance—the researchers developed a metabolism-based approach to identify gene targets that could enhance drug responsiveness by regulating the metabolism of drug-resistant breast cancer cells.
< Computational framework that can identify metabolic gene targets to revert the metabolic state of the drug-resistant cells to that of the drug-sensitive parental cells>
The team first constructed cell-specific metabolic network models by integrating proteomic data obtained from two different types of drug-resistant MCF7 breast cancer cell lines: one resistant to doxorubicin and the other to paclitaxel. They then performed gene knockout simulations* on all of the metabolic genes and analyzed the results.
*Gene knockout simulation: A computational method to predict changes in a biological network by virtually removing specific genes.
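The footnote's idea can be illustrated with a toy network. Note that this is only a conceptual sketch: the study used genome-scale metabolic network models with flux simulations, whereas the gene names, metabolites, and simple reachability check below are made-up stand-ins.

```python
# Toy gene-knockout simulation (illustrative only; the actual study used
# genome-scale metabolic models, not this reachability check).
# Each reaction: (required gene, set of substrate metabolites, product).
# All gene and metabolite names below are invented for illustration.
reactions = [
    ("geneA", {"glucose"}, "g6p"),
    ("geneB", {"g6p"}, "pyruvate"),
    ("geneC", {"glucose"}, "pyruvate"),   # alternative route bypassing geneB
    ("geneD", {"pyruvate"}, "biomass"),
]

def can_produce(target, nutrients, knocked_out=()):
    """Forward-propagate the set of producible metabolites, skipping any
    reaction whose gene has been (virtually) knocked out."""
    available = set(nutrients)
    changed = True
    while changed:
        changed = False
        for gene, substrates, product in reactions:
            if gene in knocked_out or product in available:
                continue
            if substrates <= available:   # all substrates producible
                available.add(product)
                changed = True
    return target in available

# Wild type can make biomass from glucose:
print(can_produce("biomass", {"glucose"}))                      # True
# Knocking out geneB alone is tolerated (geneC provides a bypass):
print(can_produce("biomass", {"glucose"}, {"geneB"}))           # True
# Knocking out both geneB and geneC cuts off pyruvate, hence biomass:
print(can_produce("biomass", {"glucose"}, {"geneB", "geneC"}))  # False
```

Running such a simulation for every gene in a genome-scale model, as the team did, pinpoints the knockouts whose predicted metabolic shift matches the desired phenotype, here, reverting a resistant cell's metabolism toward the drug-sensitive state.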
As a result, they discovered that suppressing certain genes could make previously resistant cancer cells responsive to anticancer drugs again. Specifically, they identified GOT1 as a target in doxorubicin-resistant cells, GPI in paclitaxel-resistant cells, and SLC1A5 as a common target for both drugs.
The predictions were experimentally validated by suppressing proteins encoded by these genes, which led to the re-sensitization of the drug-resistant cancer cells.
Furthermore, consistent re-sensitization effects were also observed when the same proteins were inhibited in other types of breast cancer cells that had developed resistance to the same drugs.
Professor Yoosik Kim remarked, “Cellular metabolism plays a crucial role in various intractable diseases including infectious and degenerative conditions. This new technology, which predicts metabolic regulation switches, can serve as a foundational tool not only for treating drug-resistant breast cancer but also for a wide range of diseases that currently lack effective therapies.”
Professor Hyun Uk Kim, who led the study, emphasized, “The significance of this research lies in our ability to accurately predict key metabolic genes that can make resistant cancer cells responsive to treatment again—using only computer simulations and minimal experimental data. This framework can be widely applied to discover new therapeutic targets in various cancers and metabolic diseases.”
The study, in which Ph.D. candidates JinA Lim and Hae Deok Jung from KAIST participated as co-first authors, was published online on June 25 in Proceedings of the National Academy of Sciences (PNAS), a leading multidisciplinary journal that covers top-tier research in life sciences, physics, engineering, and social sciences.
※ Title: Genome-scale knockout simulation and clustering analysis of drug-resistant breast cancer cells reveal drug sensitization targets
※ DOI: https://doi.org/10.1073/pnas.2425384122
※ Authors: JinA Lim (KAIST, co-first author), Hae Deok Jung (KAIST, co-first author), Han Suk Ryu (Seoul National University Hospital, corresponding author), Yoosik Kim (KAIST, corresponding author), Hyun Uk Kim (KAIST, corresponding author), and five others.
This research was supported by the Ministry of Science and ICT through the National Research Foundation of Korea, and the Electronics and Telecommunications Research Institute (ETRI).
KAIST Presents Game-Changing Technology for Intractable Brain Disease Treatment Using Micro OLEDs
<(From left) Professor Kyung Cheol Choi, Professor Hyunjoo J. Lee, and Dr. Somin Lee from the School of Electrical Engineering>
Optogenetics is a technique that controls neural activity by stimulating neurons expressing light-sensitive proteins with specific wavelengths of light. It has opened new possibilities for identifying causes of brain disorders and developing treatments for intractable neurological diseases. Because this technology requires precise stimulation inside the human brain with minimal damage to soft brain tissue, it must be integrated into a neural probe—a medical device implanted in the brain. KAIST researchers have now proposed a new paradigm for neural probes by integrating micro OLEDs into thin, flexible, implantable medical devices.
KAIST (President Kwang Hyung Lee) announced on the 6th of July that Professor Kyung Cheol Choi and Professor Hyunjoo J. Lee from the School of Electrical Engineering have jointly succeeded in developing an optogenetic neural probe integrated with flexible micro OLEDs.
Optical fibers have been used for decades in optogenetic research to deliver light to deep brain regions from external light sources. Recently, research has focused on flexible optical fibers and ultra-miniaturized neural probes that integrate light sources for single-neuron stimulation.
The research team focused on micro OLEDs due to their high spatial resolution and flexibility, which allow for precise light delivery to small areas of neurons. This enables detailed brain circuit analysis while minimizing side effects and avoiding restrictions on animal movement. Moreover, micro OLEDs offer precise control of light wavelengths and support multi-site stimulation, making them suitable for studying complex brain functions.
However, micro OLEDs' electrical properties degrade easily in the presence of moisture or water, which has limited their use as implantable bioelectronics. Furthermore, optimizing the high-resolution integration process on thin, flexible probes remained a challenge.
To address this, the team enhanced the operational reliability of OLEDs in moist, oxygen-rich environments and minimized tissue damage during implantation. They patterned an ultrathin, flexible encapsulation layer* composed of aluminum oxide and parylene-C (Al₂O₃/parylene-C) at widths of 260–600 micrometers (μm) to maintain biocompatibility.
*Encapsulation layer: A barrier that completely blocks oxygen and water molecules from the external environment, ensuring the longevity and reliability of the device.
When integrating the high-resolution micro OLEDs, the researchers also used parylene-C, the same biocompatible material as the encapsulation layer, to maintain flexibility and safety. To eliminate electrical interference between adjacent OLED pixels and spatially separate them, they introduced a pixel define layer (PDL), enabling the independent operation of eight micro OLEDs.
Furthermore, they precisely controlled the residual stress and thickness in the multilayer film structure of the device, ensuring its flexibility even in biological environments. This optimization allowed for probe insertion without bending or external shuttles or needles, minimizing mechanical stress during implantation.
KAIST to Develop a Korean-style ChatGPT Platform Specifically Geared Toward Medical Diagnosis and Drug Discovery
On May 23rd, KAIST (President Kwang-Hyung Lee) announced that its Digital Bio-Health AI Research Center (Director: Professor JongChul Ye of the KAIST Kim Jaechul Graduate School of AI) has been selected for the Ministry of Science and ICT's 'AI Top-Tier Young Researcher Support Program (AI Star Fellowship Project).' With a total investment of ₩11.5 billion from May 2025 to December 2030, the center will begin full-scale development of AI technology and a platform capable of independently reasoning to diagnose diseases and discover new drugs.
< Photo. On May 20th, a kick-off meeting for the AI Star Fellowship Project was held at KAIST Kim Jaechul Graduate School of AI’s Yangjae Research Center with the KAIST research team and participating organizations of Samsung Medical Center, NAVER Cloud, and HITS. [From left to right in the front row] Professor Jaegul Joo (KAIST), Professor Yoonjae Choi (KAIST), Professor Woo Youn Kim (KAIST/HITS), Professor JongChul Ye (KAIST), Professor Sungsoo Ahn (KAIST), Dr. Haanju Yoo (NAVER Cloud), Yoonho Lee (KAIST), HyeYoon Moon (Samsung Medical Center), Dr. Su Min Kim (Samsung Medical Center) >
This project aims to foster an innovative AI research ecosystem centered on young researchers and develop an inferential AI agent that can utilize and automatically expand specialized knowledge systems in the bio and medical fields.
Professor JongChul Ye of the Kim Jaechul Graduate School of AI will serve as the lead researcher, with young researchers from KAIST including Professors Yoonjae Choi, Kimin Lee, Sungsoo Ahn, and Chanyoung Park, along with mid-career researchers like Professors Jaegul Joo and Woo Youn Kim, jointly undertaking the project. They will collaborate with various laboratories within KAIST to conduct comprehensive research covering the entire cycle from the theoretical foundations of AI inference to its practical application.
Specifically, the main goals include:
- Building high-performance inference models that integrate diverse medical knowledge systems to enhance the precision and reliability of diagnosis and treatment.
- Developing a convergence inference platform that efficiently combines symbol-based inference with neural network models.
- Securing AI technology for new drug development and biomarker discovery based on 'cell ontology.'
Furthermore, through close collaboration with industry and medical institutions such as Samsung Medical Center, NAVER Cloud, and HITS Co., Ltd., the project aims to achieve:
- Clinical diagnostic AI utilizing medical knowledge systems.
- AI-based molecular target exploration for new drug development.
- Commercialization of an extendible AI inference platform.
Professor JongChul Ye, Director of KAIST's Digital Bio-Health AI Research Center, stated, "At a time when competition in AI inference model development is intensifying, it is a great honor for KAIST to lead the development of AI technology specialized in the bio and medical fields with world-class young researchers." He added, "We will do our best to ensure that the participating young researchers reach a world-leading level in terms of research achievements after the completion of this seven-year project starting in 2025."
The AI Star Fellowship is a newly established program where post-doctoral researchers and faculty members within seven years of appointment participate as project leaders (PLs) to independently lead research. Multiple laboratories within a university and demand-side companies form a consortium to operate the program.
Through this initiative, KAIST plans to nurture bio-medical convergence AI talent and simultaneously promote the commercialization of core technologies in collaboration with Samsung Medical Center, NAVER Cloud, and HITS.
KAIST Accelerates Synthetic Microbe Design by Discovering Novel Enzymes Using AI
< (From left) Professor Sang Yup Lee of the Department of Chemical and Biomolecular Engineering (top), Hongkeun Ji, PhD candidate of the Department of Chemical and Biomolecular Engineering (top), Ha Rim Kim, PhD candidate of the Department of Chemical and Biomolecular Engineering, and Dr. Gi Bae Kim of the BioProcess Engineering Research Center >
Enzymes are proteins that catalyze biochemical reactions within cells and play a pivotal role in metabolic processes. Accordingly, identifying the functions of novel enzymes is a critical task in the construction of microbial cell factories.
A KAIST research team has leveraged artificial intelligence (AI) to design novel enzymes that do not exist in nature, significantly accelerating microbial cell factory development and boosting the potential for next-generation biotechnological applications such as drug development and biofuel production.
KAIST (represented by President Kwang-Hyung Lee) announced on the 21st of April that Distinguished Professor Sang Yup Lee and his team from the Department of Chemical and Biomolecular Engineering have published a review titled “Enzyme Functional Classification Using Artificial Intelligence,” which outlines the advancement of AI-based enzyme function prediction technologies and analyzes how AI has contributed to the discovery and design of new enzymes.
Professor Lee’s team systematically reviewed the development of enzyme function prediction technologies utilizing machine learning and deep learning, offering a comprehensive analysis.
From sequence similarity-based prediction methods to the integration of convolutional neural networks (CNNs), recurrent neural networks (RNNs), graph neural networks (GNNs), and transformer-based large language models, the paper covers a broad range of AI applications. It analyzes how these technologies extract meaningful information from protein sequences and enhance prediction accuracy.
In particular, enzyme function prediction using deep learning goes beyond simple sequence similarity analysis. By automatically extracting structural and evolutionary features embedded in amino acid sequences, deep learning enables more precise predictions of catalytic functions.
This highlights the unique advantages of AI models compared to traditional bioinformatics approaches.
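As a minimal illustration of the sequence-based prediction idea (not the deep-learning models the review surveys), the sketch below classifies an enzyme sequence into a hypothetical EC class using dipeptide-composition features and a nearest-centroid rule; the class labels and training sequences are toy placeholders. Deep learning replaces such hand-crafted features with representations learned directly from the amino acid sequence.

```python
import numpy as np
from itertools import product

AMINO = "ACDEFGHIKLMNPQRSTVWY"
DIPEPTIDES = ["".join(p) for p in product(AMINO, repeat=2)]
INDEX = {d: i for i, d in enumerate(DIPEPTIDES)}

def featurize(seq):
    """Dipeptide-composition vector: a classical sequence-derived feature."""
    v = np.zeros(len(DIPEPTIDES))
    for i in range(len(seq) - 1):
        v[INDEX[seq[i:i + 2]]] += 1
    return v / max(len(seq) - 1, 1)

def train_centroids(labeled):
    """labeled: dict mapping an EC class label to a list of sequences."""
    return {ec: np.mean([featurize(s) for s in seqs], axis=0)
            for ec, seqs in labeled.items()}

def classify(seq, centroids):
    """Assign the class whose centroid is nearest in feature space."""
    x = featurize(seq)
    return min(centroids, key=lambda ec: np.linalg.norm(x - centroids[ec]))
```

Against this kind of baseline, the review's point is that learned representations capture structural and evolutionary signals that fixed composition statistics miss.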
Moreover, the review suggests that the advancement of generative AI will move future research beyond predicting existing functions to generating entirely new enzymes with functions not found in nature. This shift is expected to profoundly impact the trajectory of biotechnology and synthetic biology.
< Figure 1. Extraction of enzyme characteristics and function prediction using various deep learning structures >
Ha Rim Kim, a Ph.D. candidate and co-first author from the Department of Chemical and Biomolecular Engineering, stated, “AI-based enzyme function prediction and enzyme design are highly important across various fields including metabolic engineering, synthetic biology, and healthcare.”
Distinguished Professor Sang Yup Lee added, “AI-powered enzyme function prediction shows the potential to solve diverse biological problems and will significantly contribute to accelerating research across the entire field.”
The review was published on March 28 in Trends in Biotechnology, a leading biotechnology journal issued by Cell Press.
※ Title: Enzyme Functional Classification Using Artificial Intelligence
※DOI: https://doi.org/10.1016/j.tibtech.2025.03.003
※ Author Information: Ha Rim Kim (KAIST, Co-first author), Hongkeun Ji (KAIST, Co-first author), Gi Bae Kim (KAIST, Third author), Sang Yup Lee (KAIST, Corresponding author)
This research was supported by the Ministry of Science and ICT under the project Development of Core Technologies for Advanced Synthetic Biology to Lead the Bio-Manufacturing Industry (aimed at replacing petroleum-based chemicals), and also by joint support from the Ministry of Science and ICT and the Ministry of Health and Welfare for the project Development of Novel Antibiotic Structures Using Deep Learning-Based Synthetic Biology.
KAIST Develops AI-Driven Performance Prediction Model to Advance Space Electric Propulsion Technology
< (From left) PhD candidate Youngho Kim, Professor Wonho Choe, and PhD candidate Jaehong Park from the Department of Nuclear and Quantum Engineering >
Hall thrusters, a key space technology for missions like SpaceX's Starlink constellation and NASA's Psyche asteroid mission, are high-efficiency electric propulsion devices based on plasma technology*. The KAIST research team announced that its AI-designed Hall thruster for CubeSats will be installed on the KAIST-Hall Effect Rocket Orbiter (K-HERO) CubeSat to demonstrate its in-orbit performance during the fourth launch of the Korean launch vehicle Nuri (KSLV-2), scheduled for November this year.
*Plasma is one of the four states of matter, where gases are heated to high energies, causing them to separate into charged ions and electrons. Plasma is used not only in space electric propulsion but also in semiconductor manufacturing, display processes, and sterilization devices.
On February 3rd, the research team from the KAIST Department of Nuclear and Quantum Engineering’s Electric Propulsion Laboratory, led by Professor Wonho Choe, announced the development of an AI-based technique to accurately predict the performance of Hall thrusters, the engines of satellites and space probes.
Hall thrusters offer high fuel efficiency, requiring minimal propellant to significantly accelerate spacecraft or satellites while producing substantial thrust relative to power consumption. Due to these advantages, Hall thrusters are widely used in various space missions, including the formation flight of satellite constellations, deorbiting maneuvers for space debris mitigation, and deep space missions such as asteroid exploration.
As the space industry continues to grow during the NewSpace era, the demand for Hall thrusters suited to diverse missions is increasing. To rapidly develop highly efficient, mission-optimized Hall thrusters, it is essential to predict thruster performance accurately from the design phase.
However, conventional methods have limitations, as they struggle to handle the complex plasma phenomena within Hall thrusters or are only applicable under specific conditions, leading to lower prediction accuracy.
Professor Wonho Choe's team, which has led electric propulsion research in Korea since 2003, developed a highly accurate AI-based performance prediction technique that significantly reduces the time and cost of the iterative design, fabrication, and testing of thrusters. The team applied a neural network ensemble model to predict thruster performance, trained on 18,000 Hall thruster data points generated with their in-house numerical simulation tool.
The in-house numerical simulation tool, developed to model plasma physics and thrust performance, played a crucial role in providing high-quality training data. The simulation’s accuracy was validated through comparisons with experimental data from ten KAIST in-house Hall thrusters, with an average prediction error of less than 10%.
< Figure 1. This research has been selected as the cover article for the March 2025 issue (Volume 7, Issue 3) of the AI interdisciplinary journal, Advanced Intelligent Systems. >
The trained neural network ensemble model acts as a digital twin, accurately predicting the Hall thruster performance within seconds based on thruster design variables.
Notably, it offers detailed analyses of performance parameters such as thrust and discharge current, accounting for Hall thruster design variables like propellant flow rate and magnetic field—factors that are challenging to evaluate using traditional scaling laws.
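The ensemble idea behind such a surrogate model can be sketched in a few lines. This is not the team's HallNN model: the "simulator" below is a made-up stand-in for the plasma simulation, the two inputs loosely play the role of design variables such as flow rate and magnetic field, and simple bootstrap-fitted linear models stand in for the neural networks. Averaging the members gives the prediction; their spread gives a rough uncertainty estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the physics simulation: "thrust" as an unknown
# function of two normalized design variables.
def simulate(x):
    return 3.0 * x[:, 0] + 0.5 * x[:, 0] * x[:, 1]

X = rng.uniform(0, 1, size=(500, 2))            # sampled design points
y = simulate(X) + rng.normal(0, 0.05, 500)      # noisy training targets

def fit_member(X, y):
    """Fit one ensemble member on a bootstrap resample of the data."""
    idx = rng.integers(0, len(X), len(X))
    A = np.column_stack([X[idx], X[idx, 0] * X[idx, 1], np.ones(len(idx))])
    coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    return coef

members = [fit_member(X, y) for _ in range(10)]

def predict(x):
    """Ensemble prediction: mean over members, spread as uncertainty."""
    A = np.column_stack([x, x[:, 0] * x[:, 1], np.ones(len(x))])
    preds = np.stack([A @ c for c in members])
    return preds.mean(axis=0), preds.std(axis=0)

mean, spread = predict(np.array([[0.5, 0.5]]))
```

Once trained, evaluating the ensemble costs only a few matrix products, which is why such a surrogate can return performance estimates in seconds where a full plasma simulation cannot.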
This AI model demonstrated an average prediction error of less than 5% for the in-house 700 W and 1 kW KAIST Hall thrusters and less than 9% for a 5 kW high-power Hall thruster developed by the University of Michigan and the U.S. Air Force Research Laboratory. This confirms the broad applicability of the AI prediction method across different power levels of Hall thrusters.
Professor Wonho Choe stated, “The AI-based prediction technique developed by our team is highly accurate and is already being utilized in the analysis of thrust performance and the development of highly efficient, low-power Hall thrusters for satellites and spacecraft. This AI approach can also be applied beyond Hall thrusters to various industries, including semiconductor manufacturing, surface processing, and coating, through ion beam sources.”
< Figure 2. The AI-based prediction technique developed by the research team accurately predicts thrust performance based on design variables, making it highly valuable for the development of high-efficiency Hall thrusters. The neural network ensemble processes design variables, such as channel geometry and magnetic field information, and outputs key performance metrics like thrust and prediction accuracy, enabling efficient thruster design and performance analysis. >
Additionally, Professor Choe mentioned, “The CubeSat Hall thruster, developed using the AI technique in collaboration with our lab startup—Cosmo Bee, an electric propulsion company—will be tested in orbit this November aboard the K-HERO 3U (30 x 10 x 10 cm) CubeSat, scheduled for launch on the fourth flight of the KSLV-2 Nuri rocket.”
This research was published online in Advanced Intelligent Systems on December 25, 2024, with PhD candidate Jaehong Park as the first author, and was selected as the journal's cover article, highlighting its innovation.
< Figure 3. Image of the 150 W low-power Hall thruster for small and micro satellites, developed in collaboration with Cosmo Bee and the KAIST team. The thruster will be tested in orbit on the K-HERO CubeSat during the KSLV-2 Nuri rocket’s fourth launch in Q4 2025. >
This research was supported by the National Research Foundation of Korea’s Space Pioneer Program (200mN High Thrust Electric Propulsion System Development).
(Paper Title: Predicting Performance of Hall Effect Ion Source Using Machine Learning, DOI: https://doi.org/10.1002/aisy.202400555 )
< Figure 4. Graphs of the predicted thrust and discharge current of KAIST’s 700 W Hall thruster using the AI model (HallNN). The left image shows the Hall thruster operating in KAIST Electric Propulsion Laboratory’s vacuum chamber, while the center and right graphs present the prediction results for thrust and discharge current based on anode mass flow rate. The red lines represent AI predictions, and the blue dots represent experimental results, with a prediction error of less than 5%. >
KAIST Team Develops an Insect-Mimicking Semiconductor to Detect Motion
The recent development of an "intelligent sensor" semiconductor that mimics the optic nerve of insects while operating at ultra-high speed and low power can be extended to a wide range of innovative technologies. It is expected to be applied in fields including transportation, safety, and security systems, contributing to both industry and society.
On February 19, a KAIST research team led by Professor Kyung Min Kim from the Department of Materials Science and Engineering (DMSE) announced the successful development of an intelligent motion detector that combines multiple memristor* devices to mimic the visual intelligence** of the insect optic nerve.
*Memristor: a “memory resistor” whose state of resistance changes depending on the input signal
**Visual intelligence: the ability to interpret visual information and perform calculations within the optic nerve
With recent advances in AI technology, vision systems are being improved by applying AI to tasks such as image recognition, object detection, and motion analysis. However, existing vision systems typically recognize objects and their behavior from received image signals using complex algorithms. This approach demands heavy data traffic and high power consumption, making it difficult to apply in mobile or IoT devices.
Meanwhile, insects are known to be able to effectively process visual information through an optic nerve circuit called the elementary motion detector, allowing them to detect objects and recognize their motion at an advanced level. However, mimicking this pathway using conventional silicon integrated circuit (CMOS) technology requires complex circuits, and its implementation into actual devices has thus been limited.
< Figure 1. Working principle of a biological elementary motion detection system. >
Professor Kyung Min Kim’s research team developed an intelligent motion detecting sensor that operates at a high level of efficiency and ultra-high speeds. The device has a simple structure consisting of only two types of memristors and a resistor developed by the team. The two different memristors each carry out a signal delay function and a signal integration and ignition function, respectively. Through them, the team could directly mimic the optic nerve of insects to analyze object movement.
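The insect circuit the device mimics is the classic delay-and-correlate elementary motion detector: each photoreceptor's delayed signal is multiplied with its neighbor's current signal, and the two mirror-image arms are subtracted to give a signed, direction-selective output. A minimal software sketch of that principle (an illustration of the biological scheme, not a model of the memristor hardware) might look like:

```python
import numpy as np

def emd_response(frames, delay=1):
    """Delay-and-correlate elementary motion detector over a 1D array
    of photoreceptors. frames: 2D array (time, position) of intensity.
    Returns a signed response: >0 rightward motion, <0 leftward."""
    frames = np.asarray(frames, dtype=float)
    delayed = frames[:-delay]          # each receptor's delayed signal
    current = frames[delay:]           # the undelayed signal
    # rightward arm: delayed left receptor * current right neighbor
    right = delayed[:, :-1] * current[:, 1:]
    # leftward arm: delayed right receptor * current left neighbor
    left = delayed[:, 1:] * current[:, :-1]
    return (right - left).sum()

# A bright spot sweeping rightward across 5 photoreceptors
rightward = np.eye(5)
print(emd_response(rightward))         # positive: rightward motion
print(emd_response(rightward[::-1]))   # negative: leftward motion
```

In the KAIST device, one memristor type supplies the delay and the other the integrate-and-fire correlation step, so this computation happens directly in the sensor rather than in software.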
< Figure 2. (Left) Optical image of the M-EMD device in the left panel (scale bar 200 μm) and SEM image of the device in the right panel (scale bar: 20 μm). (Middle) Responses of the M-EMD in positive direction. (Right) Responses of the M-EMD in negative direction. >
To demonstrate its potential for practical applications, the research team used the newly developed motion detector to design a neuromorphic computing system that can predict the path of a vehicle. The results showed that the device used 92.9% less energy than existing technology and predicted motion more accurately.
< Figure 3. Neuromorphic computing system configuration based on motion recognition devices >
Professor Kim said, “Insects use their very simple visual intelligence systems to detect the motion of objects at surprisingly high speeds. This research is significant in that we could mimic the functions of a nerve using memristor devices.” He added, “Edge AI devices, such as AI-equipped mobile phones, are becoming increasingly important. This research can contribute to the integration of efficient vision systems for motion recognition, so we expect it to be applied in fields such as autonomous vehicles, vehicle transportation systems, robotics, and machine vision.”
This research, conducted by co-first authors Hanchan Song and Min Gu Lee, both Ph.D. candidates at KAIST DMSE, was published in the online issue of Advanced Materials on January 29.
This research was supported by the Mid-Sized Research Project by the National Research Foundation of Korea, the Next-Generation Intelligent Semiconductor Technology Development Project, the PIM Artificial Intelligence Semiconductor Core Technology Development Project, the National Nano Fab Center, and the Leap Research Project by KAIST.
KAIST Civil Engineering Students named Runner-up at the 2023 ULI Hines Student Competition - Asia Pacific
A team of five students from the Korea Advanced Institute of Science and Technology (KAIST) was awarded second place in a premier urban design student competition hosted by the Urban Land Institute and Hines, the 2023 ULI Hines Student Competition - Asia Pacific.
The competition, held for the first time in the Asia-Pacific region, is an internationally recognized event that typically attracts hundreds of applicants.
Jonah Remigio, Sojung Noh, Estefania Rodriguez, Jihyun Kang, and Ayantu Teshome, who joined forces under the name of “Team Hashtag Development”, were supported by faculty advisors Dr. Albert Han and Dr. Youngchul Kim of the Department of Civil and Environmental Engineering to imagine a more sustainable and enriched way of living in the Jurong district of Singapore.
Their submission, titled “Proposal: The Nest”, analyzed big data on Singapore to determine which real estate business strategies would best enhance the region's quality of life and economy.
Their final design, "The Nest", utilized mixed-use zoning to integrate the site’s scenic waterfront with homes, medical innovation, and sustainable technology, altogether creating a place to innovate, inhabit, and immerse.
< The Nest by Team Hashtag Development (Jonah Remigio, Ayantu Teshome Mossisa, Estefania Ayelen Rodriguez del Puerto, Sojung Noh, Jihyun Kang) ©2023 Urban Land Institute >
Ultimately, the team was recognized for their hard work and determination, leaving South Korea's indelible footprint in the arena of international scholastic achievement when they were named one of the finalists on April 13th.
< Members of Team Hashtag Development >
Team Hashtag Development gave a virtual presentation to a jury of six ULI members on April 20th, alongside "Team The REAL" from the University of Economics Ho Chi Minh City in Vietnam and "Team Omusubi" from Waseda University in Japan. After the virtual briefings by the three finalists, Team Omusubi's proposal, "Jurong Urban Health Campus", was announced as the winner on the 31st of May.