See-through exhibitions using smartphones: KAIST develops the AR magic lens, WonderScope
WonderScope shows what’s underneath the surface of an object through augmented reality technology.

< Photo 1. Demonstration at ACM SIGGRAPH >

- A KAIST research team led by Professor Woohun Lee from the Department of Industrial Design and Professor Geehyuk Lee from the School of Computing has developed a smartphone “appcessory” called WonderScope that can easily add an augmented reality (AR) perspective to the surface of exhibits
- The research won an Honorable Mention for Emerging Technologies Best in Show at ACM SIGGRAPH, one of the largest international conferences on computer graphics and interactive techniques
- The technology was improved and validated through real-life applications in three special exhibitions: one at the Geological Museum at the Korea Institute of Geoscience and Mineral Resources (KIGAM) in 2020, and two at the National Science Museum in 2021 and 2022
- The technology is expected to be used in public science exhibitions and museums, as well as in interactive teaching materials that stimulate children’s curiosity

A KAIST research team led by Professor Woohun Lee from the Department of Industrial Design and Professor Geehyuk Lee from the School of Computing developed a novel augmented reality (AR) device, WonderScope, which displays the inside of an object directly on its surface. By attaching WonderScope to a mobile device and connecting the two over Bluetooth, users can see through exhibits as if looking through a magic lens. Many science museums have incorporated AR apps for mobile devices. Such apps add digital information to an exhibition, providing a unique experience. However, visitors must watch the screen from a certain distance away from the exhibited items, which often causes them to focus more on the digital content than on the exhibits themselves.
In other words, the distance and distractions between the exhibit and the mobile device may actually cause visitors to feel detached from the exhibition. To solve this problem, museums needed a magic AR lens that could be used directly on the surface of an item. To accomplish this, a smartphone must know exactly where on the surface of an object it is placed. Generally, this would require an additional recognition device either inside or on the surface of the item, or a special pattern printed on its surface. Realistically speaking, these are impractical solutions, as exhibits would either appear overly complex or face spatial restrictions. WonderScope, on the other hand, uses a much more practical method to identify the location of a smartphone on the surface of an exhibit. First, it reads a small RFID tag attached to the surface of the object, and then it tracks the moving smartphone by accumulating its relative movements based on readings from an optical displacement sensor and an acceleration sensor. The research team also took into account the height of the smartphone and the characteristics of the surface profile to calculate the device’s position more accurately. By attaching or embedding RFID tags on exhibits, visitors can easily experience the effect of a magic AR lens through their smartphones. For wider use, WonderScope must be able to locate itself on various types of exhibit surfaces. To this end, WonderScope fuses readings from an optical displacement sensor and an acceleration sensor with complementary characteristics, allowing stable localization on various textures including paper, stone, wood, plastic, acrylic, and glass, as well as on surfaces with physical patterns or irregularities. As a result, WonderScope can identify its location even at distances as close as 4 centimeters from an object, also enabling simple three-dimensional interactions near the surface of exhibits.
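The positioning scheme described above, an absolute fix from an RFID tag plus dead reckoning from two relative-motion sensors, can be sketched roughly as follows. The function names, the fixed blending weight, and the 2-D delta format are illustrative assumptions, not the actual WonderScope algorithm (which also accounts for lift height and surface profile):

```python
def fuse_position(tag_xy, optical_deltas, accel_deltas, alpha=0.9):
    """Dead-reckon a smartphone's position on an exhibit surface.

    Starts from the absolute position of the last RFID tag read
    (tag_xy) and accumulates relative motion, blending the optical
    displacement sensor (accurate on textured surfaces) with the
    accelerometer (robust where the optical sensor loses tracking).
    `alpha` weights the optical reading; the simple fixed-weight
    blend is an illustrative stand-in for a real sensor-fusion
    filter.
    """
    x, y = tag_xy
    for (ox, oy), (ax, ay) in zip(optical_deltas, accel_deltas):
        x += alpha * ox + (1 - alpha) * ax
        y += alpha * oy + (1 - alpha) * ay
    return x, y

# Two 1 cm rightward steps reported by both sensors move the
# estimate 2 cm right of the tag.
pos = fuse_position((0.0, 0.0), [(1.0, 0.0), (1.0, 0.0)],
                    [(1.0, 0.0), (1.0, 0.0)])
```

The key design point the article makes is that only one cheap absolute reference (the RFID tag) is needed per exhibit; everything after the tag read is relative tracking inside the accessory.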
The research team developed various showcase project templates and WonderScope support tools to allow the easy production of smartphone apps using virtual reality (VR) and the general-purpose game engine Unity. WonderScope is also compatible with various devices running the Android operating system, including smartwatches, smartphones, and tablets, allowing it to be applied to exhibitions in many forms.

< Photo 2. Human body model demonstration > < Photo 3. Demonstration of the underground mineral exploration game > < Photo 4. Demonstration of the Apollo 11 moon exploration experience >

The research team developed WonderScope with funding from the science and culture exhibition enhancement support project of the Ministry of Science and ICT. Between October 27, 2020 and February 28, 2021, WonderScope was used to observe underground volcanic activity and the insides of volcanic rocks at “There Once Was a Volcano”, a special exhibition held at the Geological Museum of the Korea Institute of Geoscience and Mineral Resources (KIGAM). From September 28 to October 3, 2021, it was used to observe the surface of Jung-moon-kyung (a bronze mirror with a fine linear design) at the special exhibition “A Bronze Mirror Shines on Science” at the National Science Museum. And from August 2 to October 3, 2022, it was applied to a moon landing simulation at “The Special Exhibition on Moon Exploration”, also at the National Science Museum. Through these field demonstrations over the years, the research team improved the performance and usability of WonderScope.

< Photo 5. Observation of surface corrosion of the main gate >

The research team demonstrated WonderScope at the Emerging Technologies forum of ACM SIGGRAPH 2022, a computer graphics and interactive techniques conference held in Vancouver, Canada between August 8 and 11 this year.
At this conference, where the latest interactive technologies are introduced, the team won an Honorable Mention for Best in Show. The judges commented that “WonderScope will be a new technology that provides the audience with a unique joy of participation during their visits to exhibitions and museums.”

< Photo 6. Cover of Digital Creativity >

WonderScope is a cylindrical “appcessory” module, 5 cm in diameter and 4.5 cm in height. It is small enough to be easily attached to a smartphone and embedded in most exhibits. Professor Woohun Lee from the KAIST Department of Industrial Design, who supervised the research, said, “WonderScope can be applied in many ways, not only to educational exhibitions but also to industrial ones.” He added, “We also expect it to be used as an interactive teaching tool that stimulates children’s curiosity.”

Introductory video of WonderScope: https://www.youtube.com/watch?v=X2MyAXRt7h4&t=7s
Phage-resistant Escherichia coli strains developed to reduce fermentation failure
A genome engineering-based systematic strategy for developing phage-resistant Escherichia coli strains has been successfully developed through the collaborative efforts of a team led by Professor Sang Yup Lee, Professor Shi Chen, and Professor Lianrong Wang. This study by Xuan Zou et al. was published in Nature Communications in August 2022 and featured in the Nature Communications Editors’ Highlights. The collaboration between the School of Pharmaceutical Sciences at Wuhan University, the First Affiliated Hospital of Shenzhen University, and the KAIST Department of Chemical and Biomolecular Engineering has made an important advance for the metabolic engineering and fermentation industry, as it addresses the serious problem of fermentation failure caused by phage infection. Systems metabolic engineering is a highly interdisciplinary field that has made it possible to develop microbial cell factories producing various bioproducts, including chemicals, fuels, and materials, in a sustainable and environmentally friendly way, mitigating the impact of worldwide resource depletion and climate change. Escherichia coli is one of the most important chassis microbial strains, given its wide application in the bio-based production of a diverse range of chemicals and materials. With the development of tools and strategies for systems metabolic engineering using E. coli, a highly optimized and well-characterized cell factory will play a crucial role in converting cheap and readily available raw materials into products of great economic and industrial value. However, the persistent problem of phage contamination in fermentation has a devastating impact on host cells and threatens the productivity of bacterial bioprocesses in biotechnology facilities, which can lead to widespread fermentation failure and immeasurable economic loss.
Host-controlled defense systems can be developed into effective genetic engineering solutions to address bacteriophage contamination in industrial-scale fermentation; however, most resistance mechanisms restrict only a narrow range of phages, so their effect on phage contamination is limited. There have been attempts to develop diverse systems for environmental adaptation or antiviral defense. The team’s collaborative efforts produced a new type II single-stranded DNA phosphorothioation (Ssp) defense system derived from E. coli 3234/A, which can be used in multiple industrial E. coli strains (e.g., E. coli K-12, B, and W) to provide broad protection against various types of dsDNA coliphages. Furthermore, the team developed a systematic genome engineering strategy involving the simultaneous genomic integration of the Ssp defense module and mutations in components essential to the phage life cycle. This strategy can transform E. coli hosts that are highly susceptible to phage attack into strains with powerful restriction effects on the tested bacteriophages. It endows hosts with strong resistance against a wide spectrum of phage infections without affecting bacterial growth or normal physiological function. More importantly, the resulting engineered phage-resistant strains maintained their capability to produce the desired chemicals and recombinant proteins even under high levels of phage cocktail challenge, which provides crucial protection against phage attacks. This is a major step forward, as it provides a systematic solution for engineering phage-resistant bacterial strains, especially industrial bioproduction strains, to protect cells from a wide range of bacteriophages. Considering the functionality of this engineering strategy in diverse E. coli strains, the strategy reported in this study can be widely extended to other bacterial species and industrial applications, and will be of great interest to researchers in academia and industry alike.

Fig. A schematic model of the systematic strategy for engineering phage-sensitive industrial E. coli strains into strains with broad antiphage activities. Through the simultaneous genomic integration of a DNA phosphorothioation-based Ssp defense module and mutations of components essential for the phage life cycle, the engineered E. coli strains show strong resistance against the diverse phages tested and maintain the capability to produce example recombinant proteins, even under high levels of phage cocktail challenge.
KAIST Research Team Proves How a Neurotransmitter May Be the Key to Controlling Alzheimer’s Toxicity
With nearly 50 million dementia patients worldwide, Alzheimer’s disease is the most common neurodegenerative disease. Its main symptom is the impairment of general cognitive abilities, including the ability to speak or to remember. The importance of finding a cure is widely understood as the population ages and life expectancy is ever-extended. However, even the cause of the disease has yet to be clearly defined. A KAIST research team in the Department of Chemistry led by Professor Mi Hee Lim discovered a new role for somatostatin, a peptide neurotransmitter, in reducing the toxicity arising in the pathogenic mechanisms that lead to Alzheimer’s disease. The study was published in the July issue of Nature Chemistry under the title “Conformational and functional changes of the native neuropeptide somatostatin occur in the presence of copper and amyloid-β”.

According to the amyloid hypothesis, the abnormal deposition of Aβ proteins causes the death of neuronal cells. While Aβ aggregates make up most of the aged plaques through fibrosis, recent studies have found high concentrations of transition metals in the plaques of Alzheimer’s patients. This suggests a close interaction between metal ions and Aβ that accelerates the fibrosis of the proteins. Copper in particular is a redox-active transition metal that can produce large amounts of reactive oxygen species and cause serious oxidative stress on cell organelles. Aβ proteins and transition metals can closely interact with neurotransmitters at synapses, but the direct effects of such abnormalities on the structure and function of neurotransmitters are yet to be understood.

Figure 1. Functional shift of somatostatin (SST) by factors in the pathogenesis of Alzheimer's disease.

Figure 2. Somatostatin’s loss of function as a neurotransmitter. a. Schematic diagram of SST auto-aggregation due to Alzheimer's pathological factors. b. SST’s aggregation by copper ions. c. Predicted coordination structure and N-terminal folding of copper-SST. d. Inhibition of SST receptor binding specificity by metals.

In their research, Professor Lim’s team discovered that when somatostatin encounters copper, Aβ, and metal-Aβ complexes, it self-aggregates and ceases to perform its innate function of transmitting neural signals, but begins to attenuate the toxicity and aggregation of the metal-Aβ complexes.

Figure 3. Gain of function of somatostatin (SST) in the dementia setting. a. Predicted docking of SST and amyloid-β. b. SST turning metal-amyloid-β aggregates into an amorphous form. c. Cytotoxicity-mitigating effect of SST. d. SST mitigating the interaction between amyloid-β and the cell membrane.

This research, led by Dr. Jiyeon Han et al. from the KAIST Department of Chemistry, revealed the coordination structure between copper and somatostatin at the molecular level, suggested an aggregation mechanism, and uncovered the effects of somatostatin on the Aβ aggregation pathway depending on the presence or absence of metals. The team also confirmed somatostatin’s receptor binding, its interactions with cell membranes, and its effects on cell toxicity for the first time, drawing international attention. Professor Mi Hee Lim said, “This research has great significance in having discovered a new role of neurotransmitters in the pathogenesis of Alzheimer’s disease.” “We expect this research to contribute to defining the pathogenic network of neurodegenerative diseases caused by aging, and to the development of future biomarkers and medicines,” she added. This research was conducted jointly with Professor Seung-Hee Lee’s team of the KAIST Department of Biological Sciences, Professor Kiyoung Park’s team of the KAIST Department of Chemistry, and Professor Yulong Li’s team at Peking University.
The research was funded by Basic Science Research Program of the National Research Foundation of Korea and KAIST. For more information about the research team, visit the website: https://sites.google.com/site/miheelimlab/1-professor-mi-hee-lim.
A System for Stable Simultaneous Communication among Thousands of IoT Devices
The mmWave backscatter system developed by a team led by Professor Song Min Kim is exciting news for the IoT market, as it can provide fast and stable connectivity even for a massive network, which could finally allow IoT devices to reach their full potential. A research team led by Professor Song Min Kim of the KAIST School of Electrical Engineering developed a system that can support concurrent communication for tens of millions of IoT devices by backscattering millimeter waves (mmWave). With their mmWave backscatter method, the research team built a design enabling simultaneous signal demodulation in a complex communication environment where tens of thousands of IoT devices are arranged indoors. The wide frequency range of mmWave exceeds 10 GHz, which provides great scalability. In addition, backscattering reflects radiated signals instead of wirelessly creating its own, which allows operation at ultra-low power. Therefore, the mmWave backscatter system offers internet connectivity on a mass scale to IoT devices at a low installation cost. This research by Kangmin Bae et al. was presented at ACM MobiSys 2022. At this world-renowned conference for mobile systems, the research won the Best Paper Award under the title “OmniScatter: Extreme Sensitivity mmWave Backscattering Using Commodity FMCW Radar”. It is meaningful that members of the KAIST School of Electrical Engineering have won the Best Paper Award at ACM MobiSys for two consecutive years; last year was the first time the award had been presented to an institution from Asia. IoT, a core component of 5G/6G networks, is showing exponential growth and is expected to reach a trillion devices by 2035. To support the connection of IoT devices on this scale, 5G and 6G aim to support ten times and 100 times the network density of 4G, respectively. As a result, the importance of practical systems for large-scale communication has grown.
mmWave is a next-generation communication technology that can be incorporated into 5G/6G standards, as it utilizes carrier waves at frequencies between 30 and 300 GHz. However, due to signal attenuation at high frequencies and reflection loss, previous mmWave backscatter systems enabled communication only in limited environments. In other words, they could not operate in complex environments where various obstacles and reflectors are present, which limited the large-scale connection of IoT devices that requires relatively free placement. The research team found the solution in the high coding gain of FMCW (frequency-modulated continuous-wave) radar. The team developed a signal processing method that can fundamentally separate backscatter signals from ambient noise while maintaining the coding gain of the radar. They achieved a receiver sensitivity over 100 thousand times that of previously reported FMCW radars, which can support communication in practical environments. Additionally, exploiting the radar’s property that the frequency of the demodulated signal changes depending on the physical location of a tag, the team designed a system that passively assigns channels to the tags. This lets the ultra-low-power backscatter communication system take full advantage of the frequency range at 10 GHz and above. The developed system can use the radars of existing commercial products as gateways, making it easily compatible. In addition, since the backscatter system works at ultra-low power levels of 10 uW or below, it can operate for over 40 years on a single button cell and drastically reduce installation and maintenance costs. The research team confirmed that mmWave backscatter devices arranged randomly in an office with various obstacles and reflectors could communicate effectively. The team then took things one step further and conducted a successful trace-driven evaluation in which they simultaneously received information sent by 1,100 devices.
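The passive channel assignment described above follows directly from FMCW ranging: the radar mixes its transmitted chirp with each echo, and the resulting beat frequency is proportional to the reflector's distance, so tags at different positions naturally occupy different frequency bins. A minimal sketch of that relationship (the chirp slope and distances are illustrative values, not OmniScatter's parameters):

```python
def beat_frequency(distance_m, chirp_slope_hz_per_s, c=3e8):
    """Beat frequency an FMCW radar observes for a reflector at a
    given distance: f_b = slope * (2 * d / c), i.e. the chirp slope
    times the round-trip delay. Because distance maps to frequency,
    spatially separated backscatter tags land in distinct frequency
    bins -- a passive form of channelization."""
    return chirp_slope_hz_per_s * (2.0 * distance_m / c)

# Two tags 1 m apart, with an (assumed) 30 MHz/us chirp slope,
# produce beat frequencies 200 kHz apart -- easily separable bins.
slope = 30e6 * 1e6          # 30 MHz per microsecond, in Hz/s
f_tag1 = beat_frequency(3.0, slope)   # tag at 3 m
f_tag2 = beat_frequency(4.0, slope)   # tag at 4 m
```

This is why no coordination protocol is needed to separate concurrent tags: the geometry of the deployment does the channel assignment for free, which is what enables the 1,100-device concurrent evaluation described below.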
Their research presents connectivity that greatly exceeds the network density required by next-generation communication standards like 5G and 6G. The system is expected to become a stepping stone for the hyper-connected future to come. Professor Kim said, “mmWave backscatter is the technology we’ve dreamt of. The mass scalability and ultra-low power at which it can operate IoT devices is unmatched by any existing technology.” He added, “We look forward to this system being actively utilized to enable the wide availability of IoT in the hyper-connected generation to come.” To demonstrate the massive connectivity of the system, a trace-driven evaluation of 1,100 concurrent tag transmissions was made. The figure shows the demodulation results of all 1,100 tags as red triangles, which successfully communicate without collision. This work was supported by the Samsung Research Funding & Incubation Center of Samsung Electronics and by the ITRC (Information Technology Research Center) support program supervised by the IITP (Institute of Information & Communications Technology Planning & Evaluation).

Profile:
Song Min Kim, Ph.D.
Professor
songmin@kaist.ac.kr
https://smile.kaist.ac.kr
SMILE Lab., School of Electrical Engineering
CXL-Based Memory Disaggregation Technology Opens Up a New Direction for Big Data Solution Frameworks
A KAIST team’s compute express link (CXL) solution provides new insights into memory disaggregation and ensures direct access and high-performance capabilities. A team from the Computer Architecture and Memory Systems Laboratory (CAMEL) at KAIST presented a new compute express link (CXL) solution whose directly accessible, high-performance memory disaggregation opens new directions for big data memory processing. Professor Myoungsoo Jung said the team’s technology significantly improves performance compared to existing remote direct memory access (RDMA)-based memory disaggregation. CXL is a new peripheral component interconnect express (PCIe)-based dynamic multi-protocol made for efficiently utilizing memory devices and accelerators. Many enterprise data centers and memory vendors are paying attention to it as the next-generation multi-protocol for the era of big data. Emerging big data applications such as machine learning, graph analytics, and in-memory databases require large memory capacities. However, scaling out memory capacity via a prior memory interface like double data rate (DDR) is limited by the number of central processing units (CPUs) and memory controllers. Therefore, memory disaggregation, which allows connecting a host to another host’s memory or to memory nodes, has appeared. RDMA is a way for a host to directly access another host’s memory via InfiniBand, the network protocol commonly used in data centers. Nowadays, most existing memory disaggregation technologies employ RDMA to get a large memory capacity: a host can share another host’s memory by transferring data between local and remote memory. Although RDMA-based memory disaggregation provides a large memory capacity to a host, two critical problems exist. First, scaling out the memory still requires adding an extra CPU, since passive memory such as dynamic random-access memory (DRAM) cannot operate by itself and must be controlled by a CPU.
Second, redundant data copies and software fabric interventions in RDMA-based memory disaggregation cause longer access latency. For example, remote memory access latency in RDMA-based memory disaggregation is multiple orders of magnitude longer than local memory access. To address these issues, Professor Jung’s team developed a CXL-based memory disaggregation framework, including CXL-enabled customized CPUs, CXL devices, CXL switches, and CXL-aware operating system modules. The team’s CXL device is a purely passive, directly accessible memory node that contains multiple DRAM dual inline memory modules (DIMMs) and a CXL memory controller. Since the CXL memory controller manages the memory in the CXL device, a host can utilize the memory node without processor or software intervention. The team’s CXL switch enables scaling out a host’s memory capacity by hierarchically connecting multiple CXL devices to the switch, supporting hundreds of devices or more. Atop the switches and devices, the team’s CXL-enabled operating system removes the redundant data copies and protocol conversion exhibited by conventional RDMA, which significantly decreases access latency to the memory nodes. In a test comparing loads of 64 B (cacheline) data from memory pooling devices, CXL-based memory disaggregation showed 8.2 times higher data load performance than RDMA-based memory disaggregation, and even performance similar to local DRAM memory. In the team’s evaluations on big data benchmarks such as a machine learning-based test, the CXL-based memory disaggregation technology also showed up to 3.7 times higher performance than prior RDMA-based memory disaggregation technologies. “Escaping from conventional RDMA-based memory disaggregation, our CXL-based memory disaggregation framework can provide high scalability and performance for diverse datacenters and cloud service infrastructures,” said Professor Jung.
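The capacity-scaling argument above is simple arithmetic: each level of CXL switching multiplies the number of reachable passive memory devices, and every device contributes its DIMMs to one flat, directly loadable address space. A rough illustration (the port counts and DIMM sizes below are assumptions for illustration, not the configuration of the team's prototype):

```python
def pooled_capacity_gib(fanout, levels, dimms_per_device, dimm_gib):
    """Memory a host can reach through a hierarchy of CXL switches.

    Each switch fans out to `fanout` downstream ports; after `levels`
    of switching, every leaf is a passive CXL memory device holding
    `dimms_per_device` DRAM DIMMs of `dimm_gib` GiB each. Returns
    (device count, total capacity in GiB). All parameter values are
    illustrative.
    """
    devices = fanout ** levels
    return devices, devices * dimms_per_device * dimm_gib

# e.g. 16-port switches in two levels reach 256 devices; with
# 4 x 16 GiB DIMMs per device the host sees 16384 GiB (16 TiB),
# with no extra CPUs added for the expansion.
devices, capacity_gib = pooled_capacity_gib(16, 2, 4, 16)
```

The contrast with RDMA is that growing this pool adds only passive devices and switches, whereas RDMA-based disaggregation grows capacity by adding whole hosts, each with its own CPU.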
He went on to stress, “Our CXL-based memory disaggregation research will bring about a new paradigm for memory solutions that will lead the era of big data.”

-Profile:
Professor Myoungsoo Jung
Computer Architecture and Memory Systems Laboratory (CAMEL)
http://camelab.org
School of Electrical Engineering
KAIST
Scientists Discover How the Circadian Rhythm Can Be Both Strong and Flexible
Study reveals that master and slave oscillators function via different molecular mechanisms. From tiny fruit flies to human beings, all animals on Earth maintain their daily rhythms based on their internal circadian clocks. The circadian clock enables organisms to undergo rhythmic changes in behavior and physiology over a 24-hour cycle. For example, our own biological clock tells our brain to release melatonin, a sleep-inducing hormone, at night. The discovery of the molecular mechanism of the circadian clock was awarded the Nobel Prize in Physiology or Medicine in 2017. From what we know, no single centralized clock is responsible for our circadian cycles. Instead, the clock operates as a hierarchical network of “master pacemakers” and “slave oscillators”. The master pacemaker receives various input signals from the environment, such as light, and drives the slave oscillators that regulate various outputs such as sleep, feeding, and metabolism. Despite the different roles of these pacemaker neurons, they are thought to share common molecular mechanisms that are well conserved across lifeforms. For example, interlocked systems of multiple transcriptional-translational feedback loops (TTFLs) composed of core clock proteins have been studied in depth in fruit flies. However, there is still much that we need to learn about our own biological clock. The hierarchically organized nature of master and slave clock neurons has led to a prevailing belief that they share an identical molecular clockwork. At the same time, the different roles they serve in regulating bodily rhythms raise the question of whether they might function under different molecular clockworks. A research team led by Professor Jae Kyoung Kim from the Department of Mathematical Sciences, a chief investigator at the Biomedical Mathematics Group of the Institute for Basic Science, used a combination of mathematical and experimental approaches in fruit flies to answer this question.
The team found that the master clock and the slave clock operate via different molecular mechanisms. In both the master and slave clock neurons of fruit flies, a circadian rhythm-related protein called PER is produced and degraded at different rates depending on the time of day. Previously, the team found that the master clock neurons (sLNvs) and the slave clock neurons (DN1ps) have different PER profiles in wild-type and Clk-Δ mutant Drosophila. This hinted that there might be a difference in the molecular clockworks of the master and slave clock neurons. However, due to the complexity of the molecular clockwork, it was challenging to identify the source of such differences. Thus, the team developed a mathematical model describing the molecular clockworks of the master and slave clocks. All possible molecular differences between the master and slave clock neurons were then systematically investigated using computer simulations. The model predicted that PER is produced more efficiently and then rapidly degraded in the master clock neurons compared to the slave clock neurons. This prediction was then confirmed by follow-up animal experiments. Why, then, do the master clock neurons have such different molecular properties from the slave clock neurons? To answer this question, the research team again combined mathematical model simulations and experiments. They found that the faster synthesis of PER in the master clock neurons allows them to generate synchronized rhythms with a high amplitude. Generating such a strong rhythm with high amplitude is critical for delivering clear signals to the slave clock neurons. However, such strong rhythms would typically be unfavorable when adapting to environmental changes, ranging from natural ones such as the differing daylight hours of summer and winter to more extreme artificial ones such as the jet lag that occurs after international travel.
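The modeling idea, that faster PER synthesis and degradation in the master clock yields a larger-amplitude rhythm than the slave clock's slower kinetics, can be caricatured with a one-variable toy model. This is only a sketch under simplified assumptions: the study's actual model is a detailed multi-variable TTFL model, and the rate constants below are illustrative, not fitted values.

```python
import math

def per_trajectory(synthesis, degradation, hours=48.0, dt=0.01):
    """Toy one-variable clock, integrated with forward Euler.

    PER is produced at a rhythmically driven (24 h period) rate
    scaled by `synthesis` and removed in proportion to
    `degradation`:  dPER/dt = s*(1 + cos(2*pi*t/24)) - d*PER.
    Returns the sampled PER trajectory.
    """
    per, traj = 0.0, []
    for i in range(int(hours / dt)):
        t = i * dt
        drive = synthesis * (1.0 + math.cos(2.0 * math.pi * t / 24.0))
        per += (drive - degradation * per) * dt
        traj.append(per)
    return traj

# Fast synthesis AND fast degradation (master-like) give a larger
# peak-to-trough swing than slow kinetics (slave-like), even though
# both settle at the same mean level (synthesis/degradation = 4).
master = per_trajectory(synthesis=4.0, degradation=1.0)
slave = per_trajectory(synthesis=1.0, degradation=0.25)
```

The qualitative point matches the article: slow degradation low-pass-filters the rhythmic drive, damping the oscillation, whereas fast turnover lets PER track the drive and swing with high amplitude.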
Thanks to their distinct molecular properties, the master clock neurons can undergo phase dispersion when the standard light-dark cycle is disrupted, drastically reducing the level of PER. The master clock neurons can then easily adapt to a new diurnal cycle. Our master pacemaker’s plasticity explains how we can quickly adjust to new time zones after international flights, after just a brief period of jet lag. It is hoped that the findings of this study will have clinical implications for treating various disorders that affect our circadian rhythm. Professor Kim notes, “When the circadian clock loses its robustness and flexibility, circadian rhythm sleep disorders can occur. As this study identifies the molecular mechanisms that generate the robustness and flexibility of the circadian clock, it can facilitate the identification of the causes of and treatment strategies for circadian rhythm sleep disorders.” This work was supported by the Human Frontier Science Program.

-Publication
Eui Min Jeong, Miri Kwon, Eunjoo Cho, Sang Hyuk Lee, Hyun Kim, Eun Young Kim, and Jae Kyoung Kim, “Systematic modeling-driven experiments identify distinct molecular clockworks underlying hierarchically organized pacemaker neurons,” February 22, 2022, Proceedings of the National Academy of Sciences of the United States of America

-Profile
Professor Jae Kyoung Kim
Department of Mathematical Sciences
KAIST
Team KAIST Makes Its Presence Felt in the Self-Driving Tech Industry
Team KAIST finishes 4th at the inaugural CES Autonomous Racing Competition. Team KAIST, led by Professor Hyunchul Shim and his Unmanned Systems Research Group (USRG), placed fourth in an autonomous race car competition in Las Vegas last week, making its presence felt in the self-driving automotive tech industry. Team KAIST beat its first competitor, Auburn University, with speeds of up to 131 mph at the Autonomous Challenge at CES, held at the Las Vegas Motor Speedway. However, the team failed to advance to the final round when it lost to PoliMOVE, a team comprised of the Polytechnic University of Milan and the University of Alabama that went on to win the $150,000 race. A total of eight teams competed in the self-driving race. The race was conducted as a single-elimination tournament consisting of multiple rounds of matches. Two cars took turns playing the roles of defender and attacker, and each car attempted to outpace the other until one of them was unable to complete the mission. Each team designed the algorithms to control its racecar, the Dallara-built AV-21, which can reach speeds of up to 173 mph, and make it drive safely around the track at high speed without crashing into the other car. The event was the CES version of the Indy Autonomous Challenge, a competition that took place for the first time in October last year to encourage university students from around the world to develop complex software for autonomous driving and advance the relevant technologies. Team KAIST placed 4th at the Indy Autonomous Challenge, which qualified it to participate in this race. “The technical level of the CES race was much higher than last October’s and we had a very tough race. We advanced to the semifinals for two consecutive races. I think our autonomous vehicle technology is proving itself to the world,” said Professor Shim. Professor Shim’s research group has been working on the development of autonomous aerial and ground vehicles for the past 12 years.
A self-driving car developed by the lab was certified by the South Korean government to run on public roads. The vehicle the team used cost more than 1 million USD to build. Many of the other teams had to repair their vehicles more than once due to accidents and had to spend a lot on repairs. “We are the only ones who did not have any accidents, and this is a testament to our technological prowess,” said Professor Shim. He said that funding to purchase pricey parts and equipment for the racecar is always a challenge given the very tight research budget and the absence of corporate sponsorships. Nevertheless, Professor Shim and his research group plan to participate in the next race in September and in the 2023 CES race. “I think we need more systematic and proactive research and support systems to earn better results, but there is nothing better than the group of passionate students who are taking part in this project with us,” Shim added.
AI Weather Forecasting Research Center Opens
The Kim Jaechul Graduate School of AI, in collaboration with the National Institute of Meteorological Sciences (NIMS) under the Korea Meteorological Administration, launched the AI Weather Forecasting Research Center last month. The KAIST AI Weather Forecasting Research Center, headed by Professor Seyoung Yoon, was established with funding from the AlphaWeather Development Research Project of the National Institute of Meteorological Sciences, with KAIST selected as the project facilitator. AlphaWeather is an AI system that analyzes approximately 150,000 pieces of weather information per hour to help weather forecasters produce accurate forecasts. The research center is composed of three research teams with the following goals: (a) develop AI technology for precipitation nowcasting, (b) develop AI technology for accelerating physical process-based numerical models, and (c) develop AI technology for supporting weather forecasters. The teams consist of 15 staff members from NIMS and 61 researchers from the Kim Jaechul Graduate School of AI at KAIST. The research center is developing an AI algorithm for precipitation nowcasting (with up to six hours of lead time) that uses satellite images, radar reflectivity, and data collected from weather stations. It is also developing an AI algorithm for correcting biases in the prediction results of multiple numerical models. Finally, it is developing AI technology that supports weather forecasters by standardizing and automating repetitive manual processes. After verification, the results will be used by the Korea Meteorological Administration as the intelligence engine of its next-generation forecasting/special-reporting system from 2026.
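The bias-correction task mentioned above can be illustrated with a toy sketch. This is not the center's actual algorithm; it only shows the simplest form of the idea, assuming a purely additive bias: learn a model's mean historical error against observations, then subtract it from new forecasts. All function names and numbers are invented for illustration.

```python
# Minimal additive bias correction for a numerical model's output
# (illustrative only; real systems use far richer statistical or ML methods).

def fit_bias(forecasts, observations):
    """Mean signed error of a model's past forecasts vs. observations."""
    errors = [f - o for f, o in zip(forecasts, observations)]
    return sum(errors) / len(errors)

def correct(forecast, bias):
    """Apply the learned additive correction to a new forecast."""
    return forecast - bias

# Example: a model that consistently runs about 2 degrees warm.
past_forecasts = [21.0, 25.0, 19.0, 23.0]
past_obs = [19.0, 23.0, 17.0, 21.0]
bias = fit_bias(past_forecasts, past_obs)   # 2.0
print(correct(24.0, bias))                  # corrected forecast: 22.0
```

In practice, correcting biases across multiple numerical models also involves combining their outputs, but the learn-then-subtract step above is the conceptual core.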
Team KAIST to Race at CES 2022 Autonomous Challenge
Five top university autonomous racing teams will compete in a head-to-head passing competition in Las Vegas A self-driving racing team from the KAIST Unmanned Systems Research Group (USRG), advised by Professor Hyunchul Shim, will compete at the Autonomous Challenge at the Consumer Electronics Show (CES) on January 7, 2022. The head-to-head, high-speed autonomous racecar passing competition at the Las Vegas Motor Speedway will feature the finalists and semifinalists from the Indy Autonomous Challenge held in October of this year. Team KAIST qualified as a semifinalist at the Indy Autonomous Challenge and will join four other university teams, including the winner of the competition, Technische Universität München. Team KAIST’s AV-21 vehicle, capable of driving on its own at more than 200 km/h, is expected to reach speeds of more than 300 km/h at the race. The participating teams are:
1. KAIST
2. EuroRacing: University of Modena and Reggio Emilia (Italy), University of Pisa (Italy), ETH Zürich (Switzerland), Polish Academy of Sciences (Poland)
3. MIT-PITT-RW: Massachusetts Institute of Technology, University of Pittsburgh, Rochester Institute of Technology, University of Waterloo (Canada)
4. PoliMOVE: Politecnico di Milano (Italy), University of Alabama
5. TUM Autonomous Motorsport: Technische Universität München (Germany)
Professor Shim’s team is dedicated to the development and validation of cutting-edge technologies for highly autonomous vehicles. In recognition of his pioneering research in unmanned system technologies, Professor Shim was honored with the Grand Prize of the Minister of Science and ICT on December 9. “We began autonomous vehicle research in 2009 when we signed up for Hyundai Motor Company’s Autonomous Driving Challenge. For this, we developed a complete set of in-house technologies such as low-level vehicle control, perception, localization, and decision making.” In 2019, the team came in third place in the Challenge, and they finally won this year.
For years, his team has participated in many unmanned systems challenges at home and abroad, gaining recognition around the world. The team won the inaugural IROS Autonomous Drone Racing Competition in 2016 and placed second in the 2018 edition. They also competed in the 2017 MBZIRC, ranking fourth in Missions 2 and 3 and fifth in the Grand Challenge. Most recently, the team won the first round of Lockheed Martin’s AlphaPilot AI Drone Innovation Challenge. The team is now participating in the DARPA Subterranean Challenge as a member of Team CoSTAR with NASA JPL, MIT, and Caltech. “We have accumulated plenty of first-hand experience developing autonomous vehicles with the support of domestic companies such as Hyundai Motor Company, Samsung, LG, and NAVER. In 2017, the autonomous vehicle platform “EureCar” that we developed in-house was authorized by the Korean government to lawfully conduct autonomous driving experiments on public roads,” said Professor Shim. The team has developed various key technologies and algorithms related to unmanned systems that can be categorized into three major components: perception, planning, and control. Considering the characteristics of the algorithms that make up each module, their technology operates on a distributed computing system. Since 2015, the team has been actively using deep learning algorithms in its perception subsystem. Contextual information extracted from multi-modal sensory data gathered via cameras, lidar, radar, GPS, IMU, and other sensors is forwarded to the planning subsystem. The planning module is responsible for the decision making and planning required for autonomous driving, such as lane-change determination, trajectory planning, emergency stops, and velocity command generation. The results from the planner are fed into the controller, which follows the planned high-level command.
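As a rough illustration of the perception-planning-control decomposition described above (this is not the team's software; every function name, threshold, and gain below is invented), the three stages can be sketched as plain functions whose outputs feed one into the next:

```python
# Toy perception -> planning -> control loop. Real systems run these
# stages as distributed processes over fused camera/lidar/radar/GPS/IMU
# data; here each stage is a simple function for clarity.

def perceive(sensor_frame):
    """Reduce a raw sensor frame to contextual state (toy fusion)."""
    return {
        "obstacle_ahead": sensor_frame["lidar_range_m"] < 30.0,
        "speed_mps": sensor_frame["speed_mps"],
    }

def plan(state, target_speed_mps=80.0):
    """Decide a high-level command: emergency stop or a velocity target."""
    if state["obstacle_ahead"]:
        return {"action": "emergency_stop", "target_speed_mps": 0.0}
    return {"action": "cruise", "target_speed_mps": target_speed_mps}

def control(state, command, gain=0.5):
    """Proportional throttle toward the planner's target, clamped to [-1, 1]."""
    error = command["target_speed_mps"] - state["speed_mps"]
    return max(-1.0, min(1.0, gain * error / 10.0))

frame = {"lidar_range_m": 120.0, "speed_mps": 60.0}
state = perceive(frame)
command = plan(state)
print(control(state, command))  # positive throttle: accelerate toward target
```

The separation shown here is what allows each module to evolve independently, e.g. swapping a hand-written perception stage for a deep learning one without touching the planner or controller.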
The team has also developed and verified the possibility of an end-to-end deep learning based autonomous driving approach that replaces a complex system with one single AI network.
Scientists Develop Wireless Networks that Allow Brain Circuits to Be Controlled Remotely through the Internet
Wireless implantable devices and IoT could manipulate the brains of animals from anywhere around the world due to their minimalistic hardware, low setup cost, ease of use, and customizable versatility A new study shows that researchers can remotely control the brain circuits of numerous animals simultaneously and independently through the internet. The scientists believe this newly developed technology can speed up brain research and various neuroscience studies to uncover basic brain functions as well as the underpinnings of various neuropsychiatric and neurological disorders. A multidisciplinary team of researchers at KAIST, Washington University in St. Louis, and the University of Colorado, Boulder, created a wireless ecosystem with its own wireless implantable devices and Internet of Things (IoT) infrastructure to enable high-throughput neuroscience experiments over the internet. This innovative technology could enable scientists to manipulate the brains of animals from anywhere around the world. The study was published in the journal Nature Biomedical Engineering on November 25. “This novel technology is highly versatile and adaptive. It can remotely control numerous neural implants and laboratory tools in real-time or in a scheduled way without direct human interactions,” said Professor Jae-Woong Jeong of the School of Electrical Engineering at KAIST and a senior author of the study. “These wireless neural devices and equipment integrated with IoT technology have enormous potential for science and medicine.” The wireless ecosystem only requires a mini-computer that can be purchased for under $45, which connects to the internet and communicates with wireless multifunctional brain probes or other types of conventional laboratory equipment using IoT control modules.
By optimally integrating the versatility and modular construction of both unique IoT hardware and software within a single ecosystem, this wireless technology offers new applications that have not been demonstrated before by a single standalone technology. These include, but are not limited to, minimalistic hardware, global remote access, selective and scheduled experiments, customizable automation, and high-throughput scalability. “As long as researchers have internet access, they are able to trigger, customize, stop, validate, and store the outcomes of large experiments at any time and from anywhere in the world. They can remotely perform large-scale neuroscience experiments in animals deployed in multiple countries,” said one of the lead authors, Dr. Raza Qazi, a researcher with KAIST and the University of Colorado, Boulder. “The low cost of this system allows it to be easily adopted and can further fuel innovation across many laboratories,” Dr. Qazi added. One of the significant advantages of this IoT neurotechnology is its ability to be mass deployed across the globe due to its minimalistic hardware, low setup cost, ease of use, and customizable versatility. Scientists across the world can quickly implement this technology within their existing laboratories with minimal budget concerns to achieve globally remote access, scalable experimental automation, or both, thus potentially reducing the time needed to unravel various neuroscientific challenges such as those associated with intractable neurological conditions. Another senior author on the study, Professor Jordan McCall from the Department of Anesthesiology and Center for Clinical Pharmacology at Washington University in St. Louis, said this technology has the potential to change how basic neuroscience studies are performed. “One of the biggest limitations when trying to understand how the mammalian brain works is that we have to study these functions in unnatural conditions.
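To make the idea of remotely scheduled experiments concrete, here is a minimal sketch of the kind of command queue an IoT control module might maintain. The class name, device IDs, and commands are all hypothetical (the published system's API is not shown here), and a real module would dispatch commands over the network rather than return them as a list:

```python
# Toy scheduler: queue (start_time, device, command) entries in a
# min-heap and dispatch every entry whose start time has arrived.

import heapq

class ExperimentScheduler:
    def __init__(self):
        self._queue = []  # min-heap ordered by scheduled start time

    def schedule(self, start_time, device_id, command):
        """Queue a command for a wireless device at a future time."""
        heapq.heappush(self._queue, (start_time, device_id, command))

    def run_due(self, now):
        """Pop and dispatch every command that is due at time `now`."""
        dispatched = []
        while self._queue and self._queue[0][0] <= now:
            _, device_id, command = heapq.heappop(self._queue)
            dispatched.append((device_id, command))  # real system: send over network
        return dispatched

sched = ExperimentScheduler()
sched.schedule(10, "probe-A", "led_on")
sched.schedule(5, "probe-B", "start_recording")
print(sched.run_due(now=7))  # only probe-B's command is due yet
```

Because the queue is data, the same loop can drive one implant or hundreds, which is the essence of the scheduled, high-throughput operation the authors describe.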
This technology brings us one step closer to performing important studies without direct human interaction with the study subjects.” The ability to remotely schedule experiments moves toward automating these types of experiments. Dr. Kyle Parker, an instructor at Washington University in St. Louis and another lead author on the study, added, “This experimental automation can potentially help us reduce the number of animals used in biomedical research by reducing the variability introduced by various experimenters. This is especially important given our moral imperative to seek research designs that enable this reduction.” The researchers believe this wireless technology may open new opportunities for many applications including brain research, pharmaceuticals, and telemedicine to treat diseases in the brain and other organs remotely. This remote automation technology could become even more valuable when many labs need to shut down, such as during the height of the COVID-19 pandemic. This work was supported by grants from the KAIST Global Singularity Research Program, the National Research Foundation of Korea, the United States National Institutes of Health, and Oak Ridge Associated Universities.
-Publication
Raza Qazi, Kyle Parker, Choong Yeon Kim, Jordan McCall, Jae-Woong Jeong et al. “Scalable and modular wireless-network infrastructure for large-scale behavioral neuroscience,” Nature Biomedical Engineering, November 25, 2021 (doi.org/10.1038/s41551-021-00814-w)
-Profile
Professor Jae-Woong Jeong
Bio-Integrated Electronics and Systems Lab
School of Electrical Engineering
KAIST
Industrial Liaison Program to Provide Comprehensive Consultation Services
The ILP’s one-stop solutions target all industrial sectors including conglomerates, small and medium-sized enterprises, venture companies, venture capital (VC) firms, and government-affiliated organizations. The Industrial Liaison Center at KAIST launched the Industrial Liaison Program (ILP) on September 28, an industry-academic cooperation project to provide comprehensive solutions to industry partners. The Industrial Liaison Center will recruit member companies for this service every year, targeting all industrial sectors including conglomerates, small and medium-sized enterprises, venture companies, venture capital (VC) firms, and government-affiliated organizations. The program plans to build a one-stop support system that can systematically share and use excellent resource information from KAIST’s research teams, R&D achievements, and infrastructure to provide member companies with much-needed services. More than 40 KAIST professors with abundant academic-industrial collaboration experience will participate in the program. Experts from various fields with different points of view and experiences will jointly provide solutions to ILP member companies. To actively participate in academic-industrial liaisons and joint consultations, KAIST assigned 10 professors from related fields as program directors. 
The program directors will come from four different fields: AI/robots (Professor Alice Oh from the School of Computing, Professor Young Jae Jang from the Department of Industrial & Systems Engineering, and Professor Yong-Hwa Park from the Department of Mechanical Engineering), bio/medicine (Professor Daesoo Kim from the Department of Biological Sciences and Professor YongKeun Park from the Department of Physics), materials/electronics (Professor Sang Ouk Kim from the Department of Materials Science and Engineering and Professors Jun-Bo Yoon and Seonghwan Cho from the School of Electrical Engineering), and environment/energy (Professor Hee-Tak Kim from the Department of Chemical and Biomolecular Engineering and Professor Hoon Sohn from the Department of Civil and Environmental Engineering). The transdisciplinary board of consulting professors that will lead technology innovation is composed of 30 professors, including Professor Min-Soo Kim (School of Computing, AI), Professor Chan Hyuk Kim (Department of Biological Sciences, medicine), Professor Hae-Won Park (Department of Mechanical Engineering, robots), Professor Changho Suh (School of Electrical Engineering, electronics), Professor Haeshin Lee (Department of Chemistry, bio), Professor Il-Doo Kim (Department of Materials Science and Engineering, materials), Professor HyeJin Kim (School of Business Technology and Management), and Professor Byoung Pil Kim (School of Business Technology and Management, technology law). The Head of the Industrial Liaison Center, Professor Keon Jae Lee, who is also in charge of the program, said, “In a science and technology-oriented era where technological supremacy determines national power, it is indispensable to build a new platform upon which innovative academic-industrial cooperation can be pushed forward in the fields of joint consultation, the development of academic-industrial projects, and the foundation of new industries.”
He added, “KAIST professors carry out world-class research in many different fields, and faculty members can come together through the ILP to communicate with representatives from industry to improve their corporations’ global competitiveness and further contribute to our nation’s interests by cultivating strong small enterprises.”
A Mechanism Underlying Most Common Cause of Epileptic Seizures Revealed
An interdisciplinary study shows that neurons carrying somatic mutations in MTOR can lead to focal epileptogenesis via non-cell-autonomous hyperexcitability of nearby nonmutated neurons During fetal development, cells must migrate to the outer edge of the brain to form critical connections for information transfer and regulation in the body. When even a few cells fail to move to the correct location, the neurons become disorganized, and this results in focal cortical dysplasia. This condition is the most common cause of seizures that cannot be controlled with medication in children and the second most common cause in adults. Now, an interdisciplinary team studying neurogenetics, neural networks, and neurophysiology at KAIST has revealed how dysfunctions in even a small percentage of cells can cause disorder across the entire brain. They published their results on June 28 in Annals of Neurology. The work builds on a previous finding, also by KAIST scientists, that focal cortical dysplasia is caused by mutations in cells involved in mTOR, a pathway that regulates signaling between neurons in the brain. “Only 1 to 2% of neurons carrying mutations in the mTOR signaling pathway that regulates cell signaling in the brain have been found to induce seizures in animal models of focal cortical dysplasia,” said Professor Jong-Woo Sohn from the Department of Biological Sciences. “The main challenge of this study was to explain how nearby non-mutated neurons become hyperexcitable.” Initially, the researchers hypothesized that the mutated cells affected the number of excitatory and inhibitory synapses in all neurons, mutated or not. These neural gates can trigger or halt activity, respectively, in other neurons. Seizures are a result of extreme activity, called hyperexcitability.
If the mutated cells upend the balance and result in more excitatory cells, the researchers thought, it made sense that the cells would be more susceptible to hyperexcitability and, as a result, seizures. “Contrary to our expectations, the synaptic input balance was not changed in either the mutated or non-mutated neurons,” said Professor Jeong Ho Lee from the Graduate School of Medical Science and Engineering. “We turned our attention to a protein overproduced by mutated neurons.” The protein is adenosine kinase, which lowers the concentration of adenosine. This naturally occurring compound is an anticonvulsant and works to relax vessels. In mice engineered to have focal cortical dysplasia, the researchers injected adenosine to replace the levels lowered by the protein. It worked, and the neurons became less excitable. “We demonstrated that augmentation of adenosine signaling could attenuate the excitability of non-mutated neurons,” said Professor Se-Bum Paik from the Department of Bio and Brain Engineering. The effect on the non-mutated neurons was the surprising part, according to Paik. “The seizure-triggering hyperexcitability originated not in the mutation-carrying neurons, but instead in the nearby non-mutated neurons,” he said. The mutated neurons secreted more adenosine kinase, reducing the adenosine levels in the local environment of all the cells. With less adenosine, the non-mutated neurons became hyperexcitable, leading to seizures. “While we need further investigation into the relationship between the concentration of adenosine and the increased excitation of nearby neurons, our results support the medical use of drugs that activate adenosine signaling as a possible treatment pathway for focal cortical dysplasia,” Professor Lee said. The Suh Kyungbae Foundation, the Korea Health Technology Research and Development Project, the Ministry of Health & Welfare, and the National Research Foundation of Korea funded this work.
-Publication
Koh, H.Y., Jang, J., Ju, S.H., Kim, R., Cho, G.-B., Kim, D.S., Sohn, J.-W., Paik, S.-B. and Lee, J.H. (2021), “Non-Cell Autonomous Epileptogenesis in Focal Cortical Dysplasia,” Annals of Neurology, 90: 285-299. (https://doi.org/10.1002/ana.26149)
-Profile
Professor Jeong Ho Lee
Translational Neurogenetics Lab (https://tnl.kaist.ac.kr/)
Graduate School of Medical Science and Engineering, KAIST

Professor Se-Bum Paik
Visual System and Neural Network Laboratory (http://vs.kaist.ac.kr/)
Department of Bio and Brain Engineering, KAIST

Professor Jong-Woo Sohn
Laboratory for Neurophysiology (https://sites.google.com/site/sohnlab2014/home)
Department of Biological Sciences, KAIST

Dr. Hyun Yong Koh
Translational Neurogenetics Lab
Graduate School of Medical Science and Engineering, KAIST

Dr. Jaeson Jang
Visual System and Neural Network Laboratory
Department of Bio and Brain Engineering, KAIST

Sang Hyeon Ju, M.D.
Laboratory for Neurophysiology
Department of Biological Sciences, KAIST
KAIST, 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Republic of Korea
Copyright(C) 2020, Korea Advanced Institute of Science and Technology,
All Rights Reserved.