Advanced NVMe Controller Technology for Next Generation Memory Devices
KAIST researchers have advanced non-volatile memory express (NVMe) controller technology for next-generation information storage devices, and made the new technology, named ‘OpenExpress,’ freely available to universities and research institutes around the world to help reduce research costs in related fields. NVMe is a communication protocol made for high-performance storage devices based on the Peripheral Component Interconnect Express (PCIe) interface. NVMe was developed to take the place of the Serial AT Attachment (SATA) protocol, which was designed to process data on hard disk drives (HDDs) and did not perform well with solid state drives (SSDs). Unlike HDDs, which use magnetic spinning disks, SSDs use semiconductor memory, allowing the rapid reading and writing of data. SSDs also generate less heat and noise, and are much more compact and lightweight. Since data processing in SSDs using NVMe is up to six times faster than with SATA, NVMe has become the standard protocol for ultra-high-speed, high-volume data processing, and is currently used in many flash-based information storage devices. Studies on NVMe continue at both the academic and industrial levels; however, its poor accessibility is a drawback. Major information and communications technology (ICT) companies around the world expend astronomical sums to procure the intellectual property (IP) for the hardware NVMe controllers necessary to use NVMe. Such IP is not publicly disclosed, however, making it difficult for universities and research institutes to use it for research purposes. Although a small number of U.S. Silicon Valley startups provide parts of their independently developed IP for research, the cost of usage is around 34,000 USD per month. The costs skyrocket even further because each copy of single-use source code purchased for IP modification costs approximately 84,000 USD.
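Much of NVMe's speed advantage over SATA comes from command parallelism: the AHCI interface used with SATA exposes a single queue of 32 commands, while NVMe allows up to roughly 64K queues with up to 64K commands each. A back-of-the-envelope sketch using Little's law illustrates how deeper queues raise the throughput ceiling (the 100-microsecond device latency here is an assumed, illustrative figure, not taken from the article):

```python
def max_iops(outstanding_commands, per_command_latency_s):
    """Little's law: sustained IOPS ceiling = concurrency / per-command latency."""
    return outstanding_commands / per_command_latency_s

LATENCY = 100e-6  # assumed 100-microsecond device latency (illustrative only)

sata_ahci = max_iops(32, LATENCY)    # AHCI/SATA: one queue, depth 32
nvme = max_iops(1024, LATENCY)       # NVMe, e.g. 16 queues x depth 64

print(f"SATA/AHCI ceiling: {sata_ahci:,.0f} IOPS")  # 320,000 IOPS
print(f"NVMe ceiling:      {nvme:,.0f} IOPS")       # 10,240,000 IOPS
```

Real devices saturate earlier due to flash and controller limits; the point is that NVMe's queue architecture removes the protocol-level ceiling that caps SATA.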
In order to address these issues, a group of researchers led by Professor Myoungsoo Jung from the School of Electrical Engineering at KAIST developed a next-generation NVMe controller technology that achieves parallel data input/output processing for SSDs in a fully hardware-automated form. The researchers presented their work at the 2020 USENIX Annual Technical Conference (USENIX ATC ’20) in July, and released it as an open research framework named ‘OpenExpress.’ The NVMe controller technology developed by Professor Jung’s team comprises a wide range of basic hardware IP and key NVMe IP cores. To examine its actual performance, the team made an NVMe hardware controller prototype using OpenExpress, and designed all the logic provided by OpenExpress to operate at high frequency. The field-programmable gate array (FPGA) memory card prototype developed using OpenExpress demonstrated increased input/output data processing capacity per second, supporting up to 7 gigabytes per second (GB/s) of bandwidth. This makes it suitable for research on ultra-high-speed, high-volume next-generation memory devices. In a test comparing various storage server loads on the devices, the team’s FPGA also showed 76% higher bandwidth and 68% lower input/output latency than Intel’s new high-performance SSD (Optane SSD), which is sufficient for many researchers studying systems employing future memory devices. Depending on user needs, silicon devices can be synthesized as well, which is expected to further enhance performance. The NVMe controller technology of Professor Jung’s team can be freely used and modified under the OpenExpress open-source end-user agreement for non-commercial use by all universities and research institutes. This makes it extremely useful for research on next-generation memory-compatible NVMe controllers and software stacks.
“With the product of this study being disclosed to the world, universities and research institutes can now use controllers that used to be exclusive to only the world’s biggest companies, at no cost,” said Professor Jung. He went on to stress, “This is a meaningful first step in research on information storage systems such as high-speed, high-volume next-generation memory.” This work was supported by a grant from MemRay, a company specializing in next-generation memory development and distribution. More details about the study can be found at http://camelab.org.
Image credit: Professor Myoungsoo Jung, KAIST
Image usage restrictions: News organizations may use or redistribute these figures and images, with proper attribution, as part of news coverage of this paper only.
Publication: Myoungsoo Jung. (2020). OpenExpress: Fully Hardware Automated Open Research Framework for Future Fast NVMe Devices. In the Proceedings of the 2020 USENIX Annual Technical Conference (USENIX ATC ’20). Available online at https://www.usenix.org/system/files/atc20-jung.pdf
Profile: Myoungsoo Jung, PhD
Associate Professor
firstname.lastname@example.org
http://camelab.org
Computer Architecture and Memory Systems Laboratory
School of Electrical Engineering
http://kaist.ac.kr
Korea Advanced Institute of Science and Technology (KAIST)
Daejeon, Republic of Korea
(END)
Professor Jaehyouk Choi, IT Young Engineer of the Year
Professor Jaehyouk Choi from the KAIST School of Electrical Engineering won the ‘IT Young Engineer Award’ for 2020. The award was co-presented by the Institute of Electrical and Electronics Engineers (IEEE) and the Institute of Electronics Engineers of Korea (IEIE), and sponsored by the Haedong Science and Culture Foundation. The ‘IT Young Engineer Award’ is given each year to a single mid-career scientist or engineer aged 40 or younger who has made a great contribution to academic or technological advancements in the field of IT. Professor Choi’s research topics include high-performance semiconductor circuit design for ultrahigh-speed communication systems, including 5G communications. In particular, he is widely known for his work on ultra-low-noise, high-frequency signal generation circuits, a key technology for next-generation wired and wireless communications as well as for memory systems. He has published 64 papers in SCI journals and at international conferences, and has applied for and registered 25 domestic and international patents. Professor Choi is also an active member of the technical program committees of international symposiums in the field of semiconductor circuits, including the International Solid-State Circuits Conference (ISSCC) and the European Solid-State Circuits Conference (ESSCIRC). Beginning this year, he also serves as a distinguished lecturer of the IEEE Solid-State Circuits Society (SSCS). (END)
Quantum Classifiers with Tailored Quantum Kernel
Quantum information scientists have introduced a new method for machine learning classifications in quantum computing. The non-linear quantum kernels in a quantum binary classifier provide new insights for improving the accuracy of quantum machine learning, which is considered capable of outperforming current AI technology. The research team led by Professor June-Koo Kevin Rhee from the School of Electrical Engineering proposed a quantum classifier based on quantum state fidelity, using a different initial state and replacing the Hadamard classification with a swap test. Unlike the conventional approach, this method is expected to significantly enhance classification tasks when the training dataset is small, by exploiting the quantum advantage in finding non-linear features in a large feature space. Quantum machine learning holds promise as one of the imperative applications of quantum computing. In machine learning, one fundamental problem for a wide range of applications is classification: the task of recognizing patterns in labeled training data in order to assign a label to new, previously unseen data. The kernel method has been an invaluable classification tool for identifying non-linear relationships in complex data, and more recently it has been introduced in quantum machine learning with great success. The ability of quantum computers to efficiently access and manipulate data in the quantum feature space can open opportunities for quantum techniques to enhance various existing machine learning methods. The idea of the classification algorithm with a non-linear kernel is that, given a quantum test state, the protocol calculates the weighted power sum of the fidelities of quantum data in quantum parallel via a swap-test circuit followed by two single-qubit measurements (see Figure 1). This requires only a small number of quantum data operations regardless of the size of the data.
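The swap test itself is a standard primitive: an ancilla qubit measured after the circuit returns |0⟩ with probability 1/2 + |⟨ψ|φ⟩|²/2, so the state fidelity can be read off from a single measurement statistic. A minimal numerical sketch of this relation (a generic illustration of the swap test, not the team's full classifier):

```python
import numpy as np

def swap_test_p0(psi, phi):
    """Probability of measuring the ancilla in |0> after a swap test:
    P(0) = 1/2 + |<psi|phi>|^2 / 2."""
    fidelity = abs(np.vdot(psi, phi)) ** 2
    return 0.5 + 0.5 * fidelity

def fidelity_from_p0(p0):
    """Invert the swap-test statistic to recover the state fidelity."""
    return 2.0 * p0 - 1.0

# Example: |psi> = |0>, |phi> = |+>, so |<psi|phi>|^2 = 0.5
psi = np.array([1.0, 0.0])
phi = np.array([1.0, 1.0]) / np.sqrt(2)

p0 = swap_test_p0(psi, phi)   # -> 0.75
print(fidelity_from_p0(p0))   # -> 0.5, the recovered fidelity
```

On hardware, P(0) is estimated by repeating the circuit and counting outcomes; the classifier described above aggregates such fidelities, weighted, across the training set.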
The novelty of this approach lies in the fact that labeled training data can be densely packed into a quantum state and then compared to the test data. The KAIST team, in collaboration with researchers from the University of KwaZulu-Natal (UKZN) in South Africa and Data Cybernetics in Germany, has further advanced the rapidly evolving field of quantum machine learning by introducing quantum classifiers with tailored quantum kernels. This study was reported in npj Quantum Information in May. The input data is either classical data represented via a quantum feature map or intrinsic quantum data, and the classification is based on a kernel function that measures the closeness of the test data to the training data. Dr. Daniel Park at KAIST, one of the lead authors of this research, said that the quantum kernel can be tailored systematically to an arbitrary power sum, which makes it an excellent candidate for real-world applications. Professor Rhee said that quantum forking, a technique the team invented previously, makes it possible to start the protocol from scratch, even when all the labeled training data and the test data are independently encoded in separate qubits. Professor Francesco Petruccione from UKZN explained, “The state fidelity of two quantum states includes the imaginary parts of the probability amplitudes, which enables use of the full quantum feature space.” To demonstrate the usefulness of the classification protocol, Carsten Blank from Data Cybernetics implemented the classifier on the five-qubit IBM quantum computer, which is freely available to public users via a cloud service, and compared the results with classical simulations. “This is a promising sign that the field is progressing,” Blank noted.
Link to download the full-text paper: https://www.nature.com/articles/s41534-020-0272-6
-Profile
Professor June-Koo Kevin Rhee
email@example.com
Professor, School of Electrical Engineering
Director, ITRC of Quantum Computing for AI
KAIST

Daniel Kyungdeock Park
kpark10@kaist.ac.kr
Research Assistant Professor
School of Electrical Engineering
KAIST
‘Mole-bot’ Optimized for Underground and Space Exploration
Biomimetic drilling robot provides new insights into the development of efficient drilling technologies
Mole-bot, a biomimetic drilling robot designed at KAIST, boasts a stout scapula, a waist that can incline in all directions, and powerful forelimbs. Most of all, the powerful torque of its expandable drilling bit, which mimics the chiseling ability of a mole’s front teeth, is the robot’s best feature. The Mole-bot is expected to be used for space exploration and for mining underground resources such as coalbed methane and rare earth elements (REEs), which require highly advanced drilling technologies in complex environments. The research team, led by Professor Hyun Myung from the School of Electrical Engineering, found inspiration for their drilling bot in two striking features of the African mole-rat and the European mole. “The crushing power of the African mole-rat’s teeth is so great that they can dig a hole with a force 48 times their body weight. We used this characteristic for building the main excavation tool. And its expandable drill is designed not to collide with its forelimbs,” said Professor Myung. The 25-cm-wide, 84-cm-long, 26-kg Mole-bot can excavate three times faster, and with six times higher directional accuracy, than conventional models. After digging, the robot removes the excavated soil and debris using its forelimbs. This embedded muscle feature, inspired by the European mole’s scapula, converts linear motion into a powerful rotational force. For directional drilling, the robot’s elongated waist changes its direction 360°, like a living mammal. For exploring underground environments, the research team developed and applied new sensor systems and algorithms that identify the robot’s position and orientation using graph-based 3D Simultaneous Localization and Mapping (SLAM) technology that matches the Earth’s magnetic field sequence, enabling 3D autonomous navigation underground.
According to a survey by MarketsandMarkets, the directional drilling market was estimated at 83.3 billion USD in 2016 and is expected to grow to 103 billion USD by 2021. The growth of the drilling market, which began with the shale revolution, is likely to extend into the future development of space and polar resources. With recent initiatives such as those of SpaceX, attention to planetary exploration is on the rise, and the market for related technology and equipment will also grow. The Mole-bot is a huge step forward for efficient underground drilling and exploration technologies. Unlike conventional drilling processes, which use environmentally unfriendly mud compounds for clearing debris, the Mole-bot can mitigate environmental damage. The researchers said their system saves cost and labor and does not require additional pipelines or other ancillary equipment. “We look forward to more efficient resource exploration with this type of drilling robot. We also hope Mole-bot will have a very positive impact on the robotics market in terms of its extensive application spectrum and economic feasibility,” said Professor Myung. This research, conducted in collaboration with the teams of Professor Jung-Wuk Hong and Professor Tae-Hyuk Kwon in the Department of Civil and Environmental Engineering for robot structure analysis and geotechnical experiments, was supported by the Ministry of Trade, Industry and Energy’s Industrial Technology Innovation Project.
Profile
Professor Hyun Myung
Urban Robotics Lab
http://urobot.kaist.ac.kr/
School of Electrical Engineering
KAIST
Professor Dongsu Han Named Program Chair for ACM CoNEXT 2020
Professor Dongsu Han from the School of Electrical Engineering has been appointed as the program chair for the 16th Association for Computing Machinery’s International Conference on emerging Networking EXperiments and Technologies (ACM CoNEXT 2020). Professor Han is the first program chair to be appointed from an Asian institution. ACM CoNEXT is hosted by ACM SIGCOMM, ACM's Special Interest Group on Data Communications, which specializes in the field of communication and computer networks. Professor Han will serve as program co-chair along with Professor Anja Feldmann from the Max Planck Institute for Informatics. Together, they have appointed 40 world-leading researchers as program committee members for this conference, including Professor Song Min Kim from KAIST School of Electrical Engineering. Paper submissions for the conference can be made by the end of June, and the event itself is to take place from the 1st to 4th of December. Conference Website: https://conferences2.sigcomm.org/co-next/2020/#!/home (END)
A Theoretical Boost to Nano-Scale Devices
- Researchers calculate the quasi-Fermi levels in molecular junctions using an ab initio approach. -
Semiconductor companies are struggling to develop devices that are mere nanometers in size, and much of the challenge lies in being able to more accurately describe the underlying physics at that nano-scale. But a new computational approach that has been in the works for a decade could break down these barriers. Devices using semiconductors, from computers to solar cells, have enjoyed tremendous efficiency improvements in the last few decades. Famously, one of the co-founders of Intel, Gordon Moore, observed that the number of transistors in an integrated circuit doubles about every two years, and this ‘Moore’s law’ held true for some time. In recent years, however, such gains have slowed as firms attempting to engineer nano-scale transistors hit the limits of miniaturization at the atomic level. Researchers at the School of Electrical Engineering at KAIST have developed a new approach to the underlying physics of semiconductors. “With open quantum systems as the main research target of our lab, we were revisiting concepts that had been taken for granted and that even appear in standard semiconductor physics textbooks, such as the voltage drop in operating semiconductor devices,” said the lead researcher, Professor Yong-Hoon Kim. “Questioning how all these concepts could be understood and possibly revised at the nano-scale, it was clear that there was something incomplete about our current understanding. And as semiconductor chips are being scaled down to the atomic level, coming up with a better theory to describe semiconductor devices has become an urgent task.” The current understanding is that semiconductors are materials that act like half-way houses between conductors, like copper or steel, and insulators, like rubber or Styrofoam. They sometimes conduct electricity, but not always.
This makes them a great material for intentionally controlling the flow of current, which in turn is useful for constructing the simple on/off switches—transistors—that are the foundation of memory and logic devices in computers. In order to ‘switch on’ a semiconductor, a current or light source is applied, exciting an electron in an atom to jump from what is called the ‘valence band,’ which is filled with electrons, up to the ‘conduction band,’ which is initially unfilled or only partially filled with electrons. Electrons that have jumped up to the conduction band thanks to external stimuli, and the ‘holes’ they leave behind, are now able to move about and act as charge carriers that conduct electric current. The physical concept that describes the populations of the electrons in the conduction band and the holes in the valence band, and the energy required to make this jump, is formulated in terms of the so-called ‘Fermi level.’ For example, you need to know the Fermi levels of the electrons and holes in order to know what amount of energy you are going to get out of a solar cell, including losses. But the Fermi level concept is only straightforwardly defined so long as a semiconductor device is at equilibrium—sitting on a shelf doing nothing—and the whole point of semiconductor devices is not to leave them on the shelf. Some 70 years ago, William Shockley, the Nobel Prize-winning co-inventor of the transistor at Bell Labs, came up with a bit of a theoretical fudge, the ‘quasi-Fermi level,’ or QFL, enabling rough prediction and measurement of the interaction between valence band holes and conduction band electrons, and this has worked pretty well until now. “But when you are working at the scale of just a few nanometers, the methods to theoretically calculate or experimentally measure the splitting of QFLs were just not available,” said Professor Kim. This means that at this scale, issues such as errors relating to voltage drop take on much greater significance.
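For context, the textbook relation behind Shockley's construct (standard non-degenerate semiconductor statistics, not a result from this paper) writes the electron and hole populations with separate quasi-Fermi levels E_Fn and E_Fp, whose splitting measures how far the device is driven from equilibrium:

```latex
n = n_i \exp\!\left(\frac{E_{Fn}-E_i}{k_B T}\right), \qquad
p = n_i \exp\!\left(\frac{E_i-E_{Fp}}{k_B T}\right)
\quad\Longrightarrow\quad
np = n_i^2 \exp\!\left(\frac{E_{Fn}-E_{Fp}}{k_B T}\right)
```

At equilibrium E_Fn = E_Fp, recovering the familiar law of mass action np = n_i²; under bias or illumination the splitting E_Fn − E_Fp quantifies the stored electrochemical energy, for example bounding a solar cell's open-circuit voltage via qV_oc ≤ E_Fn − E_Fp.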
Kim’s team worked for nearly ten years on developing a novel theoretical description of nano-scale quantum electron transport that can replace the standard method, along with the software that allows them to put it to use. This involved the further development of density functional theory (DFT), a mathematical framework that simplifies the equations describing the interactions of electrons and that has been very useful in other fields such as high-throughput computational materials discovery. For the first time, they were able to calculate the QFL splitting, offering a new understanding of the relationship between voltage drop and quantum electron transport in atomic-scale devices. In addition to looking into various interesting non-equilibrium quantum phenomena with their novel methodology, the team is now further developing their software into a computer-aided design tool to be used by semiconductor companies for developing and fabricating advanced semiconductor devices. The study, published in the Proceedings of the National Academy of Sciences of the USA on May 12, was supported by the National Research Foundation and the Korea Institute of Science and Technology Information Supercomputing Center.
Image caption: The newly developed formalism and QFL splitting analysis led to new ways of characterizing extremely scaled-down semiconductor devices and the technology computer-aided design (TCAD) of next-generation nano-electronic/energy/bio devices.
Image credit: Yong-Hoon Kim, KAIST
Image usage restrictions: News organizations may use or redistribute this image, with proper attribution, as part of news coverage of this paper only.
Publication: Juho Lee, Hyeonwoo Yeo, and Yong-Hoon Kim. (2020) ‘Quasi-Fermi level splitting in nanoscale junctions from ab initio.’ Proceedings of the National Academy of Sciences of the United States of America (PNAS), Volume 117, Issue 19, pp. 10142-10148.
Available online at https://doi.org/10.1073/pnas.1921273117
Profile: Yong-Hoon Kim
Professor
firstname.lastname@example.org
http://nanocore.kaist.ac.kr/
1st-Principles Nano-Device Computing Lab
School of Electrical Engineering
KAIST
(END)
Stress-Relief Substrate Helps OLED Stretch Two-Dimensionally
Highly functional, free-form displays are critical components for completing the technological prowess of wearable electronics, robotics, and human-machine interfaces. A KAIST team created stretchable OLEDs (organic light-emitting diodes) that are compliant and maintain their performance under high-strain deformation. Their stress-relief substrates have a unique structure and utilize pillar arrays to reduce the stress on the active areas of the devices when strain is applied. Conventional intrinsically stretchable OLEDs face commercial limitations due to the low electrical conductivity of their stretchable electrodes. In addition, previous geometrically stretchable OLEDs, in which thin-film devices are laminated onto elastic substrates, suffer from non-uniform pixel emission because the buckles that form have different peak sizes. To solve these problems, a research team led by Professor Kyung Cheol Choi designed a stretchable substrate system with surface-relief island structures that relieve the stress at the locations of the bridges in the devices. Their stretchable OLED devices contained an elastic substrate structure comprising bonded elastic pillars and bridges. A patterned upper substrate with bridges makes the rigid substrate stretchable, while the pillars decentralize the stress on the device. Although various applications using micropillar arrays have been reported, how elastic pillar arrays can relieve the stress applied to a substrate upon stretching had not yet been reported. Compared to results using similar layouts with conventional free-standing flat substrates or island structures, their results with elastic pillar arrays show relatively low stress levels at both the bridges and the plates when the devices are stretched. They achieved stretchable RGB (red, green, blue) OLEDs and had no difficulties with material selection, as practical processes were conducted on the stress-relief substrates.
Their stretchable OLEDs are mechanically stable and offer two-dimensional stretchability, which is superior to electronics stretchable in only one direction, opening the way for practical applications like wearable electronics and health-monitoring systems. Professor Choi said, “Our substrate design will impart flexibility into electronics technology development, including semiconductor and circuit technologies. We look forward to this new stretchable OLED lowering the barrier to entering the stretchable display market.” This research was published in Nano Letters under the title ‘Two-Dimensionally Stretchable Organic Light-Emitting Diode with Elastic Pillar Arrays for Stress Relief’ (https://dx.doi.org/10.1021/acs.nanolett.9b03657). This work was supported by the Engineering Research Center of Excellence Program of the National Research Foundation of Korea.
-Profile
Professor Kyung Cheol Choi
email@example.com
http://adnc.kaist.ac.kr/
School of Electrical Engineering
KAIST
Professor Minsoo Rhu Recognized as Facebook Research Scholar
Professor Minsoo Rhu from the School of Electrical Engineering was selected as a recipient of the Systems for Machine Learning Research Awards presented by Facebook. Facebook launched the award last year with the goal of funding impactful solutions in the areas of developer toolkits, compilers and code generation, system architecture, memory technologies, and machine learning accelerator support. A total of 167 scholars from 100 universities representing 26 countries submitted research proposals, and Facebook selected the final 10 scholars. Professor Rhu made the list with his research topic ‘A Near-Memory Processing Architecture for Training Recommendation Systems.’ He will receive 5,000 USD in research funds at the award ceremony, which will take place during this year’s AI Systems Faculty Summit at the Facebook headquarters in Menlo Park, California. Professor Rhu’s submission was based on research on ‘Memory-Centric Deep Learning System Architecture’ that he carried out for three years from 2017 under the auspices of the Samsung Science and Technology Foundation. It was an academic-industrial cooperation research project in which leading domestic companies such as Samsung Electronics and SK Hynix collaborated to make a foray into the global memory-centric smart system semiconductor market. Professor Rhu, who joined KAIST in 2018, previously led various systems research projects to accelerate AI computing while working at the NVIDIA headquarters from 2014. (END)
KAIST Showcases Advanced Technologies at CES 2020
< President Sung-Chul Shin experiencing a cooling gaming headset developed by TEGWAY >
The KAIST Pavilion showcased the technologies of 12 KAIST startups and alumni companies at the International Consumer Electronics Show (CES) 2020 held in Las Vegas last month. Notably, four companies, TEGWAY, THE.WAVE.TALK, Sherpa Space, and LiBEST, won CES 2020 Innovation Awards presented by the Consumer Technology Association (CTA), which selects the most innovative items from among all submissions. TEGWAY, spun off by KAIST Professor Byung Jin Cho, had already made international headlines for its flexible, wearable, temperature-immersive thermoelectric device. The device was selected as one of the top ten most promising digital technologies by the Netexplo Forum in 2015, and its applications have expanded into VR, AR, and games. THE.WAVE.TALK developed its first home appliance product in collaboration with the ID+IM Design Laboratory of KAIST, which Professor Sang-Min Bae heads as creative director. Its real-time bacteria analysis with a smart IoT sensor won the home appliances section. Sherpa Space and LiBEST are the alumni companies. Sherpa Space’s lighting for plants won the sustainability, eco-design, and smart energy section, and LiBEST’s full-range flexible battery won the section for technology for a better world. KAIST’s Alumni Association, Development Foundation, and the Office of University-Industry Cooperation (OUIC) made every effort to present KAIST technologies to the global market. President Sung-Chul Shin led the delegation comprising 70 faculty members, researchers, and young entrepreneurs. The KAIST Alumni Association fully funded the travel costs of 30 alumni entrepreneurs and students, establishing a scholarship for CES participation. Ten young entrepreneurs were selected through the KAIST Startup Awards, and 20 current students preparing to start their own companies were selected via recommendations from their respective departments.
Associate Vice President of the OUIC Kyung Cheol Choi said in excitement, “We received many offers for joint research and investment from leading companies around the world,” adding, “We will continue doing our best to generate global value by developing the innovative technologies obtained from education and research into businesses.” The KAIST pavilion at CES 2020 showcased:
1. the flexible thermoelectric device ThermoReal and a cooling gaming headset from TEGWAY,
2. a wearable flexible battery from LiBEST,
3. applications such as conductive transparent electrode film and transparent heating film from J-Micro,
4. an on-device AI solution based on deep learning model compression technology from Nota,
5. a portable high-resolution brain imaging device from OBELAB,
6. real-time bacteria analysis technology from THE.WAVE.TALK,
7. a conversation-based AI-1 radio service platform from Timecode Archive,
8. light source solutions for different stages in a plant’s life cycle from Sherpa Space,
9. a skin-attached micro-LED patch and a flexible piezoelectric acoustic sensor from FRONICS,
10. a real-time cardiovascular measurement device from Healthrian,
11. a blockchain-based mobile research documentation system from ReDWit, and
12. a student-developed comprehensive healthcare device using a smart mirror.
(END)
Professor Junil Choi Receives Stephen O. Rice Prize
< Professor Junil Choi (second from the left) >
Professor Junil Choi from the School of Electrical Engineering received the Stephen O. Rice Prize at the Global Communications Conference (GLOBECOM) hosted by the Institute of Electrical and Electronics Engineers (IEEE) in Hawaii on December 10, 2019. The Stephen O. Rice Prize is awarded to only one paper of exceptional merit each year. The IEEE Communications Society evaluates all papers published in the IEEE Transactions on Communications journal within the last three years, scoring each paper on originality, number of citations, impact, and peer evaluation. Professor Choi won the prize for his research on one-bit analog-to-digital converters (ADCs) for multiuser massive multiple-input multiple-output (MIMO) antenna systems, published in 2016. In his paper, Professor Choi proposed a technology that can drastically reduce the power consumption of multiuser massive MIMO antenna systems, a core technology for 5G and future wireless communications. Professor Choi’s paper has been cited more than 230 times in academic journals and conference papers since its publication, and multiple follow-up studies are actively ongoing. In 2015, Professor Choi received the IEEE Signal Processing Society Best Paper Award, an award of comparable standing to the Stephen O. Rice Prize. He was also selected as the winner of the 15th Haedong Young Engineering Researcher Award, presented by the Korean Institute of Communications and Information Sciences (KICS) on December 6, 2019, for his outstanding academic achievements, including 34 international journal publications and 26 US patent registrations. (END)
KAIST and Google Jointly Develop AI Curricula
KAIST selected the two professors who will develop AI curricula under the auspices of the KAIST-Google Partnership for AI Education and Research. The Graduate School of AI announced the two authors, chosen from among the 20 applicants, who will develop the curricula next year. They will be provided 7,500 USD per subject. Professor Changho Suh from the School of Electrical Engineering and Professor Yong-Jin Yoon from the Department of Mechanical Engineering will use Google technologies such as TensorFlow, Google Cloud, and Android to create the curricula. Professor Suh’s “TensorFlow for Information Theory and Convex Optimization” will be used in graduate courses, and Professor Yoon’s “AI Convergence Project-Based Learning (PBL)” will be used for online courses. Professor Yoon’s course will explore and define problems by utilizing AI and by experiencing the process of developing AI-based products through design thinking, which involves product design, production, and verification. Professor Suh’s course will discuss information theory and convex optimization, drawing on basic sciences and engineering as well as AI, machine learning, and deep learning.
'Flying Drones for Rescue'
(Video Credit: ⓒNASA JPL) < Team USRG and Professor Shim (second from the right) > Having recently won the AI R&D Grand Challenge Competition in Korea, Team USRG (Unmanned System Research Group) led by Professor Hyunchul Shim from the School of Electrical Engineering is all geared up to take on its next challenges: the 'Defense Advanced Research Projects Agency Subterranean Challenge (DARPA SubT Challenge)' and 'Lockheed Martin's AlphaPilot Challenge' next month. Team USRG won the obstacle course race in the '2019 AI R&D Grand Challenge Competition' on July 12, dominating the challenging category of 'control intelligence.' Having to complete the obstacle course race solely using AI systems, without any connection to the internet, made it difficult for most of the eight participating teams to pass the third section of the race, and only Team USRG passed the long pipeline course during its attempt in the main event. The team also demonstrated, after the main event, that its drone can navigate all of the checkpoints, including landing on the "H" mark, using deep learning. Their drone flew through poles and pipes, and escaped from windows and mazes against strong winds, amid cheers and groans from the crowd gathered at the Korea International Exhibition Center (KINTEX) in Goyang, Korea. The team was awarded three million KRW in prize money, and received a research grant worth six hundred million KRW from the Ministry of Science and ICT (MSIT). "Being ranked first in a race for which we were never given a chance for a test flight means a lot to our team. Considering that we had no information on the exact size of the course in advance, this is a startling result," said Professor Shim. "We will carry out further research with this funding, and compete once again with improved AI and drone technology in the 2020 competition," he added.
The AI R&D Grand Challenge Competition, first held in 2017, was designed to promote AI research and development and expand its application to high-risk technical challenges with significant socio-economic impact. This year's competition presented participants with the task of developing AI software that lets drones navigate autonomously during complex disaster relief operations such as aid delivery. Each team participated in one of the four tracks of the competition, and their drones were evaluated based on the criteria for each track. The tracks were intelligent context-awareness, intelligent character recognition, auditory intelligence, and control intelligence. Team USRG's technological prowess has already been well acclaimed among international peer groups. Teamed up with NASA JPL, Caltech, and MIT as Team CoSTAR, whose name stands for 'Collaborative SubTerranean Autonomous Resilient Robots,' they will compete in the subterranean mission of the 'DARPA SubT Challenge.' Professor Shim emphasized the role KAIST plays in Team CoSTAR as a leader in drone technology. "I think when our drone technology is added to our peers' AI and robotics, Team CoSTAR will bring out unsurpassable synergy in subterranean and planetary applications. I would like to follow in the footsteps of Hubo, the winning champion of the 2015 DARPA Robotics Challenge, and even extend it to subterranean exploration," he said. These next generation autonomous subsurface explorers are now all optimizing the physical AI robot systems developed by Team CoSTAR. They will test their systems in more realistic field environments August 15 through 22 in Pittsburgh, USA, and have already received funding from DARPA for participating.
Team CoSTAR will compete in three consecutive yearly events starting this year, and the last event, planned for 2021, will put the team to the final test with courses that incorporate diverse challenges from all three events. Two million USD will be awarded to the winner after the final event, with additional prizes of up to 200,000 USD for self-funded teams. Team USRG also ranked third in the recent Hyundai Motor Company 'Autonomous Vehicle Competition,' and another challenge is on the horizon: Lockheed Martin's 'AlphaPilot Challenge.' In this event, the teams will fly their drones through a series of racing gates, trying to beat the best human pilot. The challenge is hosted by Lockheed Martin, the world's largest military contractor and the maker of the famed F-22 and F-35 stealth fighters, with the goal of stimulating the development of autonomous drones. Team USRG was selected from more than 400 teams from around the world and is preparing for a series of races this fall, beginning at the end of August. Professor Shim said, "It is not easy to perform in a series of competitions in just a few months, but my students are smart, hardworking, and highly motivated. These events indeed demand a lot, but they really challenge the researchers to come up with technologies that work in the real world. This is the way robotics really should be." (END)
KAIST, 291 Daehak-ro, Yuseong-gu, Daejeon 34141, Republic of Korea
Copyright (C) 2020, Korea Advanced Institute of Science and Technology. All Rights Reserved.