KAIST
NEWS
Virtual Reality
KAIST's Pioneering VR Precision Technology and Choreography Tool Receive Spotlight at CHI 2025
Accurate pointing in virtual spaces is essential for seamless interaction. If pointing is not precise, selecting the desired object becomes difficult, breaking user immersion and degrading the overall experience. KAIST researchers have developed a technology that offers a vivid, lifelike experience in virtual space, alongside a new tool that assists choreographers throughout the creative process.

KAIST (President Kwang-Hyung Lee) announced on May 13th that a research team led by Professor Sang Ho Yoon of the Graduate School of Culture Technology, in collaboration with Professor Yang Zhang of the University of California, Los Angeles (UCLA), has developed the 'T2IRay' technology and the 'ChoreoCraft' platform, which enables choreographers to work more freely and creatively in virtual reality. These technologies received two Honorable Mention awards, given to the top 5% of papers, at CHI 2025*, the premier international conference in the field of human-computer interaction, hosted by the Association for Computing Machinery (ACM) from April 25 to May 1.

< (From left) PhD candidates Jina Kim and Kyungeun Jung, Master's candidate Hyunyoung Han, and Professor Sang Ho Yoon of the KAIST Graduate School of Culture Technology, with Professor Yang Zhang (top) of UCLA >

T2IRay: Enabling Virtual Input with Precision

T2IRay introduces a novel input method that allows precise object pointing in virtual environments by extending traditional thumb-to-index gestures. This approach overcomes previous limitations, such as interruptions or reduced accuracy caused by changes in hand position or orientation. The technology uses a local coordinate system based on finger relationships, ensuring continuous input even as hand positions shift. It accurately captures subtle thumb movements within this coordinate system and integrates natural head movements to allow fluid, intuitive control across a wide range.

< Figure 1. T2IRay framework utilizing the delicate movements of the thumb and index fingers for AR/VR pointing >

Professor Sang Ho Yoon explained, "T2IRay can significantly enhance the user experience in AR/VR by enabling smooth, stable control even when the user's hands are in motion."

This study, led by first author Jina Kim, was supported by the Excellent New Researcher Support Project of the National Research Foundation of Korea under the Ministry of Science and ICT, as well as the University ICT Research Center (ITRC) Support Project of the Institute of Information and Communications Technology Planning and Evaluation (IITP).

▴ Paper title: T2IRay: Design of Thumb-to-Index Based Indirect Pointing for Continuous and Robust AR/VR Input
▴ Paper link: https://doi.org/10.1145/3706598.3713442
▴ T2IRay demo video: https://youtu.be/ElJlcJbkJPY

ChoreoCraft: Creativity Support through VR for Choreographers

In addition, Professor Yoon's team developed 'ChoreoCraft,' a virtual reality tool designed to support choreographers by addressing the unique challenges they face, such as memorizing complex movements, overcoming creative blocks, and managing subjective feedback. ChoreoCraft reduces reliance on memory by allowing choreographers to save and refine movements directly within a VR space, using a motion-capture avatar for real-time interaction. It also enhances creativity by suggesting movements that naturally fit with prior choreography and musical elements. Furthermore, the system provides quantitative feedback by analyzing kinematic factors such as motion stability and engagement, helping choreographers make data-driven creative decisions.

< Figure 2. ChoreoCraft's approaches to encourage the creative process >

Professor Yoon noted, "ChoreoCraft is a tool designed to address the core challenges faced by choreographers, enhancing both creativity and efficiency. In user tests with professional choreographers, it received high marks for its ability to spark creative ideas and provide valuable quantitative feedback."

This research was conducted in collaboration with doctoral candidate Kyungeun Jung and master's candidate Hyunyoung Han, alongside the Electronics and Telecommunications Research Institute (ETRI) and One Million Co., Ltd. (CEO Hye-rang Kim), with support from the Cultural and Arts Immersive Service Development Project by the Ministry of Culture, Sports and Tourism.

▴ Paper title: ChoreoCraft: In-situ Crafting of Choreography in Virtual Reality through Creativity Support Tools
▴ Paper link: https://doi.org/10.1145/3706598.3714220
▴ ChoreoCraft demo video: https://youtu.be/Ms1fwiSBjjw

*CHI (Conference on Human Factors in Computing Systems): The premier international conference on human-computer interaction, organized by the ACM, held this year from April 25 to May 1, 2025.
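The key property of a finger-relative local coordinate system, as described above, is that the thumb's position reads the same regardless of where the hand is or how it is oriented, so pointing input survives hand movement. A minimal sketch of that idea (this is an illustration only, not the paper's actual algorithm; the landmark names and frame construction are assumptions):

```python
import numpy as np

def local_frame(index_base, index_tip, palm_normal):
    """Build an orthonormal frame from hand landmarks.

    The x-axis runs along the index finger; the z-axis is the palm
    normal made orthogonal to x; y completes the right-handed frame.
    Because the frame is built from the hand itself, it follows the
    hand as it translates and rotates.
    """
    x = index_tip - index_base
    x = x / np.linalg.norm(x)
    z = palm_normal - np.dot(palm_normal, x) * x  # orthogonalize
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    return np.stack([x, y, z])  # rows are the basis vectors

def thumb_in_local_frame(thumb_tip, index_base, frame):
    """Express the thumb tip in the finger-relative frame.

    The result is invariant under any rigid motion applied to the
    whole hand, so thumb input stays continuous while the hand moves.
    """
    return frame @ (thumb_tip - index_base)
```

The payoff is that mapping this local thumb offset to a cursor or ray direction keeps working mid-gesture: moving or rotating the whole hand changes the world-space landmarks but leaves the local coordinates untouched.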
2025.05.13
Aline and Blow-yancy Win the Red Dot Design Awards: Brand & Communications Design 2021
'Aline' and 'Blow-yancy,' developed by Professor Sang Su Lee's team at the Department of Industrial Design, won Red Dot Design Awards in Brand & Communications Design. Aline is a mobile investment portfolio application used by NH Investment & Securities Co., and Blow-yancy is a scuba diving VR device for neutral buoyancy training.

Professor Lee sought 'sustainability' while developing Aline to meet the growing awareness of ESG (environmental, social, and governance) investing. ESG investing relies on independent ratings that help consumers assess a company's behavior and policies in terms of its social impact. Aline's personal value index, built on six main criteria, translates those values into sustainable finance. By gathering data from an initial survey and regular value updates, the index is weighted according to the user's values. Based on the index, the investment portfolio is adjusted, and consumption that runs against those values is tracked.

Blow-yancy's VR mask helps divers feel as if they are wearing an actual diving mask, and users can breathe through a regulator with a built-in breathing sensor. It allows training like actual diving without going into the water, thereby enabling safer diving. "We were inspired by the finding that about 74% of scuba divers come into contact with corals underwater at least once, which can cause an emergency situation. Divers who cannot maintain neutral buoyancy have a hard time avoiding them," said Professor Lee.

The hardware consists of a nose-covering VR mask, a regulator with a built-in breath sensor, and a controller for virtual BCD (buoyancy control device) control. Blow-yancy's five virtual missions were organized according to the diving process required by PADI, a professional diving education institute. Professor Lee's team had already received eight recognitions at the iF Design Award in April.

Professor Lee said, "We will continue to develop the best UX design items that will improve our global recognition."
2021.08.26
KAIST's Graduate School of Culture Technology Celebrates Its Tenth Anniversary
The Graduate School of Culture Technology (GSCT) at KAIST hosted a ceremony and a variety of events on campus to celebrate its tenth anniversary on October 22, 2015.

Established in 2005 with the support of the Ministry of Culture, Sports and Tourism of the Republic of Korea, GSCT offers an intensive, in-depth education in culture technology, an interdisciplinary field first introduced in Korea by KAIST that brings arts, humanities, science, and technology together in an academic and research arena. Over the years, the graduate school has fostered top-notch researchers and professionals who have played a leading role in the development of the Korean cultural content industry, which includes movies, broadcasting, music, games, and cultural events.

After the anniversary ceremony, GSCT held a "Demo Day" to showcase its major research projects. A total of 41 projects were presented under the themes of "Art and Science," "Human and Humane," and "Virtual Reality vs Reality." In addition, there was a seminar on GSCT's ten years of accomplishments and future plans with the school's Professors Sunghee Lee, Juyong Park, and Juhan Nam; a cultural event for the public called the "Talk Concert," which brought together professionals from the culture industry and academia to share ideas and views; and a Homecoming Day for GSCT graduates.

So far, the graduate school has produced 295 master's and 34 doctoral graduates. About 34% of its graduates are employed in the movie, game, and broadcasting sectors, 33% in the social networking service and Internet industry, and 33% in performing arts, exhibitions, and events.

Dong-Man Lee, the Dean of KAIST's Graduate School of Culture Technology, said, "We will continue to develop our school to lead the advancement of the Korean culture industry, contributing to the growth of the Korean Wave, the popularity of Korean culture, in the global community."

In the pictures below, Dean Lee delivers a speech celebrating the school's tenth anniversary; Soo-Man Lee, the founding chairman of S.M. Entertainment, speaks at the Talk Concert; and scenes from the Demo Day are shown.
2015.10.26
Professor Woontack Woo Demonstrates an Optical Platform Technology for Augmented Reality at Smart Cloud Show
Professor Woontack Woo of the Graduate School of Culture Technology at KAIST participated in the Smart Cloud Show, a technology exhibition hosted by the university's Augmented Human Research Center, and presented the latest development of his research: an optical platform system for augmented reality. The event took place on September 16-17, 2015 at the Grand Seoul Nine Tree Convention Center in Seoul.

At the event, Professor Woo introduced smart glasses with an embedded augmented reality system that permits remote collaboration between an avatar and the user's hand. Previous remote collaboration systems were difficult for ordinary users to employ because of their two-dimensional screens and complicated virtual reality setups. With the new technology, however, a camera attached to the augmented reality (AR) glasses recognizes and tracks the user's hand, so the avatar in the virtual space and the user's hand can interact in real space and time.

The key to this technology is a stable, real-time hand-tracking technique that allows the detection of the hand's location and the recognition of finger movements even under self-occlusion. Through this method, a user can touch and manipulate augmented content as if it were a real-life object, thereby collaborating remotely with another, physically distant user by linking his or her movements with an avatar. If this technology is adopted widely, it may bring economic benefits such as increased productivity from lower mobility costs and a reduction in social overhead costs from the decreased need for long-distance travel.

Professor Woo said, "This technology will provide us with a greater opportunity for collaboration, not necessarily restricted by physical travel, which can be widely used in the fields of medicine, education, entertainment, and tourism."

Professor Woo plans to present his research results on hand-movement tracking and detection at the 12th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI 2015), to be held on October 28-30, 2015, at KINTEX in Goyang, Korea. He will also present a research paper on remote collaboration at the ICAT-EGVE 2015 conference, the merger of the 25th International Conference on Artificial Reality and Telexistence (ICAT 2015) and the 20th Eurographics Symposium on Virtual Environments (EGVE 2015), which will take place on October 28-30, 2015 at the Kyoto International Community House in Kyoto, Japan.
2015.09.16