Student work guided by Professor Han Bing-xuan of the Interaction Department selected for SIGGRAPH Asia 2022

Luis Andres Mendez, Wu Haoxian, Lin Jingying, and Lu Yijie, guided by Professor Han Bing-xuan of our school's Interaction Department, were selected to present their work "Easy Moving Sandbag" at SIGGRAPH Asia 2022, held in South Korea. The same work also took first place in the somatosensory interactive technology category of the domestic 2022 Smart Innovation and Cross-domain Integrated Creation Competition.

ACM SIGGRAPH Asia was founded in 2008. Each year it holds a conference and exhibition in an Asian city, covering technology, art, animation, games, interaction, and related fields, and invites artists, scientists, and development teams from around the world to share their creations and showcase new technologies.

The "Easy Sandbag" design team found that there are some pain points in boxing training, including: the coach and the students will consume physical strength and have a chance of being injured; one-on-one training cannot take care of more students at the same time; most of the current boxing training products only For striking drills or footwork, there is a lack of training modes that provide both striking and wide-moving footwork. They designed the air sandbag hardware device and virtual VR boxing application, allowing coaches to remotely control the sandbags, encouraging players to do more footwork training, and providing somatosensory feedback and gamification to increase training fun.

In addition, under Professor Han Bing-xuan's guidance, students Lin Kefan, Zhou Yuzhi, Chen Yixuan, and Lin Jingying jointly designed "Sound Shape," which won third place in the somatosensory interactive technology category of the 2022 Smart Innovation and Cross-domain Integrated Creation Competition. The work creates an immersive music performance that delivers both visual and auditory enjoyment. An optical pickup device, LiCAP, mounted on stringed instruments such as guitars, ukuleles, and violins, uses AI deep learning to accurately judge the player's fingering, rhythm, and strength. These are converted into digital audio in real time, while visual images are generated synchronously and visual effects are triggered by the sound.
