Olfaction plays a crucial role in emotions, memory, and sensory interaction. With our "smell camera" prototype (DIS 2020), we introduced a novel approach to capturing and replaying real-world scents. This talk expands on that user-centered exploration of physical odor capture, highlighting recent advances such as the integration of sorbent tubes, which improves both portability and functionality. We will also share insights from our toolkit projects, including O&O (CHI), OdorAgents (IEEE VR), and Mul-O (UIST), discussing how these tools support the rapid development and prototyping of olfactory experiences. In line with our mission to empower the olfactory research community, we will conclude by presenting our vision for advancing multisensory innovation in HCI.
Qi Lu (Kilo) is a Research Assistant Professor at the Future Laboratory of Tsinghua University and the Head of the Olfactory Computing and Interface Design Group. He holds bachelor's, master's, and Ph.D. degrees from Tsinghua University and was recognized as an Outstanding Ph.D. Graduate in Beijing. He has extensive interdisciplinary expertise spanning electronics and information design, and his work covers olfactory interface design toolkits, e-nose design and applications, and multisensory HCI.
He has led or participated in more than ten key national and provincial research projects and has published 13 high-impact papers as first or corresponding author at top conferences, including CHI, UIST, DIS, and TEI, with one Honorable Mention Award (DIS 2020). He has also filed more than ten invention patents, eight of which have been granted to date. In 2024, he was selected for the First Speakers Lineup at the 8th World Digital Olfaction Society conference (DOS 2024).