Advanced Topics in Human-Computer Interaction
This course on Advanced Topics in HCI includes discussions of a set of representative papers published in the field of HCI, as well as the creation and demonstration of interactive systems. Students lead their own capstone projects, in which they build interactive systems and present demonstrations in the last class.
This course is cross-listed as “3747-108: Advanced Topics in HCI” in the Graduate School of Engineering and “4915100: Human Interfaces” in the Interfaculty Initiative in Information Studies, Graduate School of Interdisciplinary Information Studies. Students may register for only one of these two courses. Everything besides the course name is the same, so don't worry about which one to register for. :)
|Room||Fully online (using Zoom)|
|Time||Mondays, 10:25-12:10 (including a short break)|
|Instructor||Koji Yatani (koji “at-mark” iis-lab.org)|
|Online Lecture Room||See the following shared doc for the URL.|
I sincerely apologize that some students were not able to access the Zoom room for this course due to an incorrect domain setting. We have fixed the issue.
For those who were not able to attend the class on the 5th, please carefully read this course website and the slides. You must submit your paper preference by the 8th if you take this course.
This course has two major objectives: becoming familiar with classic and recent HCI research that demonstrates novel interactive systems and applications, and prototyping an interactive system. To achieve these, the course offers a mixture of research discussions on HCI papers and capstone projects.
- Research discussions: We discuss a selected set of papers published at HCI and related conferences, such as CHI, UIST, UbiComp, CSCW, and MobiSys. Each student will be asked to lead discussions at least once during the semester. This year, we mostly focus on various sensing technologies and their interactive applications.
- Capstone projects: Students conduct a project to develop an interactive system using sensors and/or hardware.
Except for the first and last classes, each class roughly follows this structure:
- A brief introduction from the instructor
- 35 mins × 2: Paper discussions
- 15-minute presentation by a discussion chair
- 20-minute discussion among students and the instructor
- Discussions for individual capstone projects
English is the official language of this course, though Japanese may be used if necessary. All teaching is done in English in class. Students are strongly encouraged to deliver their presentations and demonstrations in English. You may use Japanese when you have serious difficulty communicating, but you should always try your best to speak English.
We do not have any explicit prerequisites for this course, but students are expected to have:
- Basic knowledge of and experience with HCI research,
- Programming skills and experience, and
- English communication skills.
But the most important thing is, of course, your strong passion. :)
We have zero tolerance for any type of academic misconduct, such as plagiarism, inappropriate citation, and fabrication. Examples include:
- Using others' ideas without appropriate citations and/or acknowledgements,
- Using code and/or libraries without appropriate citations,
- Using source code written by others without explicit permission, and
- Making up data or system behavior for a better-looking demonstration.
If serious academic misconduct is found, we will impose one of the following penalties, depending on its significance:
- No mark for the assignments in which academic misconduct is found (marked as zero; marks are retracted if already given),
- No mark for all assignments that have already been submitted, or
- No mark for all assignments that have already been submitted and prohibition from submitting future assignments.
Please make sure that your reports and source code do not cause misunderstandings.
Auditing students are welcome to join us, but I strongly recommend serving as a discussion chair even if you are just auditing. Please also participate in discussions in class. Just don't be a free rider. :)
Your performance in this course will be evaluated according to the following criteria.
- [30%] Paper discussion: Based on your performance in leading discussions about the paper assigned to you from the reading list.
- [50%] Capstone project: Based on the quality of your project proposal (and prototype demonstration, if any).
- [20%] Engagement and attendance: Based on your attendance and your involvement in discussions during class.
To receive a final mark, you must both serve as a discussion chair at least twice and complete your capstone project. Otherwise, your mark will be zero.
|#1||April 5||[Introduction] / [Research Discussions]||Course introduction, reading assignment; quick overview of HCI research areas covered in this course|
|#2||April 19||[Research Discussions]||Sensing touch and gestures|
|#3||April 26||[Research Discussions]||On-body interaction|
|#4||May 10||[Research Discussions]||On-body actuation|
|#5||May 17||[Research Discussions]||Sensing your body with smartphones|
|#6||May 24||[Research Discussions]||Mixed reality|
|#7||May 31||[Capstone Project]||Project peer review|
|#8||June 7||[Research Discussions]||Wearable sensing|
|#9||June 14||[Research Discussions]||Sensing with smartwatches|
|#10||June 21||[Research Discussions]||Infrastructure-based sensing|
|#11||June 28||[Research Discussions]||Interacting with your mood|
|#12||July 5||[Capstone Project]||Project peer review|
|#13||July 12||[Capstone Project]||Project demo presentation|
Please submit your paper preference via the following Google Form by April 9th. You need to log in with your ECCS account.
- Sensing touch and gestures
- [Zixiong Su] Touché: enhancing touch interaction on humans, screens, liquids, and everyday objects in CHI 2012. http://dl.acm.org/citation.cfm?id=2207743
- Humantenna: using the body as an antenna for real-time whole-body interaction in CHI 2012. https://dl.acm.org/citation.cfm?id=2208330
- On-body interaction
- [Keitaro Shimizu] Skinput: appropriating the body as an input surface in CHI 2010. http://dl.acm.org/citation.cfm?id=1753394
- [Yongxiang Ma] Digits: freehand 3D interactions anywhere using a wrist-worn gloveless sensor in UIST 2012. http://dl.acm.org/citation.cfm?id=2380139
- On-body actuation
- [Fan Zhongzhong] Affordance++: Allowing Objects to Communicate Dynamic Use in CHI 2015. http://dl.acm.org/citation.cfm?id=2702128
- [Nur Khairina Binti Khairu Najhan] FootStriker: An EMS-based Foot Strike Assistant for Running in IMWUT 2017. https://dl.acm.org/citation.cfm?id=3053332
- Sensing your body with smartphones
- [Shixian Geng] SpiroSmart: using a microphone to measure lung function on a mobile phone in UbiComp 2012. http://dl.acm.org/citation.cfm?id=2370261
- [Siyuan Zhang] Noninvasive Blood Screening of Hemoglobin using Smartphone Cameras in UbiComp 2016. http://dl.acm.org/citation.cfm?id=2971653
- Mixed reality
- [Naho Tomiki] IllumiRoom: peripheral projected illusions for interactive experiences in CHI 2013. http://dl.acm.org/citation.cfm?id=2466112
- [Hao Zhaoyu] Imaginary reality gaming: ball games without a ball in UIST 2013. http://dl.acm.org/ft_gateway.cfm?id=2502012
- Wearable sensing
- [Jiaxuan Chen] SkullConduct: Biometric User Identification on Eyewear Computers Using Bone Conduction Through the Skull in CHI 2016. https://dl.acm.org/citation.cfm?id=2858152
- [Wanhui Li] BodyScope: a wearable acoustic sensor for activity recognition in UbiComp 2012. https://dl.acm.org/citation.cfm?id=2370269
- Sensing with smartwatches
- [Chatziantoniou Nikolaos] ViBand: High-fidelity bio-acoustic sensing using commodity smartwatch accelerometers in UIST 2016. https://dl.acm.org/citation.cfm?id=2984582
- [Anran Xu] A practical approach for recognizing eating moments with wrist-mounted inertial sensing in UbiComp 2015. https://dl.acm.org/citation.cfm?id=2807545
- Infrastructure-based sensing
- [Shitao Fang] At the Flick of a Switch: Detecting and Classifying Unique Electrical Events on the Residential Power Line in UbiComp 2007. https://link.springer.com/chapter/10.1007/978-3-540-74853-3_16
- [Cheng Muyao] FluSense: A Contactless Syndromic Surveillance Platform for Influenza-Like Illness in Hospital Waiting Areas in IMWUT 2020. https://dl.acm.org/doi/abs/10.1145/3381014
- Interacting with your mood
- [Hao Zheng] CrossCheck: Toward passive sensing and detection of mental health changes in people with schizophrenia in UbiComp 2016. http://dl.acm.org/citation.cfm?id=2971740
- [Akari Doi] EmotionCheck: leveraging bodily signals and false feedback to regulate our emotions in UbiComp 2016. https://dl.acm.org/citation.cfm?id=2971752
In research discussions, we discuss recently published HCI work that demonstrates strong novelty and/or progress in the field. After the first class, please indicate your preferences on this page.
- Discussion chair: This person plays a central role in stimulating discussions among fellow students. You will have 20-25 minutes in total for your discussion slot. You must read the assigned paper carefully and deliver a 10-minute presentation. After your presentation, you will be expected to lead discussions with fellow students. Your presentation material must be in English, though you may present in either English or Japanese. Your presentation should cover:
- Background of the research,
- Summary of the developed system,
- Novelty and originality of the work, and
- Pros and cons of the system/method.
- Discussion members: The rest of the students serve as discussion members. You must engage in discussions proactively. All discussion members must read the papers before coming to class. You should take notes on your impressions of the papers, in particular:
- What did you like in this work? Why?
- How do you think this work can inspire your research?
- What are possible applications out of this technology?
- What are shortcomings? What improvements do you think this technology needs?
- If you were a reviewer of this paper, how would you rate it, and what feedback would you provide?
- If you were a program committee member and had to pitch this paper to argue for its acceptance or rejection, how would you do it?
- What impressed you about the writing? What presentation techniques do you think we should learn from the paper?
In Class #1, we discuss some vision videos. If you are interested in watching more videos, use the following links to find your favorites.
A capstone project aims to give you experience in building an interactive system with hardware and/or artificial intelligence, delivering a live demonstration, and demonstrating at least one cool application scenario.
You are encouraged to collaborate with fellow students and team up for capstone projects. However, a team should consist of at most three people. Marks for the capstone project will be given equally to all team members.
Your system must be interactive and use some kind of hardware and/or artificial intelligence. You may use anything for your project. Your hardware can be smartphones, cameras, Kinect sensors, and/or something you build yourselves.
You also must demonstrate at least one cool application scenario (excluding games) with your system. Your application does not have to be large-scale or complex, but you have to demonstrate that your system would be useful to potential users rather than just cool.
You will be asked to give a live demonstration in the last class, so make sure that your final system works in real time. Your system will likely perform some sort of recognition (heuristically or with machine learning). The recognition does not have to be super accurate, but it has to work reasonably well.
We do not care what programming languages or environments you use. If you need suggestions or support, please consult the instructor, though we cannot guarantee to provide everything you want.
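To make the idea of heuristic real-time recognition concrete, here is a minimal sketch in Python of a threshold-based "shake" detector over a sliding window of accelerometer magnitudes. The function name, window size, threshold, and sample stream are all hypothetical illustrations, not course requirements; a real project would feed in live sensor readings instead of a fixed list.

```python
from collections import deque

def detect_shake(magnitudes, window=10, threshold=15.0):
    """Heuristic recognizer: report a 'shake' whenever the variance of
    the last `window` accelerometer magnitudes exceeds `threshold`.
    Returns the sample indices at which a shake was detected."""
    buf = deque(maxlen=window)  # sliding window over the stream
    events = []
    for i, m in enumerate(magnitudes):
        buf.append(m)
        if len(buf) == window:
            mean = sum(buf) / window
            var = sum((x - mean) ** 2 for x in buf) / window
            if var > threshold:
                events.append(i)
    return events

# Hypothetical stream: resting (~9.8 m/s^2) followed by vigorous motion.
stream = [9.8, 9.9, 9.7, 9.8, 9.8, 9.9, 9.8, 9.7, 9.8, 9.8,
          2.0, 18.0, 3.0, 19.0, 2.5, 17.5, 3.5, 18.5, 2.0, 19.5]
print(detect_shake(stream))
```

Even a simple heuristic like this can run in real time; the same structure (buffer, feature, decision rule) generalizes to fancier features or a trained classifier.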
You must deliver the following items at the end of the course.
- Project presentation and live demonstration: A roughly 10-minute presentation. It must include a live demonstration of your system.
- Demonstration video: A video showing a demonstration of your system, up to 5 minutes long. MP4 is strongly recommended, but other common video formats (e.g., mpeg, avi, wmv, and mov) are also acceptable.
We evaluate your capstone projects in the following criteria:
- [15%] Originality: The uniqueness of your system in terms of the concept, design, and/or implementation.
- [10%] Implementation thoroughness: The quality of your implementation. Note that your implementation does not have to be super robust, but it should be demoable.
- [15%] Application scenario: The practicality of the application scenarios demonstrated.
- [10%] Presentation delivery: The quality of your presentation (including your live demo and video).
Examples of capstone projects include (but are not limited to):
- Recognizing a user's activities from sensor data on a smartphone
- Detecting gesture input to support a new type of interaction with computers
- Detecting a user's different types of physical exercise
- Creating new visual environments for entertainment
- Generating novel output modalities