Advanced Topics in HCI: Mobile and Ubiquitous Computing (Fall - Winter 2013)


This course is Creative Informatics Industrial Collaboration Program VI (創造情報学連携講義VI) in the Graduate School of Information Science and Technology. If you want to take this course for credit, please make sure to register (contact the department for information about how to register).


This course has two objectives. It covers representative work and recent trends in user interfaces and interactive systems for mobile and ubiquitous computing. It also offers exercises in conducting HCI research: identifying a problem space, collecting ideas through brainstorming, conducting surveys, presenting ideas, and writing a grant proposal. These exercises are done in teams.

Each class will consist of two parts. The first 60 - 70 minutes will cover a research topic in mobile and ubiquitous computing. The remaining 20 - 30 minutes will be a tutorial on design methods used in HCI research, or will cover tips on skills important for research, such as presentation and writing. Assignments are designed to exercise the knowledge and techniques you learn during the classes.

This course will be offered in English. However, assignments may be submitted in either English or Japanese.

Instructor Koji Yatani (矢谷浩司)
Teaching Assistant Genki Furumi (古見元気)
Dates Dec. 16, Monday - Dec. 20, Friday
Lecture Room Room 244, Building 2, Hongo Campus
Instructor's office TBD

Syllabus

Date/Time Class Topic (60 - 70 mins) & Research skill tutorial (20 - 30 mins)
Dec. 16, Monday
16:40 – 18:10
Class #1 [Class Topic] Introduction, Wearable output ( PDF)
Dec. 16, Monday
18:30 – 20:00
Class #2 [Class topic] Wearable input ( PDF)
[Research skill tutorial] Brainstorming methods and principles ( PDF)
Dec. 16, Monday
23:59
Assignment A due
Dec. 17, Tuesday
16:40 – 18:10
Class #3 [Class topic] Activity recognition with wearable sensors ( PDF)
[Research skill tutorial] Surveys ( PDF)
Dec. 18, Wednesday
16:40 – 18:10
Class #4 [Class topic] Interaction on ultra-small devices ( PDF)
[Research skill tutorial] Low-fidelity prototyping and testing ( PDF)
Dec. 18, Wednesday
18:30 – 20:00
Class #5 [Class topic] Mobile devices as a sensing platform ( PDF)
[Research skill tutorial] Presentations: Do's and don'ts ( PDF)
Dec. 18, Wednesday
23:59
Assignment B due
Dec. 19, Thursday
16:40 – 18:10
Class #6 [Class topic] Infrastructure-based sensing ( PDF)
[Research skill tutorial] Writing: Do's and don'ts ( PDF)
Dec. 20, Friday
16:40 – 18:10
Class #7 [Class topic] Final presentation, Concluding remarks ( PDF)
Dec. 20, Friday
During the class
Assignment C due
Dec. 27, Friday
23:59
Assignment D, and Course Survey due



Assignments

If you are taking this course for credit, you must be registered, and must submit the following assignments. These assignments may be submitted in either English or Japanese.


Plagiarism

If any clear instance of plagiarism (e.g., direct copy-and-paste from a paper, book, another assignment submission, or any other material without citation) is spotted, the person or team (all team members) responsible for the submission will automatically receive a zero total score for this course regardless of their performance. The incident will also be immediately reported to the department.

Also read the guideline by Graduate School of Information Science and Technology, The University of Tokyo. http://www.i.u-tokyo.ac.jp/edu/others/pdf/guideline_ja.pdf

If you are unsure about how to cite or quote, please feel free to ask the instructor. You should contact him well in advance before the assignment deadline.


Late submissions

Late submissions may be accepted if they satisfy the following conditions, with a penalty based on the duration.

  • Late under 24 hours: A submission may be accepted with a 25% deduction from the total score.
  • Late under 48 hours: A submission may be accepted with a 50% deduction from the total score.
  • Late over 48 hours: A submission will not be accepted.

If your team has a legitimate justification for a late submission (e.g., a doctor's note), it can be accepted without any penalty. If you are not sure, please contact the instructor before the deadline.


Office hours

Office hours are intended for getting advice from the instructor or teaching assistant on your assignments. You may bring anything you find problematic or unclear, but the instructor and teaching assistant will not directly work on your assignment in any way (e.g., writing up part of the assignment). Treat them more like external consultants or advisors on your work.

Office hours are available only by appointment. If you feel you need to talk to the instructor or teaching assistant, send them an email at least 24 hours before your desired appointment time. This is necessary to make sure that they are available for you. Most likely, office hours will be before the class (somewhere between 3 and 4 pm). Discussions with the instructor and teaching assistant can be in either English or Japanese.


Assignment A: Team formation and introduction [5 points]

Form a team of 3 - 5 people. Try to include people with different backgrounds and skills, because diverse members can contribute to the collaboration in different ways.

After forming your team, let me know about your team. Send me the following information:

  • About the team
    • Team name
  • About each team member
    • Your name
    • Year of your course
    • Your department
    • Your research interests
    • Your expertise and skills
    • Why did you decide to take this course?

Download this form for Assignment A. Fill it out with your team members and submit its PDF version to me. If you do not have a way to make a PDF file, you may submit your assignment as a Word file.

The subject of your email must be “Assignment A ([your team name])”.


Assignment B: Conduct brainstorming [25 points]

In this assignment, you will discuss with your team members mobile and ubiquitous systems or applications you would want to build, and conduct a brainstorming session.

As the project theme, we will focus on applications and services for glass-like devices. Glass-like devices, such as head-mounted displays, have been investigated for decades and have recently started to be commercialized (e.g., Google Glass). Although such devices are now becoming available to the public, it is still unclear how people would use them. This and subsequent assignments will examine what applications and services these devices would enable and how they could influence our lives. For example, suppose you are running a start-up built around glass-like devices. You want to build applications or services that people would love and keep using. Or you may want to create accessories for Google Glass that offer unique interaction capabilities. What would those be? What would Google Glass users want to buy (beyond basic applications or accessories)?

First, check out recent glass-like devices (wearable see-through displays).

Discuss with your team members various ideas and use scenarios with glass-like devices. For example,

  • What applications with glass-like devices would help you perform particular tasks?
  • What interaction techniques would enhance the user experience of glass-like devices? Recall the on-body and around-body sensing technologies we discussed in the classes. How could they be integrated and used?
  • What output modalities beyond visual would make a difference in glass-like devices?
  • How could glass-like devices change our current communication behavior? Compared to mobile phones or other communication terminals, how could they provide improved or enriched communication between people?
  • How could these devices help various user populations, such as people with impairments, elderly people or children?
  • How could we mitigate security and privacy issues around glass-like devices?

Conduct a brainstorming session to explore possible solutions. You are encouraged to follow the brainstorming principles I explained during the class.

Try to come up with at least [the number of team members] * 3 ideas/scenarios (e.g., if your team consists of four members, your team must present at least 12 ideas/scenarios). List the ideas in the assignment. We do not count identical or very similar ideas (e.g., telling you the locations and directions of new restaurants nearby vs. telling you the locations and directions of nearby convenience stores where you can buy a bottle of water). Ideas should also be feasible in the near future. Ideas that substantially overlap with others, or that are clearly infeasible, will result in a score deduction.

When the brainstorming is done, discuss all the ideas with your team members, and decide on the best idea.

In the assignment, you must report:

  • How the brainstorming session was conducted,
  • What ideas you came up with,
  • Which idea your team agreed was the best, and why.

Download this form for Assignment B. Fill it out with your team members and submit its PDF version to me. If you do not have a way to make a PDF file, you may submit your assignment as a Word file.

The subject of your email must be “Assignment B ([your team name])”.


Assignment C: Presentation [30 points]

In this assignment, you must create a presentation of your project and pitch your idea to other students.

Presentation is a very important activity for researchers to communicate and discuss ideas with others. This assignment is intended as an exercise in preparing presentation materials and delivering a presentation. You (as a team) will give your presentation during the class.

Your presentation must cover the following points:

  • Motivation of your project: What problems are you trying to solve? Why are they so important? Who are the target users?
  • Solution proposal: How are you going to solve this problem?
  • Prior work: What are a few examples of representative prior work? Show us at least three research projects that are closely related to your idea, including one that was discussed in this course.
  • Development plan: How will you build your system/application? Explain potential technical challenges as well.

You may use any type of presentation software (e.g., PowerPoint, Keynote, Prezi, etc.). But please make sure that your presentation plays properly on the projector and screen in the classroom. You should double-check this yourself before the class starts. Also bring a laptop or tablet for your presentation (bringing another machine as a back-up would be a good idea).

Each team presentation will be evaluated on the following criteria:

  • Content quality (8 points): How well did your presentation cover the points specified in the assignment?
  • Presentation material quality (8 points): How effectively did the presenters use visual materials? Too much text will result in a deduction.
  • Speech quality (8 points): How smooth was the speech? Was it easy for the audience to follow? Did you speak to the audience rather than keep looking at the projected screen or laptop?
  • Equality of workload in the presentation (6 points): Each team member should be responsible for presenting part of the project. Make sure that you divide the presentation workload equally across all the team members.

Each team member must present at least one section of the presentation. You must prepare your presentation material in English. You may present in either Japanese or English, though English is strongly recommended. I understand that some of you are not native English speakers, which is why the evaluation does not consider low-level English skills (e.g., pronunciation). Think carefully about how you can communicate your project with the vocabulary you have. I am not looking for sophisticated expressions or a poetic performance. Rather, try hard to communicate in a way that lets the audience understand your idea quickly.

Each presentation is 12 minutes long, followed by a 3-minute Q&A session. Thus, 10 - 15 slides would suffice if you use PowerPoint or Keynote. Incorporate the tips I presented during the class into your presentation. For example, too much text or unnecessary visual effects in a slide would be detrimental to the score of “Presentation material quality”.

There is a penalty for running under time (shorter than 10 minutes) or over time (longer than 14 minutes). The amount of the deduction will be determined by how far under or over time your presentation runs.

After the class, please send me all the materials you used for your presentation. Alternatively, you may hand me a copy after the class.


Assignment D: Grant proposal [40 points]

In this assignment, you must write up a grant proposal for your project. A grant proposal is as important as a publication because it is the main way for researchers to obtain funding. This assignment is intended to offer an exercise on this important activity as a researcher.

Your team is now going to apply for a one-year project grant. It will provide relatively small funding (15k - 30k USD; 1.5M - 3M JPY), but this would be sufficient for an initial phase of research (e.g., creating a working proof-of-concept and attending a conference to present the work).

Download this form for Assignment D. Fill it out with your team members and submit its PDF version to me. If you do not have a way to make a PDF file, you may submit your assignment as a Word file. In the form, you must fill out:

  • Project title
  • Project members
  • Abstract (150 - 300 words)
  • Descriptions
    • Problems to be investigated, and their social and technical importance
    • Proposed solutions to solve the problems
    • Research approaches and implementation methods
    • Include diagrams or algorithms if appropriate
  • Expected outcomes and deliverables
  • Project schedule
  • Estimated budget (in USD or JPY)
  • Previous and related work

The proposal must be at least 4 pages, but no more than 10 pages.

The subject of your email must be “Assignment D ([your team name])”.



Course Survey

Evaluation and feedback to the course and instructor [Optional but greatly appreciated]

It is really important for me to have honest feedback from you. It is always a great opportunity for me to learn as an instructor and to understand how a course like this can be improved in the future.

To collect your honest opinions, I have made the survey anonymous. You do not have to reveal your identity unless you really want to. I also promise that this survey does not affect your course score in any way.

If you are willing to offer your feedback to me, click this link to complete the survey. There are only a few questions, and it will take only 10 - 15 minutes (I believe).

The survey results may be shared with other professors at the same university. The survey is anonymized, but I will ensure that all identifiable information is removed before sharing.



References & Readings

Class References
Class #1 Weiser, Mark. “The computer for the 21st century.” Scientific american 265.3 (1991): 94-104.
Mann, Steve (2013): Wearable Computing. In: Soegaard, Mads and Dam, Rikke Friis (eds.). “The Encyclopedia of Human-Computer Interaction, 2nd Ed.”
Sutherland, Ivan E. “A head-mounted three dimensional display.” Proceedings of the December 9-11, 1968, fall joint computer conference, part I. ACM, 1968.
Rekimoto, Jun, and Katashi Nagao. “The world through the computer: Computer augmented interaction with real world environments.” Proceedings of the 8th annual ACM symposium on User interface and software technology. ACM, 1995.
Feiner, Steven, Blair Macintyre, and Dorée Seligmann. “Knowledge-based augmented reality.” Communications of the ACM 36.7 (1993): 53-62.
Henderson, Steven J., and Steven K. Feiner. “Augmented reality in the psychomotor phase of a procedural task.” Mixed and Augmented Reality (ISMAR), 2011 10th IEEE International Symposium on. IEEE, 2011.
Rhodes, Bradley J. “The wearable remembrance agent: A system for augmented memory.” Personal Technologies 1.4 (1997): 218-224.
Lyons, Kent, et al. “Facet: a multi-segment wrist worn system.” Proceedings of the 25th annual ACM symposium on User interface software and technology. ACM, 2012.
Harrison, Chris, Hrvoje Benko, and Andrew D. Wilson. “OmniTouch: wearable multitouch interaction everywhere.” Proceedings of the 24th annual ACM symposium on User interface software and technology. ACM, 2011.
Tan, Hong Z., and Alex Pentland. “Tactual displays for wearable computing.” Personal Technologies 1.4 (1997): 225-230.
Tsukada, Koji, and Michiaki Yasumura. “Activebelt: Belt-type wearable tactile display for directional navigation.” UbiComp 2004: Ubiquitous Computing. Springer Berlin Heidelberg, 2004. 384-399.
Huang, Kevin, et al. “Mobile music touch: mobile tactile stimulation for passive learning.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2010.
Tamaki, Emi, Takashi Miyaki, and Jun Rekimoto. “PossessedHand: techniques for controlling human hands using electrical muscles stimuli.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2011.
Yamada, Tomoya, et al. “Wearable olfactory display: Using odor in outdoor environment.” Virtual Reality Conference, 2006. IEEE, 2006.
Electricfoxy talks with Asta Roseway at Microsoft Research. http://vimeo.com/26806334
Class #2 Lyons, Kent, et al. “Twiddler typing: One-handed chording text entry for mobile phones.” Proceedings of the SIGCHI conference on Human factors in computing systems. ACM, 2004.
Sawhney, Nitin, and Chris Schmandt. “Nomadic radio: speech and audio interaction for contextual messaging in nomadic environments.” ACM transactions on Computer-Human interaction (TOCHI) 7.3 (2000): 353-383.
Morris, Dan, T. Scott Saponas, and Desney Tan. “Emerging input technologies for always-available mobile interaction.” Foundations and Trends in Human–Computer Interaction 4.4 (2010): 245-316.
Fukumoto, Masaaki, and Yasuhito Suenaga. ““FingeRing”: a full-time wearable interface.” Conference companion on Human factors in computing systems. ACM, 1994.
Fukumoto, Masaaki, and Yoshinobu Tonomura. ““Body coupled FingerRing”: wireless wearable keyboard.” Proceedings of the ACM SIGCHI Conference on Human factors in computing systems. ACM, 1997.
Rekimoto, Jun. “Gesturewrist and gesturepad: Unobtrusive wearable interaction devices.” Wearable Computers, 2001. Proceedings. Fifth International Symposium on. IEEE, 2001.
Kim, Jungsoo, et al. “The gesture watch: A wireless contact-free gesture based wrist interface.” Wearable Computers, 2007 11th IEEE International Symposium on. IEEE, 2007.
Starner, Thad, et al. “The gesture pendant: A self-illuminating, wearable, infrared computer vision system for home automation control and medical monitoring.” Wearable Computers, The Fourth International Symposium on. IEEE, 2000.
Mistry, Pranav, Pattie Maes, and Liyan Chang. “WUW-wear Ur world: a wearable gestural interface.” CHI'09 extended abstracts on Human factors in computing systems. ACM, 2009.
Gustafson, Sean, Daniel Bierwirth, and Patrick Baudisch. “Imaginary interfaces: spatial interaction with empty hands and without visual feedback.” Proceedings of the 23nd annual ACM symposium on User interface software and technology. ACM, 2010.
Kim, David, et al. “Digits: freehand 3D interactions anywhere using a wrist-worn gloveless sensor.” Proceedings of the 25th annual ACM symposium on User interface software and technology. ACM, 2012.
Bailly, Gilles, et al. “ShoeSense: a new perspective on gestural interaction and wearable applications.” Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems. ACM, 2012.
Yang, Xing-Dong, et al. “Magic finger: always-available input through finger instrumentation.” Proceedings of the 25th annual ACM symposium on User interface software and technology. ACM, 2012.
Saponas, T. Scott, et al. “Enabling always-available input with muscle-computer interfaces.” Proceedings of the 22nd annual ACM symposium on User interface software and technology. ACM, 2009.
Harrison, Chris, Desney Tan, and Dan Morris. “Skinput: appropriating the body as an input surface.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2010.
Solovey, Erin Treacy, et al. “Using fNIRS brain sensing in realistic HCI settings: experiments and guidelines.” Proceedings of the 22nd annual ACM symposium on User interface software and technology. ACM, 2009.
Manabe, Hiroyuki, and Masaaki Fukumoto. “Full-time wearable headphone-type gaze detector.” CHI'06 extended abstracts on Human factors in computing systems. ACM, 2006.
Saponas, T. Scott, et al. “Optically sensing tongue gestures for computer input.” Proceedings of the 22nd annual ACM symposium on User interface software and technology. ACM, 2009.
Sato, Munehiko, Ivan Poupyrev, and Chris Harrison. “Touché: enhancing touch interaction on humans, screens, liquids, and everyday objects.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2012.
Kim, Dae-Hyeong, et al. “Epidermal electronics.” Science 333.6044 (2011): 838-843.
Holz, Christian, et al. “Implanted user interfaces.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2012.
Class #3 Ashbrook, Daniel, and Thad Starner. “Learning significant locations and predicting user movement with GPS.” Wearable Computers, 2002.(ISWC 2002). Proceedings. Sixth International Symposium on. IEEE, 2002.
Liao L., Fox D. and Kautz H. Location-based activity recognition. In Proceedings of Advances in Neural information Processing Systems, 2005.
Farringdon, Jonny, et al. “Wearable sensor badge and sensor jacket for context awareness.” The Third International Symposium on Wearable Computers, IEEE, 1999.
Van Laerhoven, Kristof, and Ozan Cakmakci. “What shall we teach our pants?.” Wearable Computers, The Fourth International Symposium on. IEEE, 2000.
Bao, Ling, and Stephen S. Intille. “Activity recognition from user-annotated acceleration data.” Pervasive Computing. Springer Berlin Heidelberg, 2004. 1-17.
Fan, Mingming, et al. “Augmenting gesture recognition with erlang-cox models to identify neurological disorders in premature babies.” Proceedings of the 2012 ACM Conference on Ubiquitous Computing. ACM, 2012.
Van Laerhoven, Kristof, and H-W. Gellersen. “Spine versus porcupine: A study in distributed wearable activity recognition.” Wearable Computers, 2004. ISWC 2004. Eighth International Symposium on. Vol. 1. IEEE, 2004.
Patterson, Donald J., et al. “Fine-grained activity recognition by aggregating abstract object usage.” Wearable Computers, 2005. Proceedings. Ninth IEEE International Symposium on. IEEE, 2005.
Clarkson, Brian, Kenji Mase, and Alex Pentland. “Recognizing user context via wearable sensors.” Wearable Computers, The Fourth International Symposium on. IEEE, 2000.
Maekawa, Takuya, et al. “Object-based activity recognition with heterogeneous sensors on wrist.” Pervasive Computing. Springer Berlin Heidelberg, 2010. 246-264.
Maekawa, Takuya, et al. “Recognizing handheld electrical device usage with hand-worn coil of wire.” Pervasive Computing. Springer Berlin Heidelberg, 2012. 234-252.
Bulling, Andreas, et al. “Eye movement analysis for activity recognition.” Proceedings of the 11th international conference on Ubiquitous computing. ACM, 2009.
Ishiguro, Yoshio, et al. “Aided eyes: eye activity sensing for daily life.” Proceedings of the 1st Augmented Human International Conference. ACM, 2010.
Peltonen, Vesa, et al. “Computational auditory scene recognition.” Acoustics, Speech, and Signal Processing (ICASSP), 2002 IEEE International Conference on. Vol. 2. IEEE, 2002.
Lyons, Kent, et al. “Augmenting conversations using dual-purpose speech.” Proceedings of the 17th annual ACM symposium on User interface software and technology. ACM, 2004.
Amft, Oliver, et al. “Analysis of chewing sounds for dietary monitoring.” UbiComp 2005: Ubiquitous Computing. Springer Berlin Heidelberg, 2005. 56-72.
Yatani, Koji, and Khai N. Truong. “BodyScope: a wearable acoustic sensor for activity recognition.” Proceedings of the 2012 ACM Conference on Ubiquitous Computing. ACM, 2012.
Li, Cheng-Yuan, et al. “Sensor-embedded teeth for oral activity recognition.” Wearable Computers, 2013. Proceedings. IEEE International Symposium on. IEEE, 2013.
Cheng, Jingyuan, Oliver Amft, and Paul Lukowicz. “Active capacitive sensing: Exploring a new wearable sensing modality for activity recognition.” Pervasive Computing. Springer Berlin Heidelberg, 2010. 319-336.
Natarajan, Annamalai, et al. “Detecting cocaine use with wearable electrocardiogram sensors.” Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing. ACM, 2013.
Lukowicz, Paul, et al. “Recognizing workshop activity using body worn microphones and accelerometers.” Pervasive Computing. Springer Berlin Heidelberg, 2004. 18-32.
Class #4 Hinckley, Ken, et al. “Sensing techniques for mobile interaction.” Proceedings of the 13th annual ACM symposium on User interface software and technology. ACM, 2000.
Yee, Ka-Ping. “Peephole displays: pen interaction on spatially aware handheld computers.” Proceedings of the SIGCHI conference on Human factors in computing systems. ACM, 2003.
Partridge, Kurt, et al. “TiltType: accelerometer-supported text entry for very small devices.” Proceedings of the 15th annual ACM symposium on User interface software and technology. ACM, 2002.
Vogel, Daniel, and Patrick Baudisch. “Shift: a technique for operating pen-based interfaces using touch.” Proceedings of the SIGCHI conference on Human factors in computing systems. ACM, 2007.
Yatani, Koji, et al. “Escape: a target selection technique using visually-cued gestures.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2008.
Wigdor, Daniel, et al. “Lucid touch: a see-through mobile device.” Proceedings of the 20th annual ACM symposium on User interface software and technology. ACM, 2007.
Baudisch, Patrick, and Gerry Chu. “Back-of-device interaction allows creating very small touch devices.” Proceedings of the SIGCHI conference on Human factors in computing systems. ACM, 2009.
Butler, Alex, Shahram Izadi, and Steve Hodges. “SideSight: multi-touch interaction around small devices.” Proceedings of the 21st annual ACM symposium on User interface software and technology. ACM, 2008.
Harrison, Chris, and Scott E. Hudson. “Abracadabra: wireless, high-precision, and unpowered finger input for very small mobile devices.” Proceedings of the 22nd annual ACM symposium on User interface software and technology. ACM, 2009.
Harrison, Chris, and Scott E. Hudson. “Minput: enabling interaction on small mobile devices with high-precision, low-cost, multipoint optical tracking.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2010.
Perrault, Simon T., et al. “Watchit: simple gestures and eyes-free interaction for wristwatches and bracelets.” Proceedings of the 2013 ACM annual conference on Human factors in computing systems. ACM, 2013.
Ashbrook, Daniel, Patrick Baudisch, and Sean White. “Nenya: subtle and eyes-free mobile input with a magnetically-tracked finger ring.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2011.
Chen, Ke-Yu, et al. “uTrack: 3D input using two magnetic sensors.” Proceedings of the 26th annual ACM symposium on User interface software and technology. ACM, 2013.
Su, Chao-Huai, et al. “NailDisplay: bringing an always available visual display to fingertips.” Proceedings of the 2013 ACM annual conference on Human factors in computing systems. ACM, 2013.
Class #5 Lane, Nicholas D., et al. “A survey of mobile phone sensing.” Communications Magazine, IEEE 48.9 (2010): 140-150.
Khan, W., et al. “Mobile phone sensing systems: A survey.” IEEE COMMUNICATIONS SURVEYS & TUTORIALS (2012): 1-26.
Consolvo, Sunny, et al. “Activity sensing in the wild: a field trial of ubifit garden.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2008.
Choudhury, Tanzeem, et al. “The mobile sensing platform: An embedded activity recognition system.” Pervasive Computing, IEEE 7.2 (2008): 32-41.
Lu, Hong, et al. “SoundSense: scalable sound sensing for people-centric applications on mobile phones.” Proceedings of the 7th international conference on Mobile systems, applications, and services. ACM, 2009.
Scott, Jeremy, et al. “Sensing foot gestures from the pocket.” Proceedings of the 23nd annual ACM symposium on User interface software and technology. ACM, 2010.
Yang, Xing-Dong, et al. “Surround-see: enabling peripheral vision on smartphones during active use.” Proceedings of the 26th annual ACM symposium on User interface software and technology. ACM, 2013.
Patel, Shwetak N., et al. “Farther than you may think: An empirical investigation of the proximity of users to their mobile phones.” UbiComp 2006: Ubiquitous Computing. Springer Berlin Heidelberg, 2006. 123-140.
Dey, Anind K., et al. “Getting closer: an empirical investigation of the proximity of user to their smart phones.” UbiComp 2011: Ubiquitous Computing. 2011.
Burke, Jeffrey A., et al. “Participatory sensing.“ Workshop on World-Sensor-Web (2006).
Eagle, Nathan, and Alex Pentland. “Reality mining: sensing complex social systems.” Personal and ubiquitous computing 10.4 (2006): 255-268.
Mun, Min, et al. “PEIR, the personal environmental impact report, as a platform for participatory sensing systems research.” Proceedings of the 7th international conference on Mobile systems, applications, and services. ACM, 2009.
Miluzzo, Emiliano, et al. “CenceMe–injecting sensing presence into social networking applications.” Smart Sensing and Context. Springer Berlin Heidelberg, 2007. 1-28.
Hayes, Gillian R., et al. “The personal audio loop: Designing a ubiquitous audio-based memory aid.” Mobile Human-Computer Interaction-MobileHCI 2004. Springer Berlin Heidelberg, 2004. 168-179.
Rana, Rajib Kumar, et al. “Ear-phone: an end-to-end participatory urban noise mapping system.” Proceedings of the 9th ACM/IEEE International Conference on Information Processing in Sensor Networks. ACM, 2010.
Thiagarajan, Arvind, et al. “VTrack: accurate, energy-aware road traffic delay estimation using mobile phones.” Proceedings of the 7th ACM Conference on Embedded Networked Sensor Systems. ACM, 2009.
Mohan, Prashanth, Venkata N. Padmanabhan, and Ramachandran Ramjee. “Nericell: rich monitoring of road and traffic conditions using mobile smartphones.” Proceedings of the 6th ACM conference on Embedded network sensor systems. ACM, 2008.
Poh, Ming-Zher, Daniel J. McDuff, and Rosalind W. Picard. “Non-contact, automated cardiac pulse measurements using video imaging and blind source separation.“ Optics Express, (2010).
Larson, Eric C., et al. “SpiroSmart: using a microphone to measure lung function on a mobile phone.” Proceedings of the 2012 ACM Conference on Ubiquitous Computing. ACM, 2012.
Class #6 Kidd, Cory D., et al. “The aware home: A living laboratory for ubiquitous computing research.” Cooperative buildings. Integrating information, organizations, and architecture. Springer Berlin Heidelberg, 1999. 191-198.
Want, Roy, et al. “The active badge location system.” ACM Transactions on Information Systems (TOIS) 10.1 (1992): 91-102.
Bahl, Paramvir, and Venkata N. Padmanabhan. “RADAR: An in-building RF-based user location and tracking system.” INFOCOM 2000. Nineteenth Annual Joint Conference of the IEEE Computer and Communications Societies. IEEE. Vol. 2., 2000.
Wilson, Joey, and Neal Patwari. “Radio tomographic imaging with wireless networks.” Mobile Computing, IEEE Transactions on 9.5 (2010): 621-632.
Rekimoto, Jun, Takashi Miyaki, and Takaaki Ishizawa. “LifeTag: WiFi-based continuous location logging for life pattern analysis.” LoCA. 2007.
Pu, Qifan, et al. “Whole-home gesture recognition using wireless signals.” Proceedings of the 19th annual international conference on Mobile computing & networking. ACM, 2013.
Patel, Shwetak N., Khai N. Truong, and Gregory D. Abowd. “Powerline positioning: A practical sub-room-level indoor location system for domestic use.” UbiComp 2006: Ubiquitous Computing. Springer Berlin Heidelberg, 2006. 441-458.
Gupta, Sidhant, Matthew S. Reynolds, and Shwetak N. Patel. “ElectriSense: single-point sensing using EMI for electrical event detection and classification in the home.” Proceedings of the 12th ACM international conference on Ubiquitous computing. ACM, 2010.
Cohn, Gabe, et al. “Your noise is my command: sensing gestures using the body as an antenna.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2011.
Cohn, Gabe, et al. “Humantenna: using the body as an antenna for real-time whole-body interaction.” Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems. ACM, 2012.
Cohn, Gabe, et al. “GasSense: Appliance-level, single-point sensing of gas activity in the home.” Pervasive Computing. Springer Berlin Heidelberg, 2010. 265-282.
Patel, Shwetak N., Matthew S. Reynolds, and Gregory D. Abowd. “Detecting human movement by differential air pressure sensing in HVAC system ductwork: An exploration in infrastructure mediated sensing.” Pervasive Computing. Springer Berlin Heidelberg, 2008. 1-18.
Gupta, Sidhant, et al. “LightWave: using compact fluorescent lights as sensors.” Proceedings of the 13th international conference on Ubiquitous computing. ACM, 2011.
Scott, James, et al. “PreHeat: controlling home heating using occupancy prediction.” Proceedings of the 13th international conference on Ubiquitous computing. ACM, 2011.
Research skill tutorials How to use the Disney method. http://www.youtube.com/watch?v=hE2fZYTdIqA
How not to brainstorm. http://www.youtube.com/watch?v=ttWhK-NO4g8
Rettig, Marc. “Prototyping for tiny fingers.” Communications of the ACM 37.4 (1994): 21-27.
Hanmail paper prototype. http://www.youtube.com/watch?v=GrV2SZuRPv0
Maverick Innovation Lab and Price Engineering “Think-Aloud Testing”. http://www.youtube.com/watch?v=K1CgAjgUM0g
Microsoft Productivity Future Vision (2011). http://www.youtube.com/watch?v=a6cNdhOKwi0
Hartley, James. Academic writing and publishing: A practical handbook. Routledge, 2008. http://gate.ac.uk/sale/dd/related-work/Academic+Writing+and+Publishing+-+A+Practical+Handbook.pdf
Writing Research Papers (by Aaron Hertzmann). http://www.dgp.toronto.edu/~hertzman/advice/writing-technical-papers.pdf
How to write a research paper (by David R. Caprette). http://www.ruf.rice.edu/~bioslabs/tools/report/reportform.html
Writing Scientific Papers. http://www.nature.com/scitable/ebooks/english-communication-for-scientists-14053993/writing-scientific-papers-14239285
Reynolds, Garr. Presentation Zen: Simple ideas on presentation design and delivery. New Riders, 2011.
Life After Death by PowerPoint 2012 by Don McMillan. http://www.youtube.com/watch?v=MjcO2ExtHso
Additional readings Krumm, John, ed. Ubiquitous computing fundamentals. CRC Press, 2009.
Abowd, Gregory D., and Elizabeth D. Mynatt. “Charting past, present, and future research in ubiquitous computing.” ACM Transactions on Computer-Human Interaction (TOCHI) 7.1 (2000): 29-58.
Weiser, Mark, Rich Gold, and John Seely Brown. “The origins of ubiquitous computing research at PARC in the late 1980s.” IBM systems journal 38.4 (1999): 693-696.
Edwards, W. Keith, and Rebecca E. Grinter. “At home with ubiquitous computing: seven challenges.” Ubicomp 2001: Ubiquitous Computing. Springer Berlin Heidelberg, 2001.
Amft, Oliver, and Paul Lukowicz. “From backpacks to smartphones: past, present, and future of wearable computers.” IEEE Pervasive Computing 8.3 (2009): 8-13.
Sellen, Abigail J., and Steve Whittaker. “Beyond total capture: a constructive critique of lifelogging.” Communications of the ACM 53.5 (2010): 70-77.
Abowd, Gregory D. “What next, ubicomp?: celebrating an intellectual disappearing act.” Proceedings of the 2012 ACM Conference on Ubiquitous Computing. ACM, 2012.


2013falltopicshci/start.txt · Last modified: 2013/12/28 00:28 by Koji Yatani