Seminar

This seminar (Hauptseminar) is on human-centered interaction with ubiquitous computing systems.

We focus on physiological sensing methods and the resulting novel interaction techniques.

Context

Digitalization has a massive impact on the way we live and work. Many aspects of our society, such as communication, mobility, and money, are being transformed by ubiquitous computing technologies.

The interaction with our environment becomes increasingly ubiquitous. Human-computer interaction, as well as interaction and communication with other people, are more and more mediated through digital technologies.

Interaction technologies, and how we design interaction, are becoming defining factors in how we experience the world. We need to better understand how to put humans and their needs at the center of these advances.

Topics

In the seminar we look at current topics in human-computer interaction in the context of ubiquitous media and computing, such as:

  • Muscle activity (EMG) as input for implicit and explicit interaction [KB] Jakob
    • T. Scott Saponas, Desney S. Tan, Dan Morris, and Ravin Balakrishnan. 2008. Demonstrating the feasibility of using forearm electromyography for muscle-computer interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’08). ACM, New York, NY, USA, 515-524. DOI: https://doi.org/10.1145/1357054.1357138
    • Enrico Costanza, Samuel A. Inverso, Rebecca Allen, and Pattie Maes. 2007. Intimate interfaces in action: assessing the usability and subtlety of EMG-based motionless gestures. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’07). ACM, New York, NY, USA, 819-828. DOI: https://doi.org/10.1145/1240624.1240747
    • Jonghwa Kim, Stephan Mastnik, and Elisabeth André. 2008. EMG-based hand gesture recognition for realtime biosignal interfacing. In Proceedings of the 13th International Conference on Intelligent User Interfaces (IUI ’08). ACM, New York, NY, USA, 30-39. DOI: https://doi.org/10.1145/1378773.1378778
    • Vrana, S. R. (1993). The psychophysiology of disgust: Differentiating negative emotional contexts with facial EMG. Psychophysiology, 30: 279–286. DOI: https://doi.org/10.1111/j.1469-8986.1993.tb03354.x
    • Tan, D., Morris, D., Saponas, S., & Balakrishnan, R. Recognizing gestures from forearm EMG signals. US Patent 8,447,704.
  • Electrical muscle stimulation (EMS) as output modality [EG] Jakob
    • Emi Tamaki, Takashi Miyaki, and Jun Rekimoto. 2011. PossessedHand: techniques for controlling human hands using electrical muscles stimuli. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’11). ACM, New York, NY, USA, 543-552. DOI: https://doi.org/10.1145/1978942.1979018
    • Pedro Lopes, Doğa Yüksel, François Guimbretière, and Patrick Baudisch. 2016. Muscle-plotter: An Interactive System based on Electrical Muscle Stimulation that Produces Spatial Output. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST ’16). ACM, New York, NY, USA, 207-217. DOI: https://doi.org/10.1145/2984511.2984530
    • Max Pfeiffer, Tim Dünte, Stefan Schneegass, Florian Alt, and Michael Rohs. 2015. Cruise Control for Pedestrians: Controlling Walking Direction using Electrical Muscle Stimulation. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI ’15). ACM, New York, NY, USA, 2505-2514. DOI: https://doi.org/10.1145/2702123.2702190
    • https://github.com/PedroLopes/openEMSstim
  • Controlling computers using electrical signals from the brain [JM] Thomas
    • Lin, J. S., Chen, K. C., & Yang, W. C. (2010, May). EEG and eye-blinking signals through a Brain-Computer Interface based control for electric wheelchairs with wireless scheme. In New Trends in Information Science and Service Science (NISS), 2010 4th International Conference on (pp. 731-734). IEEE. Link: http://ieeexplore.ieee.org/abstract/document/5488522/
    • Wolpaw, J. R., McFarland, D. J., Neat, G. W., & Forneris, C. A. (1991). An EEG-based brain-computer interface for cursor control. Electroencephalography and Clinical Neurophysiology, 78(3), 252-259. Link: http://www.sciencedirect.com/science/article/pii/001346949190040B
    • Campbell, A., Choudhury, T., Hu, S., Lu, H., Mukerjee, M. K., Rabbi, M., & Raizada, R. D. (2010, August). NeuroPhone: brain-mobile phone interface using a wireless EEG headset. In Proceedings of the second ACM SIGCOMM workshop on Networking, systems, and applications on mobile handhelds (pp. 3-8). ACM. Link: https://dl.acm.org/citation.cfm?id=1851326
    • Sullivan, T., Delorme, A., & Luo, A. (2012). U.S. Patent No. 8,155,736. Washington, DC: U.S. Patent and Trademark Office. EEG control of devices using sensory evoked potentials Link: https://www.google.com/patents/US8155736
  • Recognizing emotions based on physiological sensors [LS] Thomas
    • McDuff, D., Karlson, A., Kapoor, A., Roseway, A., & Czerwinski, M. (2012, May). AffectAura: an intelligent system for emotional memory. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 849-858). ACM. Link: https://dl.acm.org/citation.cfm?id=2208525
    • Picard, R. W., Vyzas, E., & Healey, J. (2001). Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(10), 1175-1191. Link: http://ieeexplore.ieee.org/abstract/document/954607/
    • Ayzenberg, Y., & Picard, R. W. (2014). FEEL: A system for frequent event and electrodermal activity labeling. IEEE Journal of Biomedical and Health Informatics, 18(1), 266-277. Link: http://ieeexplore.ieee.org/abstract/document/6579690/
    • Kim, J., & André, E. (2008). Emotion recognition based on physiological changes in music listening. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30(12), 2067-2083. Link: http://ieeexplore.ieee.org/abstract/document/4441720/
    • Matraszek, T. A., Fedorovskaya, E. A., Endrikhovski, S., & Parulski, K. A. (2014). U.S. Patent No. 8,630,496. Washington, DC: U.S. Patent and Trademark Office. Method for creating and using affective information in a digital imaging system Link: https://www.google.com/patents/US8630496
  • Detecting Emotions by Observing the Face [MR] Thomas Kosch 
    • McDuff, D., Mahmoud, A., Mavadati, M., Amr, M., Turcot, J., & Kaliouby, R. E. (2016, May). AFFDEX SDK: a cross-platform real-time multi-face expression recognition toolkit. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (pp. 3723-3726). ACM. Link: https://dl.acm.org/citation.cfm?id=2890247
    • McDuff, D., El Kaliouby, R., & Picard, R. W. (2015, September). Crowdsourcing facial responses to online videos. In Affective Computing and Intelligent Interaction (ACII), 2015 International Conference on (pp. 512-518). IEEE. Link: http://ieeexplore.ieee.org/abstract/document/7344618/
    • Feijó Filho, J., Valle, T., & Prata, W. (2014, September). Non-verbal communications in mobile text chat: emotion-enhanced mobile chat. In Proceedings of the 16th International Conference on Human-Computer Interaction with Mobile Devices & Services (pp. 443-446). ACM. Link: https://dl.acm.org/citation.cfm?id=2633576
    • Video indexing based on viewers’ behavior and emotion feedback. U.S. Patent No. 6,585,521. Link: https://www.google.com/patents/US6585521
  • Interaction with information in augmented reality [CS] Pascal, Matthias
  • Haptic interaction in virtual reality [SM] Matthias
    • Azmandian, M., Hancock, M., Benko, H., Ofek, E., & Wilson, A. D. (2016, May). Haptic retargeting: Dynamic repurposing of passive haptics for enhanced virtual reality experiences. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 1968-1979). ACM.
    • Burdea, G. C. Haptic Feedback for Virtual Reality. Link: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.135.6358&rep=rep1&type=pdf
    • Cheng, L. P., Ofek, E., Holz, C., Benko, H., & Wilson, A. D. (2017, May). Sparse Haptic Proxy: Touch Feedback in Virtual Environments Using a General Passive Prop. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 3718-3728). ACM.
  • Pointing and text input in virtual reality [AN] Pascal, Matthias
    • Poupyrev, I., Tomokazu, N., & Weghorst, S. (1998, March). Virtual Notepad: handwriting in immersive VR. In Virtual Reality Annual International Symposium, 1998. Proceedings., IEEE 1998 (pp. 126-132). IEEE. Link: http://ieeexplore.ieee.org/document/658467/
    • Walker, J., Li, B., Vertanen, K., & Kuhl, S. (2017, May). Efficient Typing on a Visually Occluded Physical Keyboard. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 5457-5461). ACM. Link: https://dl.acm.org/citation.cfm?id=3025783
    • Mark McGill, Daniel Boland, Roderick Murray-Smith, and Stephen Brewster. 2015. A Dose of Reality: Overcoming Usability Challenges in VR Head-Mounted Displays. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI ’15). ACM, New York, NY, USA, 2143-2152. DOI: https://doi.org/10.1145/2702123.2702382
    • Bowman, D. A., & Hodges, L. F. An Evaluation of Techniques for Grabbing and Manipulating Remote Objects in Immersive Virtual Environments. Link: http://lsc.univ-evry.fr/~davesne/ens/pub/grab.pdf
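The first topic above covers muscle-computer interfaces built on EMG sensing. As a rough illustration of the kind of signal processing such systems start from, here is a minimal Python sketch of muscle-activation detection via a moving RMS envelope and a fixed threshold; all function names, the window size, and the threshold are invented for illustration and are not taken from the cited papers.

```python
import math

def rms_envelope(samples, window=50):
    """Moving root-mean-square envelope of a raw, zero-centered EMG-like signal."""
    env = []
    for i in range(len(samples)):
        win = samples[max(0, i - window + 1): i + 1]
        env.append(math.sqrt(sum(x * x for x in win) / len(win)))
    return env

def detect_activation(samples, threshold=0.3, window=50):
    """Per-sample flags: True while the envelope exceeds the activation threshold."""
    return [e > threshold for e in rms_envelope(samples, window)]

if __name__ == "__main__":
    # Synthetic trace: rest (small noise) followed by a strong contraction.
    rest = [0.01 * (-1) ** i for i in range(100)]
    burst = [0.8 * (-1) ** i for i in range(100)]
    active = detect_activation(rest + burst)
    print(any(active[:100]), any(active[100:]))  # prints: False True
```

Real muscle-computer interfaces (e.g. Saponas et al., 2008) go well beyond this, with multi-channel electrodes, richer features, and machine-learned classifiers, but the envelope-plus-threshold step conveys the basic idea of turning a noisy physiological signal into a discrete input event.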

Tasks for the event day

  • Proceedings (collect all PDFs, check copyright, build the table of contents, assemble the complete document, create a title page, possibly publish it on Amazon as an e-book):
    • SM
    • LS
    • EG
    • KB
  • Publicity (website with the schedule, promotion of the event via Facebook and Twitter):
    • JM
    • CS
    • MR (lab course)
  • Posters at the event: one poster with all postcards and topic tweets for the seminar, one poster for the lab course, one poster with the schedule and the participants, plus poster stands for the lab-course results (6 posters in total, 3 of which are made by the lab-course participants):
    • AT
    • MR
    • LS (lab course)
  • Organizing the demo requirements for the 3 lab-course participants and planning the exhibition:
    • DS (lab course)

Schedule

    • 02.11.17 (session) initial meeting, discussion of topics, procedure
    • 16.11.17 (session) 60-second talks introducing the topic (each student)
    • 23.11.17 (submission) table of contents for the talk (+ title) and a 120-character tweet (one slide each, as PDF) to seminarws1718@hcilab.org.
    • 23.11.17 (session): discussion of submissions and flyer creation.
    • 11.01.18 (submission & session) submission of first version of the written text, explanation of how to review
    • test talks in groups … sometime between 11.01.18 and 24.01.18 (submit your video)
    • 25.01.18 (session) public presentations from 13:00 to 18:00
    • 08.02.18 (submission) of the final version of the paper, in German (6 to 8 pages; the format will be provided)

For details look at the German seminar page.