Rob Jacob talks with guest editors Giulio Jacucci, Stephen Fairclough, and Erin T. Solovey about how advancements in physiological computing might someday blur the distinction between where our bodies end and our computers begin. From Computer's October 2015 issue: http://www.computer.org/csdl/mags/co/2015/10/index.html.
Keynote Title: Past and Future of Physiological Computing and Creativity - An Underexplored and Promising Territory Keynote Lecturer: Sergi Jorda Keynote Chair: Hugo Plácido da Silva Presented on: 13/02/2015, ESEO, Angers, Loire Valley, France Abstract: We humans are highly expressive beings, and not only with respect to language; non-verbal communication has played and will forever play an essential role in all human relationships. It is theorized that the human sclera, the "white of the eye", is unique in the animal kingdom in that it is visible whenever the eye is open. It has evolved this way because of our social nature: it makes it easier for one individual to infer where another is looking, increasing the efficacy of this form of non-verbal communication and turning the eye from a sensory organ into a powerful communication tool. Beyond the eyes, the whole human body is a major source of non-verbal expressiveness, both conscious and unconscious. We humans are also highly creative beings. We have created language, music, art... and technology - and the technology we invent advances very quickly. Raw processing power increases at an exponential rate and reaches consumers with minimal delay, as desktop computers and console devices considered state-of-the-art five years ago are being outstripped by today's pocket-sized mobile devices. However, most current human-computer interfaces continue to constitute an exasperating bottleneck for human expressiveness, lacking context-awareness and the richness and nuances of non-verbal communication. In this lecture we will first overview the history of HCI from a creative and artistic perspective, from the 1960s to the present day, with a special focus on music and on BCIs and other physiological interfaces that may help complement explicit behaviour with implicit information such as the mental and physiological states of the human body.
We will then slow down in the last decade, when a first generation of products for capturing body movement and brain state (from the Nintendo Wii-mote to more recent low-cost BCIs and controllers such as Leap Motion or MYO) has entered the consumer marketplace, demonstrating a thirst for multimodal expressive interfaces and a clear desire amongst end users to interact with creative multimedia systems in seamless ways. However, many would argue that the experience is still frustrating for most users. We will thus conclude by exploring why “natural interaction” has not yet met our expectations and what kind of technologies may be needed for the next generation of multimodal interactive and expressive interfaces. Presented at the following Conference: PhyCS, 2nd International Conference on Physiological Computing Systems Conference Website: phycs.org
Title: Physiological Computing to Bridge the Gaps Keynote Lecturer: Juan-Manuel Belda-Lois Presented on: 28/07/2017, Madrid, Spain Abstract: Physiological Computing aims at inferring the “internal” state of an individual from their physiological signals and acting according to this internal state in order to improve their quality of life in general terms or, at least, their user experience. For this reason, this set of methodologies and technologies has important potential for individuals who experience difficulties in reaching the environment from their “internal” needs. Accordingly, Physiological Computing is already improving their lives by bridging the gap from their inner state to the environment. One of the most obvious examples of this potential is the use of BCI and BNCI approaches for communication by people with Locked-In syndrome or Dyskinetic Cerebral Palsy. The latter group, people with dyskinetic cerebral palsy, is especially interesting for these approaches because, although most of them are born with preserved intellectual capacities, the lack of interaction with the environment and the lack of communication with third parties impede the development of their intellectual potential. Moreover, their life expectancy increases as more tools exist to improve their communication. Communication, besides the sharing of ideas with third parties, also includes sharing our internal state. This has been known for years as non-verbal or emotional communication. When this part of communication fails, the communication is felt as “unnatural”. Therefore, the emotional content of communication should be an important part of developing alternative communication systems. Providing the users of this kind of interface with the possibility to express their internal state could have the extra benefit of improving their capacity to manage their own emotions, also bridging the gap towards communicating in a more natural way.
Event's Website: http://www.phycs.org/ Presented at the following Events: PhyCS, 4th International Conference on Physiological Computing Systems
Biofeedback videogames are physiologically driven games that offer opportunities to individually improve emotional self-regulation and produce mental and physical health benefits. To investigate the feasibility of a novel collaborative multiplayer methodology, we created Space Connection, a videogame to promote empathy in teenagers. Space Connection depicts a futuristic adventure aboard a spaceship in which players have to jointly use their powers to solve a set of physics-based puzzles. The game relies on physiological self-regulation to activate the playing partner's powers. Using a low-cost brain-computer interface and a respiration rate sensor, we provided players with two game powers, telekinesis and time manipulation, mapped to changes in attention and relaxation. In this paper we describe the game mechanics in three different scenarios: i) the cryogenic room, ii) the spaceship corridor, and iii) the cargo hold. Finally, we performed a playtest session with 10 users (aged 22.2 ± 5.6) to evaluate the game experience. Results revealed high scores in enjoyment and empathy but low scores on interface control. Our findings contribute to the body of evidence supporting the use of novel biofeedback strategies combined with videogames to promote positive emotions and incentivize collaboration and teamwork.
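The collaborative mechanic described above (one player's self-regulation unlocking the partner's powers) can be sketched roughly as follows. This is a minimal illustration, not Space Connection's actual implementation: the normalized signal ranges, the thresholds, and the smoothing window are all hypothetical assumptions.

```python
# Sketch of a collaborative biofeedback mapping: a player's smoothed
# attention (from a low-cost BCI) and relaxation (from a respiration
# sensor) activate the PARTNER's game powers. All thresholds and the
# smoothing window are illustrative assumptions.
from collections import deque

ATTENTION_THRESHOLD = 0.6   # normalized 0..1; hypothetical value
RELAXATION_THRESHOLD = 0.7  # normalized 0..1; hypothetical value
WINDOW = 5                  # number of recent samples to smooth over

class PlayerState:
    """Tracks one player's recent physiological readings."""
    def __init__(self):
        self.attention = deque(maxlen=WINDOW)   # BCI attention samples
        self.relaxation = deque(maxlen=WINDOW)  # respiration-derived samples

    def update(self, attention_sample, relaxation_sample):
        self.attention.append(attention_sample)
        self.relaxation.append(relaxation_sample)

def _mean(samples):
    return sum(samples) / len(samples) if samples else 0.0

def active_powers(partner_state):
    """Powers granted to a player based on the PARTNER's self-regulation."""
    powers = set()
    if _mean(partner_state.attention) > ATTENTION_THRESHOLD:
        powers.add("telekinesis")
    if _mean(partner_state.relaxation) > RELAXATION_THRESHOLD:
        powers.add("time-manipulation")
    return powers
```

Smoothing over a short window is a common way to keep noisy single samples from flickering a power on and off; the real game may well use a different scheme.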
Dr. Larry Berkelhammer was a guest lecturer in Dr. Erik Peper's Health and Social Sciences course at SFSU. This presentation concentrates on documented medical reports of profound physiological changes that occurred in response to mental stimuli. Get a FREE Preview of "In Your Own Hands..." http://www.larryberkelhammer.com/free-preview The purpose of these videos and my website is to provide evidence-based information on how to live with self-care mastery. It is for all medical patients, caregivers, and advocates who want to learn how to collaborate with physicians to optimize the efficacy of their medical care.
SlideShare: http://www.slideshare.net/AugmentedWorldExpo/mark-billinghurst-university-of-south-australia-the-coming-age-of-empathic-computing Unlike other technologies, AR and VR have the potential to enable people to experience what someone else is seeing, hearing, and feeling. This talk describes the coming age of Empathic Computing and how AR and VR technology can be combined with wearable physiological sensors to create shared empathic experiences. Examples will be shown from current research projects from leading groups around the world, and important directions for future work will be presented. Augmented World Expo (AWE) is back for its seventh year in our largest conference and expo featuring technologies giving us superpowers: augmented reality (AR), virtual reality (VR), and wearable tech. Join over 4,000 attendees from all over the world including a mix of CEOs, CTOs, designers, developers, creative agencies, futurists, analysts, investors, and top press in a fantastic opportunity to learn, inspire, partner, and experience firsthand the most exciting industry of our times. See more at http://AugmentedWorldExpo.com
Professor Rosalind Picard explains her work in using technology to decode and respond to emotion, and how it relates to people with autism. Buy the DVD, book & study materials at http://www.testoffaith.com
Microsoft has innovated continually in developing novel interaction modalities, or natural user interfaces. Surface and Project Natal are two examples. While these modalities rely on sensors and devices situated in the environment, we believe there is a need for new modalities that enhance the mobile experience. We take advantage of sensing technologies that enable us to decode the signals generated by the body. We will demo muscle-computer interfaces, electromyography-based armbands that sense muscular activation directly to infer finger gestures on surfaces and in free space, and bio-acoustic interfaces, mechanical sensors on the body that enable us to turn the entire body into a tap-based input device.
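The EMG-armband pipeline the demo describes (sense muscular activation, extract features, infer a finger gesture) can be sketched in a simplified form. This is an illustrative assumption of how such a pipeline is commonly structured, not Microsoft's actual implementation: the RMS feature, the per-user nearest-centroid classifier, and the gesture names are all hypothetical.

```python
# Illustrative EMG gesture-inference sketch: compute a per-channel
# root-mean-square (RMS) activation feature from a window of armband
# samples, then classify with nearest-centroid against per-user
# calibration data. Feature choice, classifier, and gesture labels
# are assumptions for illustration only.
import math

def rms(channel_samples):
    """RMS amplitude of one channel's window; a standard EMG feature."""
    return math.sqrt(sum(s * s for s in channel_samples) / len(channel_samples))

def featurize(window):
    """window: one list of raw samples per armband electrode channel."""
    return [rms(channel) for channel in window]

def nearest_centroid(features, centroids):
    """centroids: gesture name -> mean feature vector from calibration."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda gesture: dist(features, centroids[gesture]))
```

In practice such systems calibrate per user, since electrode placement and muscle anatomy shift the feature space from session to session.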