BIOSTEC 2018 tutorial: Physiological Computing and Intelligent Adaptation - Stephen Fairclough

Interaction with a physiological computing system represents one approach to the creation of a technology where control is achieved without touch and software responds to the psychological context of the user. The closed-loop logic of these systems describes how raw physiological data from the body and brain is translated into a series of dynamic control inputs and changes at the interface, which are conveyed directly to the user. This process of translation from raw physiology to input control contains a number of steps with significant hurdles, such as the design of wearable sensors that deliver high-quality data in an unobtrusive way, the inference of psychological states from physiological data in everyday life, the detection of artifacts, and the classification of data in real time. These challenges of measurement and signal processing are substantial, but the design of the adaptive controller is central to the user experience. The adaptive controller represents the rationale of the closed loop, describing the way in which data is translated into adaptations and responses at the interface with the user. This component remains relatively unexplored compared to signal processing and classification, but it is the efficacy of the adaptive controller that will largely determine the user experience and the degree of “intelligence” displayed by the system.
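
As an illustration of that closed-loop logic, the Python sketch below strings the stages together: a stand-in sensor read, a crude artifact check, a toy classifier, and a rule-based adaptive controller. Every function here is a hypothetical placeholder rather than part of any particular physiological computing system.

```python
# Illustrative sketch of a biocybernetic closed loop (hypothetical placeholders only).
import random
import time


def read_sample():
    """Stand-in for a wearable sensor stream (e.g. one second of heart rate data)."""
    return 70 + random.gauss(0, 8)


def is_artifact(sample):
    """Crude range check standing in for proper artifact detection."""
    return not (30 <= sample <= 220)


def classify_state(window):
    """Toy classifier: infer the user's state from the recent signal window."""
    return "high_effort" if sum(window) / len(window) > 75 else "low_effort"


def adapt_interface(state):
    """Adaptive controller: translate the inferred state into an interface change."""
    return "reduce task demand" if state == "high_effort" else "restore full task demand"


def run_loop(cycles=10, window_size=5):
    window = []
    for _ in range(cycles):
        sample = read_sample()
        if is_artifact(sample):
            continue  # reject contaminated data before it reaches the classifier
        window = (window + [sample])[-window_size:]
        state = classify_state(window)
        print(f"state={state} -> adaptation={adapt_interface(state)}")
        time.sleep(0.1)  # pacing of the real-time loop


if __name__ == "__main__":
    run_loop()
```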

"Past and Future of Physiological Computing and Creativity..." Prof. Sergi Jorda (PhyCS 2015)

"Past and Future of Physiological Computing and Creativity..." Prof. Sergi Jorda (PhyCS 2015)

Keynote Title: Past and Future of Physiological Computing and Creativity - An Underexplored and Promising Territory
Keynote Lecturer: Sergi Jorda
Keynote Chair: Hugo Plácido da Silva
Presented on: 13/02/2015, ESEO, Angers, Loire Valley, France

Abstract: We humans are highly expressive beings, and not only with respect to language; non-verbal communication has played and will forever play an essential role in all human relationships. It is theorized that the human sclera, the "white of the eye", is unique in the animal kingdom in that it is visible whenever the eye is open. It has evolved this way because of our social nature: a visible sclera makes it easier for one individual to infer where another is looking, increasing the efficacy of this form of non-verbal communication and turning the eye from a sensory organ into a powerful communication tool. Beyond the eyes, the whole human body is a major source of non-verbal expressiveness, both conscious and unconscious. We humans are also highly creative beings. We have created language, music, art... and technology - and the technology we invent advances very quickly. Raw processing power increases at an exponential rhythm and reaches consumers with minimal delay, as desktop computers and console devices considered state-of-the-art five years ago are outstripped by today's pocket-sized mobile devices. However, most current human-computer interfaces continue to constitute an exasperating bottleneck for human expressiveness, lacking context-awareness and the richness and nuances of non-verbal communication.

In this lecture we will first overview the history of HCI from a creative and artistic perspective, from the 1960s to the present day, with a special focus on music and on BCIs and other physiological interfaces that may help complement explicit behaviour with implicit information such as the mental and physiological states of the human body. We will then focus on the last decade, in which a first generation of products for capturing body movement and brain state (from the Nintendo Wii-mote to more recent low-cost BCIs and controllers such as the Leap Motion or the MYO) has entered the consumer marketplace, proving a thirst for multimodal expressive interfaces and a clear desire amongst end users to interact with creative multimedia systems in seamless ways. However, many may argue that the experience for many users is still frustrating. We will thus conclude by exploring why “natural interaction” has not yet met our expectations and what kind of technologies may be needed for the next generation of multimodal, interactive and expressive interfaces.

Presented at the following Conference: PhyCS, 2nd International Conference on Physiological Computing Systems
Conference Website: phycs.org

Physiological Computing

Rob Jacob talks with guest editors Giulio Jacucci, Stephen Fairclough, and Erin T. Solovey about how advancements in physiological computing might someday blur the distinction between where our bodies end and our computers begin. From Computer's October 2015 issue: http://www.computer.org/csdl/mags/co/2015/10/index.html.

Biocybernetic loops: applied psychophysiology and human-computer interaction

Psychophysiology is the science of inferring psychological meaning from physiological processes in the body and brain. Signals from the cardiovascular system and changes in brain activity can provide a covert quantification of effort and motivation during cognitive performance. A series of studies is presented to illustrate how the process of energy mobilisation by the body and brain responds to psychological challenge. The same process of inference is fundamental to the development of a new category of technology called physiological computing, where signals are used for real-time classification of human cognition, motivation, emotion and health. This approach enables the creation of adaptive technologies that monitor the user and respond to changes in emotion and motivation. The potential benefits and costs of this technology will be discussed with reference to applications for health promotion and implications for personal privacy and identity.

Stephen Fairclough obtained his PhD in Psychology from Loughborough University in 2000. As a Professor of Psychophysiology, his research is concerned with applied psychophysiology in the context of human performance. His work has included studies of mental workload, emotion, stress and sleepiness, and he has a particular interest in motivation and mental effort. Stephen also has an active research interest in human-computer interaction (HCI), particularly physiological computing paradigms where data from the human body is used as a control input to a technological system. His work has been published in a range of journals spanning psychology and computer science. He is the co-editor of a collected work (Advances in Physiological Computing) to be published in May 2014 and a member of the editorial board for IEEE Transactions on Affective Computing.
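
As a concrete, if simplified, example of the kind of inference described above, the Python sketch below computes RMSSD, a standard time-domain index of heart rate variability, from hypothetical inter-beat intervals; suppressed variability during a demanding task is commonly read as a marker of increased mental effort. The data are invented for illustration and are not taken from the studies mentioned in this abstract.

```python
# Illustrative only: RMSSD (root mean square of successive differences) as a
# simple heart rate variability index; the RR-interval data below are invented.
import math


def rmssd(rr_intervals_ms):
    """RMSSD over a series of inter-beat (RR) intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))


# Hypothetical RR-interval series recorded at rest and during a demanding task.
rest = [820, 845, 810, 860, 835, 850, 815]
task = [780, 785, 778, 782, 779, 784, 781]

print(f"RMSSD at rest: {rmssd(rest):.1f} ms")
print(f"RMSSD on task: {rmssd(task):.1f} ms (suppressed variability suggests greater effort)")
```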

Physiological Computing and Intelligent Adaptation - Stephen Fairclough

Interaction with a physiological computing system represents one approach to the creation of a technology where control is achieved without touch and software responds to the psychological context of the user. The closed-loop logic of these systems describes how raw physiological data from the body and brain is translated into a series of dynamic control inputs and changes at the interface, which are conveyed directly to the user. This process of translation from raw physiology to input control contains a number of steps with significant hurdles, such as the design of wearable sensors that deliver high-quality data in an unobtrusive way, the inference of psychological states from physiological data in everyday life, the detection of artifacts, and the classification of data in real time. These challenges of measurement and signal processing are substantial, but the design of the adaptive controller is central to the user experience. The adaptive controller represents the rationale of the closed loop, describing the way in which data is translated into adaptations and responses at the interface with the user. This component remains relatively unexplored compared to signal processing and classification, but it is the efficacy of the adaptive controller that will largely determine the user experience and the degree of “intelligence” displayed by the system.
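
To make the role of the adaptive controller more tangible, here is a minimal Python sketch of one possible design: classified user states are mapped onto interface adaptations, with a simple stability rule so that noisy classifications do not cause the interface to oscillate. The states, adaptation rules and stability window are hypothetical design choices, not a prescribed method.

```python
# Illustrative sketch of an adaptive controller only; states, adaptations and the
# hysteresis rule are hypothetical design choices, not a prescribed method.
from collections import deque


class AdaptiveController:
    """Map a stream of classified user states to interface adaptations.

    The same state must be observed `stability` times in a row before the
    controller acts, so the interface does not oscillate on noisy classifications.
    """

    def __init__(self, stability=3):
        self.recent = deque(maxlen=stability)
        self.current_adaptation = "no change"
        self.rules = {
            "overload": "offload tasks to automation",
            "engaged": "no change",
            "underload": "raise task demand",
        }

    def update(self, state):
        self.recent.append(state)
        if len(self.recent) == self.recent.maxlen and len(set(self.recent)) == 1:
            self.current_adaptation = self.rules.get(state, "no change")
        return self.current_adaptation


controller = AdaptiveController()
for s in ["engaged", "overload", "overload", "overload", "engaged"]:
    print(s, "->", controller.update(s))
```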

Computing With Emotions | Peter Robinson | A-Talk

The importance of emotional expression as part of human communication has been understood since the seventeenth century, and has been explored scientifically since Charles Darwin and others in the nineteenth century. Recent advances in Psychology have greatly improved our understanding of the role of affect in communication, perception, decision-making, attention and memory. At the same time, advances in technology mean that it is becoming possible for machines to sense, analyse and express emotions. We can now consider how these advances relate to each other and how they can be brought together to influence future research in perception, attention, learning, memory, communication, decision-making and other applications. This talk will survey recent advances in theories of emotion and affect, their embodiment in computational systems, the implications for general communications, and broader applications. The combination of new results in psychology with new techniques of computation on new technologies will enable new applications in commerce, education, entertainment, security, therapy and everyday life. However, there are important issues of privacy and personal expression that must also be considered.

Prof Peter Robinson, Computer Laboratory, Rainbow Research Group, University of Cambridge. Peter Robinson is Professor of Computer Technology in the Computer Laboratory at the University of Cambridge, where he leads the Rainbow Research Group working on computer graphics and interaction. Professor Robinson's research concerns problems at the boundary between people and computers. This involves investigating new technologies to enhance communication between computers and their users, and new applications to exploit these technologies. He has been leading work for some years on augmented environments in which everyday objects acquire computational properties through user interfaces based on video projection and digital cameras. Recent work has investigated the inference of people's mental states from facial expressions, vocal nuances, body posture and gesture, and other physiological signals, and has also considered the expression of emotions by robots and cartoon avatars.

The A-Talks gather world-renowned experts in fields related to robotics & AI, presenting groundbreaking ideas in domains ranging from robotics and computer science to psychology, language and the social sciences. We want to bring the audience fresh ideas that go just beyond the level of popular science, into the core of today's most advanced research in the field. Our ambition is that if you listen to an A-Talk, there is a good chance you'll hear ideas that you've never heard before.

Natural User Interfaces with Physiological Sensing

Microsoft has innovated continually in developing novel interaction modalities, or natural user interfaces; Surface and Project Natal are two examples. While these modalities rely on sensors and devices situated in the environment, we believe there is a need for new modalities that enhance the mobile experience. We take advantage of sensing technologies that enable us to decode the signals generated by the body. We will demo muscle-computer interfaces (electromyography-based armbands that sense muscular activation directly to infer finger gestures on surfaces and in free space) and bio-acoustic interfaces (mechanical sensors on the body that turn the entire body into a tap-based input device).
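
For a rough sense of how a muscle-computer interface turns muscular activation into gesture labels, the sketch below extracts per-channel RMS features from a window of simulated multi-channel EMG and assigns them to the nearest gesture centroid. The channel count, window length and nearest-centroid classifier are assumptions made for the example, not details of the Microsoft prototypes described here.

```python
# Illustrative sketch of windowed EMG feature extraction and classification;
# channel counts, gestures and centroids are invented for the example.
import numpy as np


def rms_features(emg_window):
    """Root-mean-square amplitude per channel for one window of EMG samples.

    emg_window: array of shape (samples, channels).
    """
    return np.sqrt(np.mean(np.square(emg_window), axis=0))


def nearest_centroid(feature_vec, centroids):
    """Assign the window to the gesture whose training centroid is closest."""
    return min(centroids, key=lambda g: np.linalg.norm(feature_vec - centroids[g]))


rng = np.random.default_rng(0)

# Hypothetical per-gesture centroids learned from labelled training windows
# (4-channel armband; different fingers load different channels).
centroids = {
    "index_tap": np.array([0.8, 0.2, 0.1, 0.1]),
    "pinch":     np.array([0.3, 0.7, 0.6, 0.2]),
    "rest":      np.array([0.05, 0.05, 0.05, 0.05]),
}

# One 200-sample window of simulated 4-channel EMG resembling an index tap.
window = rng.normal(0, [0.8, 0.2, 0.1, 0.1], size=(200, 4))
print(nearest_centroid(rms_features(window), centroids))
```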

Consciousness & Physiology II

Dr. Tony Nader, MD, PhD (MIT, Harvard) reviews scientifically hard and easy problems surrounding consciousness in biology and cognitive science. He proposes that consciousness is primary and only appears as matter.

Physiological Sensing Interface

Microsoft TechFest demo: Physiological Sensing Interface. March 2010

Medical imaging and computational physiology: The IUPS physiome project

The 2007 Hounsfield Lecture: presented by Professor Peter Hunter
