Robopsychologie
JKU - MA Psychologie
Card Deck Details
Cards | 111 |
---|---|
Language | English |
Category | Technology |
Level | University |
Created / Updated | 21.06.2020 / 25.10.2020 |
Weblink | https://card2brain.ch/box/20200621_robopsychologie |
EU Definition of Artificial Intelligence
sense - think (plan) - act
Machine Learning
- (Deep Learning, Reinforcement Learning)
- Reasoning - information processing (Search, Planning, Knowledge) - Decision making
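A minimal sketch of the "sense - think (plan) - act" cycle named in the EU definition; the environment, sensor reading and actions below are invented purely for illustration.

```python
# Hypothetical sketch of the EU "sense - think (plan) - act" cycle.
# The environment and sensor values are made up for illustration only.

def sense(environment):
    # Read a (fake) distance sensor from the environment.
    return {"distance_to_obstacle": environment["distance_to_obstacle"]}

def think(percept):
    # Plan: stop if an obstacle is close, otherwise keep moving.
    if percept["distance_to_obstacle"] < 0.5:
        return "stop"
    return "move_forward"

def act(action, environment):
    # Apply the chosen action to the (fake) environment.
    if action == "move_forward":
        environment["distance_to_obstacle"] -= 0.1
    return environment

environment = {"distance_to_obstacle": 1.0}
for step in range(10):
    percept = sense(environment)
    action = think(percept)
    environment = act(action, environment)
    print(step, percept, action)
```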
Different research fields at Pichler's institute:
- Object Detection
- Semantic Scene Segmentation
- Human Pose Detection
- Deep Reinforcement Learning
- Object Detection (3D needed for robotics; video with a dog and a bike)
- Semantic Scene Segmentation (autonomous driving; what is the scene about? Meaning)
- Human Pose Detection (video with athletes and dancers; here too, 2D already works relatively well, 3D does not yet)
- Deep Reinforcement Learning (robots learning to move)
- Online Training (robot learns objects; robot dealing with unknown situations or objects)
- Learning Robot Grasping Policies (gripper arms, grasping objects)
- Imitation Learning (video of filling a glass)
- Deep Reinforcement Learning (sorting out a bin, making space to grab and place things; see the sketch after this list)
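As referenced in the last item above, a toy sketch of the reinforcement-learning idea behind robots learning to move or grasp: tabular Q-learning on an invented one-dimensional "reach the object" task. Real robotic systems replace the table with deep networks; the states, actions and rewards here are made up for illustration.

```python
# Toy tabular Q-learning sketch; the environment is a made-up line of 5 cells
# where the "object" sits in the last cell. Not an actual robot-learning setup.
import random

n_states, actions = 5, [+1, -1]          # move right / move left
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.1

for episode in range(200):
    state = 0
    while state != n_states - 1:
        # epsilon-greedy action selection
        if random.random() < epsilon:
            action = random.choice(actions)
        else:
            action = max(actions, key=lambda a: Q[(state, a)])
        next_state = min(max(state + action, 0), n_states - 1)
        reward = 1.0 if next_state == n_states - 1 else 0.0
        # Q-learning update rule
        best_next = max(Q[(next_state, a)] for a in actions)
        Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
        state = next_state

# Learned policy: best action per state (should be "move right" everywhere).
print({s: max(actions, key=lambda a: Q[(s, a)]) for s in range(n_states)})
```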
What is trust?
What is a trustful person?
Cambridge Dictionary:
Trust is to believe that someone is good and honest and will not harm you, or that something is safe and reliable
Trustful person:
- Consistency and reliability (meet the expectations, avoid surprises and risks)
- Adequacy and adaptability
- Execution (always competent and professional)
- Honesty and openness (communicate, inform and explain; point to room for improvement)
Trustworthy Robots: Safety, Credibility and Explainability
EU guidelines for Trustworthy AI, published April 8, 2019: What does this document do?
- It talks about requirements of trustworthy AI
- Technical methods for realising explainable AI
- Also, how to assess whether an existing system is trustworthy
Framework for Trustworthy AI
Introduction (3)
Chapter 1 (2)
Chapter 2
Chapter 3
Introduction: Lawful AI, Ethical AI, Robust AI
Chapter 1: Foundations of Trustworthy AI -> 4 Ethical Principles (Respect for human autonomy, Prevention of harm, Fairness, Explicability)
Chapter 2: Realisation of Trustworthy AI -> 7 Key Requirements (technical and non-technical methods)
1. Human agency and oversight
2. Technical robustness and safety
3. Privacy and data governance
4. Transparency
5. Diversity, non-discrimination and fairness
6. Societal and environmental wellbeing
7. Accountability
Chapter 3: Assessment of Trustworthy AI -> Trustworthy AI Assessment List
7 key requirements of trustworthy AI
Again: 7 Key Requirements of Trustworthy AI:
- Human agency and oversight
- Technical Robustness and safety
- Privacy and Data governance
- Transparency
- Diversity, non-discrimination and fairness
- Societal and environmental wellbeing
- Accountability (Auditability and accounting)
Technical Methods for Trustworthy AI (5)
- Architectures for Trustworthy AI
- Ethics and rule of law by design
- Explanation methods (XAI)
- Testing and validating
- Quality of Service Indicators
Difference between Trust and Credibility?
Trust comes from the heart (firm belief in the reliability, truth or ability of someone or something).
Credibility needs some sort of justification or proof (head): able to be believed in, justifying confidence.
A credible robot system adheres to guidelines and standards from the technical perspective while at the same time being constructed with its effect on the user's mind in mind.
We aim to provide credibility guidelines and technical architectures that, when followed, give a robot system "certified" trustworthiness (which is similar to the current robotic safety approach)
What is CredRoS?
CredRoS - Credible and Safe Robot Systems
- Sensitive manipulation and robot safety
- Dynamic detection of the environment
- Sensory perception of the human being
- Multimodal Human-Robot Interaction
- Task planning and task execution
- Demonstration
- Explore
- Develop and integrate for demonstration
- Contribute
Here you see different points of view. The horizontal axis is the timeline, starting at the present and moving through different steps; different points of view at different times.
The first thing is to sense something via sensors and input data. This is interpreted according to the environment and context, and also according to the history.
Interact or react is what the robot can do immediately.
It can also plan something for the future.
The next step is to preserve (store).
________
Situation Context (immediate)
Reflex Context (nearly immediate; given the context, the robot should be able to foresee something)
Safety Context
Tactile Perception
Somatosensation can be split up into 3 sensory systems.
- Hapsis or Touch
- Nociception (Temperature and Pain)
- Proprioception (Body Awareness)
What is Proprioception?
Body awareness
- Sensory information from muscles, tendons and ligaments
- Bodily position and awareness
Proprioceptive activities are:
- Jumping on a Trampoline
- Climbing a rockwall
- Pulling a heavy wagon
- Monkey bars
Explain Proprioception
The brain, vestibular organs, eyes, etc.
The brain receives and interprets information from multiple inputs:
Vestibular organs in the inner ear send information about rotation, acceleration and position
Eyes send visual information
Stretch receptors in skin, muscles and joints send information about the position of body parts
Proprioception Task
- Activity: Find your fingertips
- Close your eyes, raise both hands above your head and keep the fingers of your left hand totally still. With your right hand, quickly touch your index fingertip to your nose, then quickly touch the tip of the thumb of your left hand with the tip of your right index finger. Quickly repeat the entire process while attempting to touch each fingertip (always return to your nose in between fingertip attempts).
Try again, but this time, wiggle the fingers of your raised hand while you're doing this.
Switch hands and try again. How successfully did you locate each fingertip? Did you improve with time? Was there a difference when you used your right versus your left hand?
Tactile Perception:
Phantom Limb Sensations
If a limb is lost through an accident or amputation – around 60% of patients experience phantom limb sensations
The patient feels the presence of their lost limbs
Why?
This happens because neurons in the somatosensory cortex that received input from the sensory receptors of the amputated limb now receive input from the neighbouring regions of the body
This leads the patient to feel that the limb is still there
Phantom Limb Pain
Amputees often have to deal with phantom limb pain
Explanation by Neuroscientist Vilayanur Ramachandran:
If you pick up a glass = your brain sends signals down your arm to move towards the glass and pick it up
If your arm is not there = no movement happens
Thus your brain keeps sending "move" signals
Phantom Limb Pain
Can visual feedback help with these never-ending signals?
Mirror Therapy is one way to help with phantom limb pain
The rubber hand illusion can help to deal with this pain.
Tactile Perception: Touch & Proprioception
Similar to the Mirror Therapy for phantom limb pain – visual feedback can also be given via Virtual Reality to treat phantom limb pain
how?
Using VR Games that require the patients to move their limbs
the amputated limb's image is filled in within the VR game = so they receive visual feedback. Example: https://www.youtube.com/watch?v=VI621YPv9C0
Robot Proprioception
Again, what is Proprioception?
For what tasks is proprioception important for Robots?
Proprioception = bodily awareness
Robot proprioception is important for a variety of tasks:
- To avoid interference with objects / humans
- Avoid harming itself, others
- Behave autonomously
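A hypothetical sketch of how proprioceptive (internal) readings can support the points above, for instance avoiding self-harm: refuse a motion command whose joint angles leave an allowed range. The joint names and limits are invented for illustration.

```python
# Invented joint limits; a real robot would load these from its model.
JOINT_LIMITS = {"shoulder": (-1.5, 1.5), "elbow": (0.0, 2.4)}  # radians

def command_is_safe(joint_angles):
    """Return True only if every proprioceptive joint reading stays in range."""
    for joint, angle in joint_angles.items():
        low, high = JOINT_LIMITS[joint]
        if not (low <= angle <= high):
            return False
    return True

print(command_is_safe({"shoulder": 0.2, "elbow": 1.0}))   # True
print(command_is_safe({"shoulder": 0.2, "elbow": 2.6}))   # False -> refuse / stop
```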
Name two examples of body-aware robots:
Soter et al., 2018:
Kwiatkowski & Lipson, 2019:
Soter et al., 2018:
- Bodily aware soft robots: Integration of proprioceptive and exteroceptive sensors
- A simulated soft robot can learn to imagine its motion even when its visual sensor is not available
Kwiatkowski & Lipson, 2019:
- A robot modeled itself autonomously
- The robot had no prior knowledge about its shape
- The self-model was then used to perform tasks & detect self-damage
- Where are proprioceptive sensors needed in autonomous driving?
- Inertial Measurement Units
- measures values internally to the system (robot / autonomous car)
- e.g. motor speed, wheel load, direction, battery status
Inertial Measurement Units
- IMUs are employed for monitoring velocity and position changes
- Tachometers are utilized for measuring speed and altimeters for altitude
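A minimal sketch of why an IMU is useful for monitoring velocity and position changes: acceleration samples are integrated over time (dead reckoning). The sample values are invented, and real IMUs additionally need bias and drift correction.

```python
# Dead-reckoning toy example along a single axis; acceleration values are invented.
dt = 0.1                       # sampling interval in seconds
accelerations = [0.5] * 10 + [0.0] * 10 + [-0.5] * 10   # m/s^2

velocity, position = 0.0, 0.0
for a in accelerations:
    velocity += a * dt         # integrate acceleration -> velocity
    position += velocity * dt  # integrate velocity -> position
print(round(velocity, 3), round(position, 3))
```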
Human Touch
Human Somatosensory System:
Name 9
- hot
- cold
- pain
- pressure
- tickle
- itch
- vibrations
- smooth
- rough
Artificial Skin
Name one Austrian company
Airskin - Blue Danube Robotics: developed in Austria
- Industrial Robot with touch-sensitive skin
- Robot and gripper are fully covered with soft Airskin pads, including clamping and shearing areas
In the event of a collision between the robot and an employee or an object, the collision sensor responds and instantly triggers an emergency stop.
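A hedged sketch of the general idea behind a touch-sensitive skin with an emergency stop: if any pad reading exceeds a threshold, the robot stops. The readings, threshold and function are invented and do not reflect Airskin's actual interface.

```python
# Invented pressure threshold and pad readings, purely for illustration.
COLLISION_THRESHOLD = 5.0      # arbitrary pressure units

def check_skin(pad_readings):
    """Return True (trigger emergency stop) if any skin pad detects a collision."""
    return any(reading > COLLISION_THRESHOLD for reading in pad_readings)

print(check_skin([0.1, 0.3, 0.2]))        # False -> keep moving
print(check_skin([0.1, 7.8, 0.2]))        # True  -> emergency stop
```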
Columbia engineers have developed a ‘tactile robot finger with no blind spots’ (February 26, 2020)
What is special about it?
- Their finger can localize touch with very high precision - less than 1 mm - over a large, multicurved surface, much like its human counterpart
- Integrating the system onto a hand is easy: thanks to this new technology, the finger collects almost 1,000 signals but only needs a 14-wire cable connecting it to the hand, and it needs no complex off-board electronics.
- A capacitive pressure sensor system with 108 sensitive zones for the hands of the humanoid robot iCub was designed (Schmitz et al., 2010).
- The results show that the sensor can be used to determine where and (although to a lesser extent) how much pressure is applied to the sensor.
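A rough sketch of how an array of capacitive taxels can answer "where and how much pressure": the pressure-weighted centroid of the taxel positions estimates the contact location. Taxel layout and readings below are invented, not taken from the iCub sensor.

```python
# Invented taxel array: (x, y) positions in mm and pressure readings.
taxels = [
    ((0.0, 0.0), 0.1),
    ((1.0, 0.0), 0.8),
    ((0.0, 1.0), 0.2),
    ((1.0, 1.0), 2.5),
]

# Pressure-weighted centroid gives an estimate of the contact location.
total = sum(p for _, p in taxels)
x = sum(pos[0] * p for pos, p in taxels) / total
y = sum(pos[1] * p for pos, p in taxels) / total
print(f"contact near ({x:.2f}, {y:.2f}) mm, total pressure {total:.2f}")
```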
iCub's Hand
What do you know?
- the palm has 48 taxels
- each of five fingertips has 12 taxels
- fingertip has a shape similar to human fingertip
- the fingertip is small
- the fingertip provides 12 pressure measurements
- and is intrinsically compliant
Artificial Touch
A prosthetic hand developed by SynTouch is:
- equipped with human-like fingernails and fingerprints
- able to use contact detection
- able to adjust force
- mimicking sensation of vibrations, textures and temperatures
- the hand and reflexes connect on their own
Users have an unprecedented ability to pick up objects without having to actively think about the amount of force they are applying - just like the reflexes of a real human hand
One of the most advanced robot hands...
... created by the Shadow Robot Company and including sensors from SynTouch
is a ...
It is a Tactile Telerobot that can fuse your own hand movements with the robot hands
It can be used for a variety of tasks and gives realistic touch feedback
What happens if people implant technical bodyparts? (Cyborgs = cybernetic organism)
A person/being with organic and biomechatronic body parts
3 criteria:
- Communication between body and brain must be intact
- The technical component has to be merged with the body so it becomes a body part
- The additional body part has to improve the human's senses / capabilities
Neil Harbisson
Who is he?
- part of the Cyborg community
- He has an antenna implanted which helps him to perceive visible and invisible colours via audible vibrations in his skull
- These colours include infrareds and ultraviolets
- He can also receive colours from space, images, videos, music or phone calls directly into his head via an internet connection
Who is Moon Ribas?
- She developed the Seismic Sense: an online seismic sensor, once implanted in her feet, that allowed her to perceive earthquakes taking place anywhere on the planet through vibrations in real time.
Ribas’ seismic sense also allowed her to feel moonquakes, the seismic activity on the Moon
Ribas believes that by extending our senses to perceive outside the planet, we can all become senstronauts. Adding this new sense allowed her to be physically on Earth while her feet felt the Moon.
Auditive Perception
The Ear Anatomy - Summary
Pinna
Auditory Canal
Eardrum
Ossicles
Cochlea
Pinna: Collects and amplifies the sound waves
Auditory Canal: The soundwaves from the pinna are deflected into the auditory canal
Eardrum: vibrates and sends the vibration onto the ossicles
Ossicles (Hammer, Anvil, Stirrup): transmit and amplify the vibrations from the eardrum to the cochlea
Cochlea: snail-shaped; contains lymphatic fluid; within this fluid is the basilar membrane, which has hair cells attached to it
Auditory Illusions
What makes human hearing special?
Cocktail Party Effect:
- The brain is able to focus the auditory attention on a particular stimulus
- Simultaneously, it is filtering out other auditory stimuli
- An example for this: hearing your own name in a noisy room
Historically, systems struggle to filter out background noise to recognise the "main speaker's" voice.
How did Google Research tackle this problem?
They trained a "multi-stream convolutional neural network".
Once trained, the system is capable of focusing on a single voice and filtering out everything else.
This means that a trained system might be even better at filtering out noise than a human...
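A very rough sketch of the masking idea used in many speaker-separation systems: a trained network predicts a time-frequency mask for the target voice, which is multiplied with the mixture spectrogram. The arrays below are random placeholders standing in for real data and a real model; this is not Google's actual system.

```python
import numpy as np

rng = np.random.default_rng(0)
mixture_spectrogram = rng.random((128, 100))   # freq bins x time frames (placeholder)
predicted_mask = rng.random((128, 100))        # would come from the trained network

# Element-wise masking keeps the time-frequency cells belonging to the target voice.
target_voice_estimate = mixture_spectrogram * predicted_mask
print(target_voice_estimate.shape)
```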
What makes human communication special?
Understanding prosody, intonation and emotions
What is subglottal pressure?
What is prosodic modulation?
Physiological variations in an emotionally aroused speaker can cause an increase of the subglottal pressure (the pressure generated by the lungs beneath the larynx), which can affect voice amplitude and frequency = expression of the speaker’s emotional state
Expression of emotions through prosodic modulation of the voice, in combination with other communication channels, is crucial for affective and attentional regulation in social interactions in adults and infants (Sander et al., 2005; Schore and Schore, 2008).
For emotional communication: prosody can prime or guide the perception of the semantic meaning (Ishii et al., 2003; Pell et al., 2011; Newen et al., 2015; Filippi et al., 2016).
Human speakers can put words and their meaning into context.
Humans assess gestures and facial expressions simultaneously with spoken words
Problems related to AI voice assistants
In comparison to humans, AI assistants have difficulties.....
Difficulties in understanding prosody, intentions and humour from spoken language.
- This can cause misinterpretations of spoken language. Missy Cummings, an associate professor at MIT, said:
“You could do all the machine learning in the world on the spoken word, but sarcasm is often in tone and not in word,” she added. “[Or] facial expressions. Sarcasm has a lot of nonverbal cues.”
New AI voice assistant can understand intonations
Which one?
A new AI assistant, OTO, is specialising in this and can be used to assist with real-time conversational data:
“OTO is moving beyond speech-to-text, pioneering the first multi-dimension conversational system with the ability to merge both words + intonation (Acoustic Language Processing). This provides a much richer understanding of the context, sentiment and behaviors of a conversation.“
Recent research has focussed on detecting depression from a person‘s speech pattern.
How?
AI algorithms can now more accurately detect depressed mood using the sound of a person‘s voice, according to new research at the University of Alberta
An App could collect voice samples from people over longer time periods when they speak naturally
Over time, the App would track indicators of mood
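An illustrative sketch, in the spirit of such an app, of tracking one simple acoustic indicator (signal energy) over time and estimating its trend. The recordings are synthetic and the feature is a placeholder; real systems use richer, validated features and models.

```python
import numpy as np

rng = np.random.default_rng(1)
# One synthetic "recording" per day: speech energy slowly decreasing over 30 days.
daily_recordings = [rng.normal(loc=1.0 - 0.01 * day, scale=0.05, size=16000)
                    for day in range(30)]

# Placeholder feature: mean signal energy per recording.
daily_energy = [float(np.mean(np.square(recording))) for recording in daily_recordings]
trend = np.polyfit(range(30), daily_energy, deg=1)[0]   # slope of a linear fit
print("energy trend per day:", round(trend, 4))
```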
Voice Cloning - Voice cloning is easily accessible today:
Montreal-based AI startup Lyrebird provides an online platform that can mimic a person's speech when trained on 30 or more recordings
Baidu introduced a new neural voice cloning system that synthesizes a person’s voice from only a few audio samples
New Github project (Sept., 2019): Users enter a short voice sample and the model — trained only during playback time — can immediately deliver text-to-speech utterances in the style of the sampled voice