Robopsychology

JKU - MA Psychology




Robot Touch: "Instrumental" vs. "affective"

 

Chen et al., 2011:

  • Between-subjects experiment with 56 people in which a “robotic nurse” touched their forearm.

  • Independent variables: a) whether or not the robot verbally warned the person before contact, and b) whether the robot verbally indicated that the touch was intended to clean the person’s skin (instrumental touch) or to provide comfort (affective touch).

Results

  • People responded significantly more favorably to the instrumental touch than to the supposedly affective touch.

More knowledge = more balanced perception

  • The less people know about algorithms, the more likely they are to think only of risks.

  • The more people know about algorithms, the more likely they are to perceive both risks and opportunities.

Many people don't feel well-informed about AI opportunities and risks

  • According to a 2017 survey, nearly half of Americans reported that they were unfamiliar with AI (Morning Consult, 2017).

  • Also in 2017, only 9% of the British public said they had heard of the term “machine learning” (Ipsos MORI, 2018).

  • This is a mission for us to educate people about AI!

What are negative expectations about the effects of AI?


Constant fear-mongering can make it harder for the human brain to correctly distinguish between a legitimate and a false threat.

Differences in the support of AI development

Other findings from Zhang & Dafoe, 2019:
Demographic characteristics account for substantial variation in support for developing AI:

  • Substantially more support for developing AI is expressed by college graduates (57%) than those with high school or less education (29%);

  • by those with larger reported household incomes, such as those earning over $100,000 annually (59%), than those earning less than $30,000 (33%);

  • by those with computer science or programming experience (58%) than those without (31%);

  • by men (47%) than women (35%).

Fear of technology: Spread in the population

  • 37 percent of participants in a representative US survey fit the definition of a "technophobe" (defined as someone who is either afraid or very afraid of automation) (Liang & Lee, 2017; McClure, 2018).

  • Empirical studies on computer anxiety in the 1990s and 2000s:

  • Prevalence rates between 20% and approximately 50% were found in various population samples (Chua et al., 1999; Powell, 2013).

 

What is Computer Anxiety?

Computer anxiety is defined as "the tendency of individuals to be uneasy, apprehensive, or fearful about current or future use of computers" (Parasuraman & Igbaria, 1990).

- Hundreds of scientific articles about the antecedents, correlates, and effects of computer anxiety were published in the 1990s and 2000s.

What is self-efficacy?

= a person's belief in his/her ability to succeed in a specific situation or accomplish a task. One's sense of self-efficacy can play a major role in how one approaches goals, tasks, and challenges.

Computer anxiety in relation to personal characteristics

CA and gender:

CA and age:

CA and other anxieties:

CA and gender:

Eighty articles in the 1990s and 2000s looked at gender and its effect on CA. Results were split between articles that found no difference in CA between genders and articles that found females to have more CA than males. Thirty-five articles found no difference in CA between genders (Powell, 2013).

CA and age:

Similar to the research on gender and CA, results were evenly split between articles that found age to be positively related to CA (n = 16) and articles that found no relationship between age and CA (n = 16) (Powell, 2013).

CA and other anxieties:

 

  • CA was found to positively correlate with depression (Lankford, Bell, & Elias, 1994) and with Internet anxiety (Thatcher, Loughry, Lim, & McKnight, 2007).

  • Math anxiety was positively related to CA for women but not men (Parasuraman & Igbaria, 1990).

Computer anxiety in relation to education

CA and education level:

CA and educational background:

CA and education level: 

  • Education level was found to be negatively correlated with CA; the more education a person has, the lower his/her CA is in general (e.g., Chou & Tsai, 2009; Harris, 1999).

  • Graduate students have less CA than undergraduate students (Bozionelos, 2001; Fitzgerald et al., 1997).

CA and educational background:

  • Students with computer science majors were found to have less CA than education majors (Williams & Johnson, 1990).

  • IT majors had lower CA than psychology majors (Todman, 2000).

  • Science/technology majors had lower CA than humanities/social sciences majors (Chou, 2003).

 

 

The role of personality traits

How people's personalities differ can be related to how anxious they are about new technology.

Personality traits that may play a role include:

  • Openness: A person's willingness to try new things, think outside the box

    -> negatively correlated with tech anxiety

  • Neuroticism: A person's (low) emotional stability, (low) confidence, pessimism

    -> positively correlated with tech anxiety

    These two personality traits belong to the so-called "Big Five", a personality model that describes five major dimensions of personality and is commonly used in research and practice (aka the OCEAN model).

Cultural differences in technology acceptance?

What does the empirical status quo say?

- So far, results are somewhat inconclusive: many studies support the assumed differences, but some find no differences, or differences other than expected.

E.g.:

  • People in the UK were found to have more negative attitudes towards interaction with robots than Japanese people; elderly people with a weaker perceived relation to their family members had stronger negative attitudes towards social robots in both nations (Nomura, 2014).

  • People in the UK reported feeling more negative about a humanoid robot than did people in Japan (Nomura, Syrdal, & Dautenhahn, 2015).

  • US participants were found to be least negative towards robots, while Mexican participants were most negative (Bartneck et al., 2005).

  • People in Australia perceived a physically present android robot more positively than Japanese participants did (Haring et al., 2014).

What is acceptance?
(Dillon & Morris, 1996; Kaan, 2017):

User acceptance in the context of information technology is defined as the demonstrable willingness within a user group to employ a technology for the tasks it is designed to support.

(Cho et al., 2017):

User acceptance is defined as an intention to willingly use information technology that is embodied to support users.

In other words, an individual’s intent to use a certain technology = user acceptance.

Relation between intention and actual usage: The Theory of Planned Behavior (Ajzen, 1991)

What shapes behavioral intentions and actual behavior?

  • Attitude (I think Teslas are cool)
  • Perceived subjective norms (my peers think Teslas are cool)
  • Perceived behavioral control (I could handle a Tesla)
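
A minimal illustrative sketch of how these three antecedents are often combined in TPB-style models: intention as a weighted sum. The weights, scales, and function name below are hypothetical choices for illustration, not estimates from Ajzen (1991).

# Toy TPB model: behavioral intention as a weighted sum of attitude,
# subjective norm, and perceived behavioral control (each scored 0-1).
# The weights are invented for illustration.
def behavioral_intention(attitude, subjective_norm, perceived_control,
                         w_att=0.4, w_norm=0.3, w_pbc=0.3):
    return w_att * attitude + w_norm * subjective_norm + w_pbc * perceived_control

# Example: positive attitude, supportive peers, moderate perceived control.
print(behavioral_intention(0.9, 0.8, 0.5))  # -> 0.75

In the full theory, perceived behavioral control also influences actual behavior directly, not only via intention.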

The Technology Acceptance Model (TAM; Davis, 1989)

  • Background of TAM: Grew out of research on the adoption of (management) IT services at work

  • Originally developed in the 1980s, but tested, expanded and refined over three decades

  • TAM focuses on the prediction of technology adoption by new users

  • TAM posits that the individual adoption and use of tech is determined by

    1) Perceived usefulness and 2) Perceived ease of use
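
A compact sketch of TAM's core causal structure under illustrative assumptions: perceived ease of use feeds into perceived usefulness, and both drive the intention to use. The function and all coefficients below are invented for illustration, not estimates from the TAM literature.

# Toy TAM core: ease of use raises perceived usefulness, and both
# drive the behavioral intention to use a technology (inputs 0-1).
def tam_intention(ease_of_use, usefulness_base,
                  b_eou_pu=0.5, b_pu=0.6, b_eou=0.3):
    perceived_usefulness = usefulness_base + b_eou_pu * ease_of_use
    return b_pu * perceived_usefulness + b_eou * ease_of_use

# An easy-to-use system scores higher even at equal base usefulness.
print(tam_intention(ease_of_use=0.9, usefulness_base=0.5))  # -> 0.84
print(tam_intention(ease_of_use=0.2, usefulness_base=0.5))  # -> 0.42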

TAM 3


What changed compared to TAM?

  • There are many further developments of the original TAM, e.g. the Unified Theory of Acceptance and Use of Technology (UTAUT) and the Autonomous Vehicle Acceptance Model

  • TAM 3 is the most comprehensive version of TAM to date

  • Able to explain between 52% and 67% of the variance in perceived usefulness

Determinants of Perceived Usefulness in TAM 3

(6)

1. Subjective Norm: The degree to which an individual perceives that most people who are important to him/her think he/she should or should not use the system (Fishbein & Ajzen, 1975; Venkatesh & Davis, 2000).

2. Perceived Ease of Use: The degree to which a person believes that using an IT will be free of effort.

3. System Image: The degree to which an individual perceives that use of an innovation will enhance his or her status in his or her social system (Moore & Benbasat, 1991).

4. Job Relevance: The degree to which an individual believes that the target system is applicable to his or her job (Venkatesh & Davis, 2000).

5. Output Quality: The degree to which an individual believes that the system performs his or her job tasks well (Venkatesh & Davis, 2000).

6. Result Demonstrability: The degree to which an individual believes that the results of using a system are tangible, observable, and communicable (Moore & Benbasat, 1991).
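
As a hedged illustration of what "explaining 52% to 67% of the variance in perceived usefulness" means, the sketch below fits a linear model of PU on six simulated determinant scores and reports R^2, the share of variance explained. All data and weights are invented; this is not the estimation procedure used in the TAM 3 studies.

# Toy regression: perceived usefulness (PU) as a linear function of the
# six TAM 3 determinants, on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Columns: subjective norm, ease of use, image, job relevance,
# output quality, result demonstrability (standardized scores).
X = rng.normal(size=(n, 6))
true_w = np.array([0.3, 0.4, 0.1, 0.3, 0.2, 0.1])   # assumed weights
pu = X @ true_w + rng.normal(scale=0.6, size=n)     # noisy PU ratings

w_hat, *_ = np.linalg.lstsq(X, pu, rcond=None)      # ordinary least squares
r2 = 1 - np.sum((pu - X @ w_hat) ** 2) / np.sum((pu - pu.mean()) ** 2)
print(f"estimated weights: {w_hat.round(2)}, R^2 = {r2:.2f}")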

TAM:
Strong focus on "pragmatic" qualities of a system

Usability?

Usability is the degree of ease with which products such as software and Web applications can be used to achieve required goals effectively and efficiently. Usability assesses the level of difficulty involved in using a user interface ("user-friendly technology").

But is it the whole story?

E.g., "ease of use" has been identified as particularly important for long-term adoption of technology. For initial attitudes and decisions, factors like perceived "enjoyment" are important as well.

Self-Determination Theory (Ryan & Deci, 2000)

  • Important psychological theory of human motivation

  • Motivation = what causes you to act; the process that initiates, guides, and maintains goal-oriented behaviors

Central assumptions (controlled and autonomous motivation):

SDT differentiates between autonomous motivation and controlled motivation:

Autonomous motivation = Doing something because you feel a full sense of willingness, choice, interest, enjoyment, value (intrinsic)

Controlled motivation = Doing something because you are demanded to do it, to get some reward or to avoid punishment (extrinsic)

Self-Determination Theory (Ryan & Deci, 2000)

Central assumptions:

  • All human beings have a limited set of basic psychological needs.

  • Their satisfaction is essential for well-being, flourishing and optimal performance.

  • If they don’t get satisfied -> negative consequences.

  • Their satisfaction leads to more autonomous forms of motivation.

  • There are three basic psychological needs (BPN) that are discussed as particularly important:

    o Competence
    o Autonomy
    o Relatedness

1) COMPETENCE as basic psychological need

2) AUTONOMY as basic psychological need

3) RELATEDNESS as basic psychological need

1) COMPETENCE as basic psychological need

  • Competence concerns the experience of effectiveness and mastery, to feel confident in relation to whatever you are doing
  • It becomes satisfied as one capably engages in activities and experiences opportunities for using and extending skills and expertise
  • When frustrated, one experiences a sense of ineffectiveness or helplessness

2) AUTONOMY as basic psychological need

  • Refers to the experience of volition and willingness
  • When satisfied, one experiences a sense of integrity, as one's actions, decisions, thoughts, and feelings are self-endorsed and authentic
  • When frustrated, one experiences a sense of pressure and often conflict, such as feeling pushed in an unwanted direction

3) RELATEDNESS as basic psychological need

  • Denotes the experience of warmth, bonding, and care: to be cared for by others, to care for others, to feel like you belong in groups that are important to you
  • It is satisfied by connecting to and feeling significant to others
  • Relatedness frustration comes with a sense of social alienation, exclusion, and loneliness

 

Conclusion regarding competence, autonomy and relatedness?

For the acceptance of AI applications in society, utilitarian product attributes such as usefulness, ease of use and expected output quality are important - but basic psychological needs such as autonomy, competence and relatedness, or hedonic needs such as stimulation and enjoyment, should not be ignored.

  • How does Jentsch describe uncanny feelings? 
  • How did Freud emphasize it?

  • Jentsch: as "intellectual uncertainty" and not being "at home" (un-heimlich) in the situation concerned
  • Freud: In contrast to Jentsch, Freud emphasized that what is uncanny is something that seems to be "un-homely" (unheimlich) and unfamiliar, but at the same time "homely" and familiar ("the unfamiliar in the familiar").
  • In Freud's view, the uncanny might be anything we experience in adulthood that reminds us of early psychological stages or of primitive experiences.

How do we (following Mori) perceive machines with very high to perfect human-likeness in the uncanny valley?

  • Not perceived as uncanny, because they are no longer distinguishable from real humans

    Examples: Lifelike android robots, social bots, lifelike synthetic voices, AI-generated portraits of (real or fake) persons, deep fake videos

    Ethical question: Do we want to live in a world where humans and machines are impossible to distinguish?

    EU AI Ethics Guidelines say: Machines must be identifiable as such

Where is Sophia in the uncanny valley?

High but not perfect level of human-likeness:

- Perceived as uncanny/threatening

- Non-perfect android robots, non-perfect computer- animated faces and avatars, synthetic voices, AI-generated portraits/videos with small glitches

  • Relationship between animal-likeness and likeability = U-shaped function; an uncanny valley effect was found

When were robots preferred? Animal-like? Not animal-like?

  • Robots were preferred when they looked very animal-like or not animal-like at all, as compared to robots that mixed realistic and unrealistic animal-like features

Recent neuroscientific results on uncanny valley

Across two experimental tasks, the ventromedial prefrontal cortex (VMPFC) encoded an explicit representation of participants’ uncanny reactions.

 

Which brain areas were active?

The ventromedial prefrontal cortex:
It signaled the subjective likeability of artificial agents as a nonlinear function of human-likeness, with selectively low likeability for highly humanlike agents.

The same brain areas were active when participants made decisions about whether to accept a gift from a robot. One further region - the amygdala, which is responsible for emotional responses - was particularly active when participants rejected gifts from the humanlike, but not human, artificial agents.
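
A toy sketch of an uncanny-valley-shaped likeability curve of the kind described above: likeability rises with human-likeness, dips sharply for highly-but-not-perfectly humanlike agents, and recovers near perfect human-likeness. The functional form and all constants are invented for illustration; Mori (1970) proposed the shape qualitatively, not as an equation.

import numpy as np

def likeability(h):
    # Baseline: more humanlike agents are liked more ...
    rise = h
    # ... minus a localized dip centered at high-but-imperfect
    # human-likeness (arbitrarily placed here at h = 0.85).
    valley = 1.2 * np.exp(-((h - 0.85) ** 2) / 0.005)
    return rise - valley

for h in [0.0, 0.5, 0.85, 1.0]:
    print(f"human-likeness {h:.2f} -> likeability {likeability(h):+.2f}")
# The 0.85 agent gets a clearly negative score; 0.0, 0.5, and 1.0 stay positive.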

Why are highly but not perfectly humanlike artificial figures creepy?

Evolutionary approaches:

Categorical uncertainty / perceptual mismatches:

Expectancy violation / Prediction errors:

Evolutionary approaches:

Pathogen avoidance, mate selection, and mortality salience, which was mentioned earlier (e.g. Ho, MacDorman, & Pramono, 2008; MacDorman & Ishiguro, 2006)

Categorical uncertainty / perceptual mismatches:

It is not clear to which category the figure belongs (human? machine? hybrid?) (e.g. Jentsch, 1906; Gray & Wegner, 2012)

Expectancy violation / Prediction errors:

If people evaluate the robot's behavior according to a human schema, it might not measure up to these expectations due to its imperfections
(MacDorman, 2006; Matsui, Minato, MacDorman, & Ishiguro, 2005, 2006; Mitchell, Szerszen, Lu, Schermerhorn, Scheutz, & MacDorman, 2011; Saygin, Chaminade, Ishiguro, Driver, & Frith, 2012; Steckenfinger & Ghazanfar, 2009)

Developmental influences: No uncanny valley for kids?

Children below the age of 9 did not rate the humanlike robot as creepier than the machine-like robot. This suggests a developmental effect for the uncanny valley.

Interplay between trustor, trustee & situation.

Trustor's propensity to trust:

  • Propensity to trust is regarded as a stable individual trait that refers to the general tendency for someone to trust other individuals.

  • Propensity to trust has a global effect on trust intentions (Colquitt et al., 2007) and trustworthiness assessments (Jones & Shah, 2016).

  • However, the impact of trust propensity is most salient early in interpersonal interactions, when other information may not yet be available (McKnight, Cummings, & Chervany, 1998).

  • Once other information becomes more salient, such as the trustee’s previous behaviors, propensity to trust will have a weaker influence on the extent to which the trustor will make him/herself vulnerable to the trustee (Mayer et al., 1995).

Trustee‘s perceived trustworthiness

  • Trustworthiness is the trustor’s perception of the trustee (Mayer & Davis, 1999).

  • Perceptions are formed as a trustor interprets and ascribes motives to the trustees’ actions (Ferrin & Dirks, 2003). Thus, perceptions of trustworthiness, although inherently within the trustor, are a function of the interaction of trustor and trustee as the trustor is processing information about the trustee. It is important to note these are the ascribed beliefs of the trustor and are not necessarily factual.

  • As interactions mature, a trustor will increasingly depend on the behavior of the trustee rather than personal dispositional factors, such as propensity to trust, when making trust evaluations (Jones & Shah, 2016; Levin et al., 2006).

Cognitive Trust

  • Cognitive trust describes the willingness to rely on a partner's ability/competence and predictability/reliability (Moorman et al., 1992; Rempel et al., 1985; Johnson-George & Swap, 1982).

  • It arises from an accumulated knowledge that allows one to make predictions, with some level of confidence, regarding the likelihood that a trustee will live up to his/her/its obligations.

  • Cognitive trust is knowledge-driven; the need to trust presumes a state of incomplete knowledge. A state of complete certainty regarding a partner's future actions implies that risk is eliminated and trust is redundant.

Affective Trust

  • Affective trust is the confidence one places in a partner on the basis of feelings generated by the level of benevolence/care and integrity the trustee demonstrates (Johnson-George & Swap, 1982; Rempel et al., 1985).

  • It is characterized by feelings of security and perceived strength of the relationship.

  • Affective trust is decidedly more confined to personal experiences with the focal partner than cognitive trust. As emotional connections deepen, trust in a partner may venture beyond that which is justified by available knowledge. This emotion-driven element of trust makes the relationship less transparent to objective risk assessments.

Importance of interpretable trajectory design in human-robot interaction