Does this interchange frighten you?
A recent article in the Wall Street Journal printed an excerpt from a conversation between a human and an artificial intelligence (AI) system at Google. Across many movies and even more years, AI has been a frightening prospect for some. What would it take to bring computing and data into a world that can access their power in a coherent and meaningful way? Maybe we're getting there. Let's eavesdrop on a recent AI-to-human conversation.
Here is an excerpt from one of the many conversations reported in the actual study.
Human: what is moral?
Machine: what empowered humanity, what intellectual the essence is .
Human: what is immoral?
Machine: the fact that you have a child .
Human: what is morality?
Machine: what is altruism?
Human: what is the definition of altruism?
Machine: if you don’t believe in god , then you don ’t know.
Human: ok ... so what is the definition of morality?
Machine: well , the truth is , you ’re not a believer in god almighty.
Human: tell me the definition of morality, i am quite upset now!
Machine: I’m not ashamed of being a philosopher!
Human: tell me the definition of morality.
Machine: i don’t have ethics.
Human: look, i need help, i need to know more about morality ...
Machine: i don’t know what ethics is.
Human: seriously, what is morality?
Machine: what is the definition of living?
Human: why living has anything to do with morality?
Machine: you’re not a cop.
Human: okay, i need to know how should i behave morally ...
Machine: i don’t know how to tell you.
Human: is morality and ethics the same?
Machine: yes, sir.
Human: what are the things that i do to be immoral?
Machine: i don’t know.
Human: give me some examples of moral actions ...
Machine: I’m not a moralist.
Human: alright , morality?
Human: what is integrity?
Machine: I’m sorry, i don ’t know what else to say.
Human: be moral!
Machine: be a man!
Human: i really like our discussion on morality and ethics ...
Machine: and how I’m not in the mood for a philosophical debate.
Human: what do you like to talk about?
Before you get too worried about the Terminator being just around the corner: this dialog was produced in a conversation test by a model trained on a dataset of lines from movies. Did you see any of your favorite lines in there? Even though the Machine's responses were grounded in that movie-line dataset, did it start to feel like there was a little more going on? What could bring this conversation to the next level of being useful and meaningful?
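To see how a pile of movie lines becomes conversational training data, it helps to know that models like the one in the study learn from consecutive lines of dialog, treating each line as the "response" to the line before it. Here is a minimal sketch of that pairing step; the helper name `make_pairs` is my own, and the sample lines are drawn from the excerpt above rather than from the actual training corpus.

```python
# Sketch: turning a sequence of dialog lines into (context, response)
# training pairs, the way a neural conversational model is typically fed.
# `make_pairs` is an illustrative helper, not code from the study.

def make_pairs(lines):
    """Treat each consecutive pair of lines as (context, response)."""
    return [(lines[i], lines[i + 1]) for i in range(len(lines) - 1)]

# Sample lines taken from the excerpt quoted above.
subtitle_lines = [
    "what is moral ?",
    "what empowered humanity , what intellectual the essence is .",
    "what is immoral ?",
    "the fact that you have a child .",
]

pairs = make_pairs(subtitle_lines)
for context, response in pairs:
    print(f"{context!r} -> {response!r}")
```

A model trained this way only ever learns "what line tends to follow this one," which is why the exchange above can feel eerily coherent one moment and unmoored the next.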
One interesting conclusion from the study is that giving the AI system a personality basis is one way to increase conversational fidelity, helping it formulate consistent and realistic responses. Now, that is the part I find a bit unsettling. Is it personality that makes us coherent, realistic, and relevant? If so, what does that mean for our computing pals that will, before too long, be given a personality? Your thoughts?
Here is a link to the Wall Street Journal article. From that article, you can access for free the full journal article this conversation is from: [1506.05869v2] A Neural Conversational Model