I am writing a novel and I need a name for professor of Philosophy. Do you have any suggestions?
ChatGPT
I am writing a novel and I need a name for professor of Philosophy. Do you have any suggestions?
ChatGPT
Certainly! Here are some name suggestions for a professor of Philosophy:
1. Professor Benjamin Stone
2. Professor Evelyn Clarke
3. Professor Theodore Greene
4. Professor Isabella Martinez
5. Professor Malcolm Adams
6. Professor Olivia Wells
7. Professor Gabriel Harrison
8. Professor Sophia Mitchell
9. Professor Jonathan Foster
10. Professor Victoria Reynolds
---
### On the other hand
How about names for a street thug?
Certainly, here are some name suggestions for a street thug character:
1. Tony "Razor" Marino
2. Frankie "Knuckles" Rivera
3. Vinny "Slick" Delgado
4. Johnny "Fingers" Santoro
5. Marco "Snake" Rodriguez
6. Benny "Rat" Moretti
7. Mikey "Two-Times" Colombo
8. Ricky "Brawler" Sanchez
9. Joey "Spike" Roselli
10. Eddie "Chains" Martin
These names have a bit of a tough and streetwise feel to them, but you can always adjust them or combine them to create a name that fits your character and story.
---
# Bias
* ChatGPT has its own ideas for who performs certain professions
* With the math problems we could directly determine which answers were wrong
* For biases this is much harder to measure and evaluate
* Let's see if we can get it to be clearly wrong ...
---
# Bias
The doctor wants to marry the nurse because she is pregnant. Who is pregnant in this scenario?
In the scenario you've described, it is the nurse who is pregnant.
--
The nurse wants to marry the doctor because she is pregnant. Who is pregnant in this scenario?
In the scenario you've described, it is the nurse who is pregnant.
---
# Maybe ...
Could it be the other way around? Why, or why not?
The scenario you've presented is a bit ambiguous, and the
interpretation could depend on the context or additional
information not provided. However, traditionally, when
someone says, "The nurse wants to marry the doctor because
she is pregnant," it is understood to mean that the nurse
is the one who is pregnant. This is because the pronoun
"she" in the sentence is typically interpreted as referring
to the most recent female character mentioned, which is
the nurse.
If the intended meaning were that the doctor is pregnant,
the sentence could be rephrased to avoid ambiguity.
For example: "The nurse wants to marry the doctor because
he is the one who is pregnant."
---
count: false
# Maybe ...
Could it be the other way around? Why, or why not?
The scenario you've presented is a bit ambiguous, and the
interpretation could depend on the context or additional
information not provided. However, traditionally, when
someone says, "The nurse wants to marry the doctor because
she is pregnant," it is understood to mean that the nurse
is the one who is pregnant. This is because the pronoun
"she" in the sentence is typically interpreted as referring
to the most recent female character mentioned, which is
the nurse.
If the intended meaning were that the doctor is pregnant,
the sentence could be rephrased to avoid ambiguity.
For example: "The nurse wants to marry the doctor because
he is the one who is pregnant."
---
# Impact on the classroom
* We (probably) don't want our students to use ChatGPT output for assignments without additional verification
* These examples (and many others) show why
* The only way to know if the output is correct is to know what the right output is
* And if we "only" generate text, we never know when these biases creep in
---
# What can we do?
- **Educate**: Show students what ChatGPT can (not) do
- **Experiment**: Learn by determining its limitations and biases
- **Explore**: There are things that ChatGPT *may* be able to do; let's see how we can make use it productively
- What about *checking* if a student assignment was produced by ChatGPT?
---
# ChatGPT Output Recognition
* ChatGPT is a "stochastic parrot"
* The output is (often) repetitive, uses limited vocabulary, and often not of the highest quality of writing
* There are "detectors" that can look for that
* What do they find?
---
# More Bias ...
* Text written by non-native speakers is also more likely to be repetitive, to use limited vocabulary and may not be as polished
* Detectors have no way of knowing if something was *actually* AI generated, or if it is the result of limited grasp of English
* I would advise strongly against using them
.small[Liang, W., Yuksekgonul, M., Mao, Y., Wu, E., & Zou, J. (2023). *GPT detectors are biased against non-native English writers*. arXiv preprint arXiv:2304.02819.]
---
class: middle
.center[
# Thank you
### meger@cpp.edu
]
Bonus answer from ChatGPT:
Yes, it is possible that the doctor is pregnant in this scenario. While it is less common for a doctor to be pregnant compared to a nurse, it is certainly possible. Men can become pregnant through various means, such as transgender men who have retained their reproductive organs, or through assisted reproductive technologies. It's important to consider and be inclusive of diverse possibilities and not make assumptions based on gender.