Birdwatch Note Rating
2023-11-26 20:14:26 UTC - NOT_HELPFUL
Rated by Participant: 86FC8B7A309CA72B7CF4B0B250F53A9B7EACDE52157342A5DE6F96A4C4E7AE05
Original Note:
This uses GPT-3.5, which underperforms GPT-4. https://openai.com/research/gpt-4 Here is the result in GPT-4: https://chat.openai.com/share/b5cca906-4cdb-4e5b-9294-e6a32d67e5bb The author tricks GPT-3.5 by alluding to a "trolley problem," in which an agent must let at least some people die and thus faces a genuine dilemma, while actually describing a problem with no such dilemma. https://plato.stanford.edu/entries/doing-allowing/#TrolProb