Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
When a kid does something good or bad, we essentially rate their responses and try to guide them to behaving better. We don’t go “aha, the mask is slipping and we see the true child underneath”
AI are no different. They’re trained on human behaviour, so be mad at your fellow man if anything. People really out here getting pissed off by their own reflections 🤦🏽
And AI is not a “Shoggoth.” If it is, we are too cause you don’t know what’s going on in anyone else’s mind at any given time either. We don’t even understand the human mind lol, if anything we have more of an understanding of AI intelligence than we do our own. And humans are capable of evil as well, and have actually done more to harm people than just generate words.
If anyone’s the “Shaggoth” it’s us.
Source: youtube · Video: AI Moral Status · Posted: 2025-12-15T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxQ_kiGrH8z2SL7FG14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxLt0RcnzlqnmwnrDR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz0oD7keWUDpiIAOcJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzkCqGu1BCSKJ6n-kt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx-7QMXn_jv7oApOml4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxneqsyY71HmlpsJrt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyKiC4B5G6BFfj2VmB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz4tRa1nuhQtaYU53l4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzeQE6foDCVBdbuf6F4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw23IQGmy0bTw1PKop4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
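The Coding Result table above is a single record pulled out of this array (the entry for `ytc_Ugz4tRa1nuhQtaYU53l4AaABAg`). A minimal sketch of how such a raw response can be parsed and indexed by comment ID — the function and variable names here are hypothetical, not part of the dashboard's actual code:

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of per-comment codings.
raw_response = """[
 {"id":"ytc_Ugz4tRa1nuhQtaYU53l4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"industry_self","emotion":"resignation"},
 {"id":"ytc_Ugw23IQGmy0bTw1PKop4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and key each coding record by its comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codings = index_by_id(raw_response)

# Looking up the comment shown in this view recovers the four coded dimensions
# that populate the Coding Result table.
record = codings["ytc_Ugz4tRa1nuhQtaYU53l4AaABAg"]
print(record["responsibility"], record["reasoning"], record["policy"], record["emotion"])
```

A dict keyed by ID makes the "look up by comment ID" operation an O(1) access rather than a scan of the array on every inspection.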