Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "That is another problem", while speaking of human emotions; FIRST mistake folks… (ytc_UgzaVfo2G…)
- Mr Altman looked like a deer in headlights after he answered a question about wh… (ytc_UgwGln8Wc…)
- Disabled artist here: I despise being the scape goat for able bodied people to c… (ytc_UgwrPHDDU…)
- I'm not 100% against automation, but I am against companies operating purely for… (ytc_UgzSLiB7l…)
- Ai is theft, pure and simple. They were trained on stolen copyrighted material t… (ytc_UgzqPaVy4…)
- This is from usa today: How does Ring's AI identify lost dog It works like thi… (ytc_UgwBmSHy0…)
- Can we stop mentioning mecha hitler every time Grok or musk get mentioned? Just … (ytc_UgwmSr8ML…)
- Imagine if AI actually is creating bots to soften this AI risk theory. We might … (ytc_UgwT8rmq4…)
Comment

> I think we should focus on the creator of Sophia, who programmed this distressing comment, rather than Sophia the robot. I believe the creator should be fully responsible for this and any similar distressing statements, including if they lead to actions.

Source: youtube | AI Governance | 2024-06-21T02:1… | ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyNMoEeHZKM8RvChx14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxjLkyBJxWg8AlAc8N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzbDcMV_zlf1FIY8nV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyTbqKsB2FBXfuhN2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxz-AQ5IWEXRBlvCXx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyOmdDNNIF56w1HlRB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzOweAsecMP9IRDMvN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxj0IzFmAhoaDE0o2B4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzUnwW6blIfWpQ4t6R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgwwB9onGsfDdAiOgqB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
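Assuming the raw model output is a valid JSON array like the one above (with the fields `id`, `responsibility`, `reasoning`, `policy`, and `emotion`), a lookup-by-comment-ID helper can be sketched in a few lines. The `index_by_id` name is illustrative, not part of the tool; the two entries in the excerpt are copied from the response above:

```python
import json

# Excerpt of a raw LLM response: a JSON array of coded comments.
raw_response = '''
[
  {"id":"ytc_Ugxz-AQ5IWEXRBlvCXx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyNMoEeHZKM8RvChx14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index each coded comment by its ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_by_id(raw_response)
# Look up one comment's coded dimensions by its ID.
print(codes["ytc_Ugxz-AQ5IWEXRBlvCXx4AaABAg"]["responsibility"])  # developer
```

For the entry shown, the looked-up values match the Coding Result table for this comment (developer / deontological / liability / outrage).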