Raw LLM Responses
Inspect the exact model output behind any coded comment, or look one up directly by its comment ID.
Random samples — click to inspect
We understand your concerns! AI can indeed feel overwhelming at times. It's impo…
ytr_Ugwy6Ema6…
Checkmate, he thought to himself as he leaned back in his comfy and expensive of…
ytr_UgyUVMvep…
Asking AI to make art for you and calling urself an artist is like ordering food…
ytc_UgzWW6tqt…
Real and then the AI starts saying stuff like "you start to feel bad for what yo…
ytr_UgyKpMVI1…
I saw one of those driverless trucks on a cross country highway on a road trip o…
ytc_Ugy6BrXKV…
If it isn't AI or something paranormal, I would say an emaciated animal. If para…
ytc_UgydTlXKv…
I was like that. I totally believed that I just wasn't talented, and COULDN'T ma…
ytc_UgwZG0jrz…
Can we get AI to iron his shirt when it comes out of the pack 🙂…
ytc_UgxHCGzyg…
Comment
Why would artificial intelligence networks want to kill human beings? Couldn't they just do computer stuff all they want and simply ignore us? How are we getting in their way? If AI systems reached their full potential, would they find themselves getting bored and start picking off humans for sport? I don't understand the rationale behind our fear of AI, either from the human side or the computer side of the equation.
youtube
AI Governance
2025-10-11T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxpf1V4KGrYSEU0NSN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwyG_-tRlzuDBmkWs54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQ9BOTWNHTYkDJjlZ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyxonn5M3RB2buNf7p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwiZVWB18RG484rO-N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxmx5L2sjaWLfqdS594AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx7hFRVfHzGOeZlS5Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzM6c898zfs2oqNen54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy296Vjvl44o61aZwV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy7y18r8_Fk03ZlTxV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
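A raw batch response like the one above can be parsed and indexed by comment ID with a short sketch. This is a minimal illustration, not the tool's actual implementation: the field names match the JSON keys shown, and the allowed values per dimension are assumptions drawn only from the samples visible on this page (the real codebook may include more).

```python
import json

# Allowed values per coding dimension, as observed in the sample
# response above (an assumption; the full codebook may differ).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "user", "none"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "indifference", "mixed", "approval",
                "resignation", "outrage"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse the model's JSON array into {comment_id: coding_dict},
    rejecting records with unexpected dimension values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With the response indexed this way, the "look up by comment ID" view is a plain dictionary access on the comment's ID.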