Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I just wanted to share my experienc…" — ytc_UgzXP-OAb… (https://www.youtube.com/watch?v=z87Plv52ma4)
- "@CaligulatheEmperor i mean about these other things you said dude, what tf a ga…" — ytr_UgyA6oS10…
- "Great. Now religious idiots have thought chatGPT about religious nonsense and it…" — ytc_Ugz4zH0wZ…
- "The quality and effort put into your videos exceeds me. Talking to AI in public …" — ytc_UgyjwKTmi…
- "Jimmy and the other guys on the show are technophobes, and out of touch with the…" — ytc_UgjCW6xPk…
- "My gf: so..I found your character ai chats.. / Me:..I'm sorry.. / *machine gun noise…" — ytc_Ugw6vRE7m…
- "@event__horizon yeah in 2026 its already gonna be voted on. I will bet on it. A…" — ytr_Ugxkc8jJb…
- "I don't think sentient AI would want to kill people. AI gains functionality thr…" — ytc_Ugxxz8k-K…
Comment

> My answer to your question is NO. I prefer convos to be short and sweet. Now my questions;
> Let me begin with some uncomfortable questions;
> 1. In the first place, I wonder why these Scientists were awarded the Prize at all when they claim to have predicted the disastrous consequences of AI?
> 2. Though the common man has NO say in the World's most coveted Prize, I am curious to know why is John Hopfield not called as "Godfather" of AI when he shared the Prize with Geoffrey Hinton?
> 3. By the way, how did the world forget John McCarthy, Alan Turin etc. who should be called "Grand Godfathers"?

Source: youtube · Topic: AI Governance · Posted: 2025-07-03T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
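The result table above displays a single coded record, one row per dimension. A minimal sketch of how such a view could be rendered from a record (the `render_result_table` helper is hypothetical; the dimension names and values are taken from the table itself):

```python
def render_result_table(record, coded_at):
    """Render one coded record as a markdown Dimension/Value table."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

record = {"responsibility": "distributed", "reasoning": "mixed",
          "policy": "unclear", "emotion": "mixed"}
table = render_result_table(record, "2026-04-27T06:24:59.937377")
print(table)
```

The timestamp is passed through as a string so the view shows exactly what the coder wrote, with no timezone reinterpretation.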
Raw LLM Response

```json
[
  {"id":"ytc_UgwgaIXeQ9zpmnJTR2V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzsVPtcQXkfioJ0fSB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz7VsXvGEwsJL6BuJp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxKqGo3Q5v9-eoIc9V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz080KjOalv8ThUUoh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzY5EfT48fg6vTzWOh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyVv0vueSzBI2d2Aop4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxib9CH84ZbPjwyO6x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwx8ZT_z6-Ewi_FVrt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxMaAxsOMdtNBVGRW54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
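The raw response is a JSON array in which each element codes one comment along four dimensions, keyed by comment ID. A minimal sketch of parsing it and building an ID lookup (the `index_by_id` helper is hypothetical; the field names are those used in the response above, and the array is truncated to two records here):

```python
import json

# Raw LLM response, truncated to two of the coded records shown above.
raw = '''[
  {"id": "ytc_UgwgaIXeQ9zpmnJTR2V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzsVPtcQXkfioJ0fSB4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]'''

def index_by_id(records):
    """Build a comment-ID -> coded-record dict for O(1) lookup."""
    return {rec["id"]: rec for rec in records}

codes = index_by_id(json.loads(raw))
rec = codes["ytc_UgwgaIXeQ9zpmnJTR2V4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # -> ai_itself fear
```

Indexing once and looking up by ID is what makes the "inspect the exact model output for any coded comment" view above cheap, regardless of batch size.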