Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
LLMs understand as much as your pocket calculator. They are sophisticated auto completes. That’s sort of what makes it so dangerous.
A tiger, for instance, will see you as food, eat you until it’s full and go on until it’s hungry again (maybe it eats another human, maybe not). In the tiger scenario, there are multiple GOOD opportunities for humans not ending up as a meal for the cat.
However, an LLM AI - with capabilities - might see humans as competition for power resources and go about eliminating them. It never gets full. Never rests. Never feels pain. Etc. It just consumes power, processes information, and (if it has agency in the physical world - even if that means convincing a teenage boy that he can “get the girl” if he does XYZ for the AI) does things. It’s your pocket calculator, calculating, with access to the nuclear red button (analogy).
Source: youtube · Topic: AI Governance · Timestamp: 2026-04-05T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugwz2ivQONX8sa3gLkl4AaABAg.AT4HO84HBY_AT8gPc4lk2H","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzSOXtWS_9DS33CN214AaABAg.ASyvyU9QX2IASztrVIf1Xx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyAs1aPiCfu1j11b814AaABAg.ASGCGJ0gwgZAVDFCpxwICt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx0lbXeExXBIJm7eDB4AaABAg.ARyBi9G4JueAVDHhQ7VMCy","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyOx4qCgvsDbF2EJDh4AaABAg.ARuItSRqukRASFLb-1WeZD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzinhIp_x1XMCFvnG54AaABAg.ARgT0ISqGr_ARrLvIn1zyC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugw1y2fBzBE3tOKsKah4AaABAg.ARO4jvFvHk5ASUbDtqw48f","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyYW4k45r1Y8A-5S5J4AaABAg.ARMthV7V8F2ARNJ385kGmC","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugzoq_28VWSRAlrR0G54AaABAg.ARMmONyRgrNARZZrlb3TmD","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgybYkKrhYMp3uaF0Px4AaABAg.ARMlMUnz7f4ARrLM90QljF","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
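Raw responses like the one above can be checked before the codes are stored. The sketch below is one possible validation pass, not the tool's actual implementation; the `CODEBOOK` sets are inferred from the category values visible on this page (e.g. `ai_itself`, `consequentialist`, `regulate`, `fear`) and the real codebook may contain additional categories.

```python
import json

# Allowed codes per dimension -- inferred from the values observed in this
# page's output; the actual codebook may define more categories.
CODEBOOK = {
    "responsibility": {"ai_itself", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of problems found."""
    problems = []
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing 'id'")
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"record {i}: bad {dim!r} value {value!r}")
    return problems

# Hypothetical single-record batch, for illustration only.
sample = ('[{"id":"ytr_x","responsibility":"ai_itself",'
          '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(validate_batch(sample))  # -> []
```

An empty list means the batch is well-formed; each string in a non-empty result pinpoints the offending record and dimension, which makes it easy to flag responses for manual re-coding.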