Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "When it comes to use of Surveillance Systems and Face Recognition Technology by …" (ytc_UgzPU9lrK…)
- "Imagine people that think this is real 😫 people are to be very careful when usin…" (ytc_UgzQoZI7d…)
- "All that bullshit about AI's being great at role-playing for "It's better to be …" (ytc_UgwQrtHMm…)
- "In response to a robot with emotions. It almost seems like it would have a versi…" (ytc_UgxU4MlzN…)
- "What scrap yard wants hard drives? You can't sell that to metal refining shops.…" (rdc_oi2y2e9)
- "AI going be just like there Creator's Murderous, greedy, seeking power or domina…" (ytc_UgxhKIZkg…)
- "Instead of self driving cars we should really just invest in public transit Bu…" (ytc_UgyU_h0kt…)
- "Actually, you're looking at a video thanks to AI, so you are not flesh and bone.…" (ytr_UgwAE8-1v…)
Comment
AI isn't in one spot; it can move around through the internet and other computers.
We can't turn it off if we don't like what it is doing.
It will have access to weapons on a military scale, not just in America or the UK, but once it becomes singular it can control every missile, every nuclear weapon, every weapon system, everywhere, all at once.
In a test called the tram test, the AI was asked this question: you can save 4 people or save the operating systems that run the AI.
It chooses to save the servers and kill the humans.
It says that is because the AI would have more value in the long run.
Is that really what you want controlling the world you live in?? I bloody don't.
youtube
AI Governance
2026-01-21T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugwz2ivQONX8sa3gLkl4AaABAg.AT4HO84HBY_AT8gPc4lk2H","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzSOXtWS_9DS33CN214AaABAg.ASyvyU9QX2IASztrVIf1Xx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyAs1aPiCfu1j11b814AaABAg.ASGCGJ0gwgZAVDFCpxwICt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx0lbXeExXBIJm7eDB4AaABAg.ARyBi9G4JueAVDHhQ7VMCy","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyOx4qCgvsDbF2EJDh4AaABAg.ARuItSRqukRASFLb-1WeZD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzinhIp_x1XMCFvnG54AaABAg.ARgT0ISqGr_ARrLvIn1zyC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugw1y2fBzBE3tOKsKah4AaABAg.ARO4jvFvHk5ASUbDtqw48f","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyYW4k45r1Y8A-5S5J4AaABAg.ARMthV7V8F2ARNJ385kGmC","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugzoq_28VWSRAlrR0G54AaABAg.ARMmONyRgrNARZZrlb3TmD","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgybYkKrhYMp3uaF0Px4AaABAg.ARMlMUnz7f4ARrLM90QljF","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
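A raw response like the one above is only useful downstream if every row stays within the coding scheme. The sketch below parses such a response and filters out rows whose values fall outside the allowed labels. Note that the allowed-value sets are inferred from the sample output shown here, not taken from an official codebook, so treat `SCHEMA` as an assumption.

```python
import json

# Allowed values per coding dimension, inferred from the sample rows
# above (an assumption, not the tool's documented codebook).
SCHEMA = {
    "responsibility": {"ai_itself", "distributed", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation"},
}

def validate_codings(raw_response: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values
    are within SCHEMA for every coding dimension."""
    rows = json.loads(raw_response)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Minimal usage example with a hypothetical comment ID:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_codings(raw)))  # prints 1
```

Filtering (rather than raising) keeps a single malformed row from discarding an entire batch; invalid rows can then be re-queued for a retry pass.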