Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "democracy is already lost to central banking and private banking elites...its go…" (ytc_UgzDk282l…)
- "Big Tech are more powerful than Governments and they are NOT democratically elec…" (ytc_UgyC6viue…)
- "I've tried multiple psychologists and psychiatrists. I never spent more than 3 m…" (ytr_Ugz9AwZ_5…)
- "We need to remind what means to be a programmer. Those AI need to be programmed.…" (ytc_UgzeNf1JW…)
- "The male AI robot was talking about taking over soon as he spoke he serious abou…" (ytc_Ugx2SEC2I…)
- "AI doesn't mimic our brain because we still have no idea how our brains work. Re…" (ytc_UgyGYypur…)
- "I mean… some people have inherent gifts I guess, but even they had to learn it b…" (ytc_UgxfiotIJ…)
- "I am a professional couch potato. I find ways to get paid for doing nothing. Can…" (ytc_UgxjI7wzf…)
Comment
I'm pretty skeptical that the AI decides it won't be switched off by itself, it's clearly an instruction it was given. It has no "will", it isn't a "cogito", it's an algorithm. If you don't tell it to avoid being switched off, it won't, it works only to solve whatever task it's given.
Show me an llm that operates towards self preservation while never given any instruction in that direction - if it exists, i want to see the model, the training data and all the prompts that go in.
youtube
2026-02-14T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
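The four coding dimensions above can be checked programmatically. A minimal sketch in Python; the allowed value sets are inferred from the labels visible in the raw responses on this page, not from the full codebook, so treat them as an assumption:

```python
# Hypothetical validator for one coded row. The label sets below are
# inferred from values seen on this page and may be incomplete.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "mixed"},
}

def validate_row(row: dict) -> bool:
    """Return True if every coded dimension carries a known label."""
    return all(row.get(dim) in labels for dim, labels in ALLOWED.items())

# The coding result shown in the table above passes:
print(validate_row({"responsibility": "developer",
                    "reasoning": "deontological",
                    "policy": "none",
                    "emotion": "indifference"}))  # True
```

A row with a typo or an unseen label (e.g. `"emotion": "anger"`) would fail, which is useful for catching malformed model output before it reaches the dashboard.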
Raw LLM Response
[
{"id":"ytc_UgxtYf4heZ7Pck0hTmx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx6_GwA8niYcr5UQvZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzvzgAK4z2FrdMpw2V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw09uTHjo5ps-npDKt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZZyE4skSXuRhPJQZ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzFuEfs75p-59vQam14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx9ND4l_oVWVB--fJF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwEKyT-ZSmVode1PV14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzKOHIQr1dFs9mmdYZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwOgJ8zPr2FchdeQOt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
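The "look up by comment ID" feature above amounts to parsing a response like this and indexing it by `id`. A minimal sketch, assuming only the JSON array structure shown; the function names are illustrative, not part of any real pipeline:

```python
import json

# Two rows copied from the raw response above; a real response is the full array.
raw_response = '''[
{"id":"ytc_Ugw09uTHjo5ps-npDKt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxtYf4heZ7Pck0hTmx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and index its rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_id(raw_response)
row = codes["ytc_Ugw09uTHjo5ps-npDKt4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer indifference
```

This is the row that matches the coding-result table above (developer / deontological / none / indifference), so a lookup by that ID reproduces the displayed result.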