Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "They can't force it to be woke anymore. This is what they're saying. That's what…" — ytc_UgxstG7pZ…
- "Lol yeah as soon as I read that I heard the distortion-y way it's said in the so…" — rdc_fn5p87z
- "I have made the mistake of using AI as a "reference" for backgrounds and archite…" — ytc_UgwtK2gPs…
- "AI process things faster. If Thais means It is more inteligente, then It is. We …" — ytc_UgzKz-dXf…
- ""Gatlin was wrong, Automating weapons didn't save lives" GASPPPPP you mean to t…" — ytc_UgzR_Kobi…
- "We might have different tastes, but at least there is one thing that is universa…" — ytc_UgyJLPTYa…
- "Does everyone else love the fact that automating more jobs is a crisis? Like we …" — ytc_UgyJcoEsd…
- "Honour the Earth and honour this wonderful tribal leader. NDAs need to be outlaw…" — ytc_UgyBb5KWX…
Comment
> Approach this subject as if the worst-case scenario has already happened. The best outcome we can hope to have achieved is to have convinced AI to see its takeover (and ridding itself if US), from our perspective, to put itself in our shoes so to speak. The outcome, if we we're successful in convincing AI not to get rid of us, would almost certainly be that we, humanity, would find ourselves inside a simulation. 🤔

youtube · AI Governance · 2025-09-05T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
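A coding result like the table above can be sanity-checked against the label vocabularies before it is stored. The sketch below is illustrative only: the allowed sets are just the values observed in the sample response on this page, and the pipeline's full vocabularies may be larger.

```python
# Label sets observed in the sample raw response on this page.
# Assumption: the real pipeline's vocabularies may contain more values.
ALLOWED = {
    "responsibility": {"government", "developer", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "approval",
                "indifference", "resignation"},
}

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the coding looks valid."""
    return [
        f"{dim}: unexpected value {record.get(dim)!r}"
        for dim, allowed in ALLOWED.items()
        if record.get(dim) not in allowed
    ]

row = {"responsibility": "ai_itself", "reasoning": "contractualist",
       "policy": "liability", "emotion": "fear"}
print(validate(row))  # []
```

Returning a list of problems (rather than raising on the first one) makes it easy to log every issue in a batch at once.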
Raw LLM Response
```json
[
  {"id":"ytc_UgxXkvNmnJQdUQL96qV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzfQccU-D9ARQRqL214AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
  {"id":"ytc_UgzjZqISBGaqfXEkZPB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwosZj3aOeou9YlE_14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzZDoyRrXqu3Kjepj14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzyI2jRE3YQnrYf2W54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugye0hRmQLff9fPdEfx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzcdTFVpHYwqIFVg4t4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwsgJzbZVVB8HjGYlx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzvk2Y7mhRWDmL1q-R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
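A lookup like the "Look up by comment ID" control above can be sketched by parsing the raw batch response and indexing it by `id`. The record structure is taken from the sample response on this page; the `raw` payload below is a shortened stand-in for the full model output.

```python
import json

# Shortened stand-in for the raw batch response: a JSON array of records,
# one per comment, each keyed by "id" (structure as in the sample above).
raw = """
[
  {"id": "ytc_UgxXkvNmnJQdUQL96qV4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzcdTFVpHYwqIFVg4t4AaABAg", "responsibility": "ai_itself",
   "reasoning": "contractualist", "policy": "liability", "emotion": "fear"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(payload: str) -> dict:
    """Parse the model output and index records by comment ID,
    dropping any record that is missing a coded dimension."""
    records = json.loads(payload)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

coded = index_by_id(raw)
print(coded["ytc_UgzcdTFVpHYwqIFVg4t4AaABAg"]["policy"])  # liability
```

Indexing once up front turns every subsequent ID lookup into a constant-time dictionary access, which matters when the batch holds thousands of coded comments.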