Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "@gyperman3751 so why should we care about you? You oviously don't care if our jo…" (ytr_Ugy2xbJi_…)
- "Wrong. All medical staff will be replaced by robots. Humans will no longer trust…" (ytc_UgzkU5J6E…)
- "Well them it's good then that the person who was getting flamed for no good reas…" (ytr_UgyOjYOs1…)
- "I think this AI thing..that you people are talking about...does NOT exist. It's…" (ytc_Ugy5iGRg8…)
- "3:05 …I assume you’ll get to this later in the video lol, I’m running an LLM equ…" (ytc_UgwACwMGX…)
- "Somebody might have to work on the robots to keep them going. The billionaires a…" (ytc_Ugxdt18nc…)
- "Every Photo Was Ai And Even You Are, Nothing Is Real. Maybe Im Not Real…" (ytc_UgwcrLMu8…)
- "There is no doubt advances in technology have improved healthcare tremendously o…" (ytc_UgzLZDICQ…)
Comment
According to Roman, if we did truly live in a simulation, why should we be concerned about AI getting or not getting out of control? That is, we don’t know which outcome is the desirable one (for the simulation to continue) or whether either outcome is relevant at all in the grand scheme of things.
youtube · AI Governance · 2025-09-08T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy9SnQmKT0aNjbONYt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw7KeUxE0lrngA9y-x4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy_86EHfeBKD7iCS0l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4AhwjSFzNR0D112Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzTU5kDGOi1cmJAZTF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyGjfwdqa5v5fGoyB94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzGGs8Eap3mFUrPDEN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxg6_Xlwd7bR9QKlmh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFTckaKCurmO6pvqx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw4StSooTxiNS9BT7d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
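The raw response is a JSON array with one object per coded comment, keyed by comment ID with one value per coding dimension. A minimal sketch of how such a payload might be parsed and validated before it lands in the table above; the allowed value sets are inferred only from this sample, not from the full codebook, so treat them as assumptions:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"fear", "approval", "indifference", "outrage", "resignation", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed rows.

    Rows missing an "id" or using an out-of-vocabulary value for any
    dimension are dropped rather than stored.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]'
print(len(parse_codes(raw)))  # 1
```

Dropping malformed rows (instead of raising) keeps one bad line from discarding an entire batch, which matters when a single response codes ten comments at once.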