Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> An argument to the AI Simulation Ethics 1:02:00, would be if you do not allow complete 100% free will on the positive and negative side, then the suspension, and self awareness that we realize we are in a simulation would shoot up a lot and could possibly cause chaos itself. As I do agree that the outside simulation to have terrible ethics haha its possible they feel bad but also have no choice but to invoke that, because they themselves have ran test on complete ethics and it sim goes to poop. At the same time we could be in the sim with no ethics and there are many sims... haha this can continue forever but you get me point. Great Interview.

youtube · AI Governance · 2025-09-04T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgweJhuDRG8hzGnYpW94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxKjz5w0ijLeKVpgpB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyW9mPbjk1iDsasGRR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzWZvHc1QUQplZL_YZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwmievZBQ5wo4maJHR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzmraWXKXjpV0eYgIZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyJnOB0THp_tC5r3Vt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzGe5EWsEQKwkhAas54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzzV_cfdmeWrYV06HR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx0MKKtzUni3DSRcYt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
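A batch response like the one above can be parsed and sanity-checked before the records are stored. The sketch below is a minimal illustration, not the project's actual pipeline: the allowed value sets are inferred only from the values visible in this sample (the full codebook may contain more categories), and the function name is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from this sample batch.
# NOTE: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "approval", "mixed", "outrage", "indifference"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the inferred codebook, so malformed model output fails loudly
    instead of being silently stored.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With the records indexed by ID, the "look up by comment ID" view above is a plain dictionary access, e.g. `parse_coding_response(raw)["ytc_Ugx0MKKtzUni3DSRcYt4AaABAg"]["emotion"]`.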