Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any to inspect):
- I have been confiding in chatGPT for two years now and one thing that I'm proud … (ytc_UgzSWVa_6…)
- I find the "um"s and human-like stutters/repetitions the developers have program… (ytc_UgyRdA5I-…)
- @twostate7822 it's level 2 because it can be activated anywhere, and theoretical… (ytr_Ugx7Hf401…)
- Regulating this will be like chasing a fart in Wyoming. The best thing to do is … (ytc_Ugzsj_1RM…)
- “It’d be naive of us, Mr. President, to imagine that these new breakthroughs in … (ytc_UgxdzJXBf…)
- Manager: "Ivan, I'm gonna have you fight a robot that feels no pain, and hits w… (ytc_Ugz2r0eoF…)
- Why does America always make out that Russia and China are the big bad wolves in… (ytc_Ugz8dNkJC…)
- \>Or they’re from people that have zero experience coding / This is the general… (rdc_kuonvq6)
Comment
The truth is, we’re building AI simply because if we don’t they will. And that is the terrifying thing. We have already lost control. Because how do we stop if we don’t they will?
Say a catastrophe happens, now we have to stop building AI. But if we don’t, they will and then they will win. So we have to continue, catastrophe or no. Where is the end of if I don’t they will?
youtube · AI Governance · 2025-12-04T16:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
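The four coding dimensions above can be checked programmatically. A hypothetical validation sketch follows; the value sets are inferred only from the codings visible on this page, not from an authoritative codebook, and the `CodingResult` class is illustrative:

```python
from dataclasses import dataclass

# Value sets inferred from the coded samples shown on this page;
# the real codebook may allow additional values.
RESPONSIBILITY = {"ai_itself", "government", "company", "developer", "distributed", "none"}
REASONING = {"consequentialist", "deontological", "virtue"}
POLICY = {"regulate", "ban", "none"}
EMOTION = {"fear", "outrage", "resignation", "mixed"}

@dataclass
class CodingResult:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        """True if every dimension holds one of the observed values."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)

# The coding shown in the table above:
result = CodingResult("distributed", "consequentialist", "none", "fear")
print(result.is_valid())  # True
```

A check like this is useful as a guard before storing an LLM's coding, since models occasionally emit values outside the expected set.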
Raw LLM Response
```json
[
  {"id":"ytc_UgyWSOb65xLaLFvSQSN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyzNpjhizW_lNZuBMp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwkjc4M1L79sxwQRNd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwt44sB9-fBFi4-kSB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzHovDIDzDfQ-zgFp54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyAvYlFpB6OmsRNvhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwucTteIj2AJ0BTPQJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxUGl9uMX0fpc7MU4Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwn-jdIvoLBiautlyV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgweSvlBT2Er0xdFKxh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
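The "look up by comment ID" flow amounts to parsing a raw batch response like the one above and indexing it by `id`. A minimal sketch, assuming the response is held as a string (`raw` and `index_by_id` are illustrative names; the array here is truncated to two of the ten entries):

```python
import json

# Raw LLM response for one coding batch, as shown above (truncated).
raw = '''[
  {"id":"ytc_UgyWSOb65xLaLFvSQSN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxUGl9uMX0fpc7MU4Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

def index_by_id(raw_response: str) -> dict:
    """Parse a batch response and map each comment ID to its coding."""
    return {row["id"]: row for row in json.loads(raw_response)}

codes = index_by_id(raw)
print(codes["ytc_UgxUGl9uMX0fpc7MU4Z4AaABAg"]["responsibility"])  # distributed
```

Because the model returns one JSON object per comment in the batch, a single parse gives the exact output for any coded comment without re-querying the model.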