Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The problem isn't AI, the problem is midwit techbros who think LLMs are actual …
ytc_UgwGZwtV-…
I'm not agreeing with these "AI artists", but the answer is, cause it's free mos…
ytc_UgyLrXkat…
The issue with the deepfake problem is exactly what Scarlet said, it's impossibl…
ytc_Ugyk3GvDT…
I totally believe this. I know some people will say AI is still not there yet an…
rdc_m8adww5
The most probably doesn't have any role in this or may be , who knows. But 99.99…
ytr_UgylVM_Mj…
First of all you can manipulate Ai to tell snythingh however
The real point is…
ytc_Ugzf_Dw_q…
To me AI is just a cool trick it's not magic it's not the future just a product …
ytc_UgzFePHab…
AI should be generally used only to an extent. AI replaces humans and makes thei…
ytc_Ugw-f2O6I…
Comment
Aauuggh 🤬 I saw this, read the referenced NYT article, and sent off emails decrying AI research cuz of bunny-boiling Sydney and the "dark side" of AI. Meanwhile, it isn't HAL or psycho AIs but (as I always suspected) Humans we have to be terrified of.
I read the full transcript of Kevin Roose's conversation with Sidney and he only quotes the wacky parts of their conversation. Of course he does. Who the hell is going to blame humans for annihilating the planet if we don't point to other suspects? Besides Hecklefish...
youtube
AI Governance
2023-07-07T05:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxePb5diqL3xiBvP6F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuhTSucMWjUaXN1-F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxJ4iJgMEBce8R2YBt4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwaRZGtpTIR7iIjIuB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-hb_drdA3Y7oirGN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxbf-KUgkHNB3bP4k54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgweZHA3mWu_b-Dc-CR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyQnBro5pK_L-iBHsd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz43aJp_VBILpjCOnl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCOHClH7k4AW_r4ZF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
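A raw response like the one above can be sanity-checked before its codings enter the dataset. A minimal sketch, assuming the four dimensions from the Coding Result table and only the value sets observed in these samples (the full codebook may define additional values):

```python
import json

# Allowed values per dimension, inferred from the samples shown above;
# the actual codebook may permit more.
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"liability", "ban", "none"},
    "emotion": {"fear", "outrage", "mixed", "approval", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    valid = []
    for entry in json.loads(raw):
        # Drop entries that are not objects or lack a comment ID.
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        # Keep the entry only if every dimension holds an allowed value.
        if all(entry.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(entry)
    return valid

# Hypothetical ID for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
print(len(validate_codings(raw)))  # 1 valid entry
```

Entries that fail validation can then be queued for re-coding rather than silently kept.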