Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Last year, through school, I participated in a project that involved creating il… (ytc_UgwUGWGW4…)
- It needs to get a sophisticated remote control compulsory for each A.I. creati… (ytc_UgxOBY_rj…)
- Saying you’re an ai artist is worse than saying “yeah I did this art! I did it b… (ytc_UgxIi1JE7…)
- i mean ai arts is good actually / it literally learn same thing as us (take a loo… (ytc_UgyvIa1pf…)
- It’s like walking. Are you born with the talent to walk? Can only some people wa… (ytc_UgwALhWF5…)
- @sinoptikshey. quit it, we all know that real animation is better than the AI sl… (ytr_UgwNA6sra…)
- Some people will just choose not to use any "tech" to do ANYTHING. AI 'could' j… (ytc_UgwdhpmRa…)
- We ALL know this could happen, but people still made robots because well the con… (ytc_Ugynn3inH…)
Comment
It just occurred to me the connection between nuclear proliferation and AI, like the manhattan project pretty much was betting that 1. Their bomb wouldn’t ignite it the atmosphere immediately, and 2. The people they were developing it for would use it more responsibly than their adversaries. It feels like a miracle we haven’t blown ourselves sky high tbh. All these existential threats we play with casually is fucking insane. We’re like a phoenix building crazier and crazier funeral pyres and just hoping they don’t ignite.
youtube · AI Moral Status · 2025-11-17T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzygqCafbRLsp9Xr194AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugywgk6du9hbvl99LO94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyl3QgrWOTtl6hKe3R4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPh9ySYWWVptvVjrF4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyyxM9y89cm6W4WC954AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy4E7InsIdi_3w7hNB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwyn9yX1AMEJtOc7114AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy9JSmCZTyTbp2N4NZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyfrNfhl5S1I770on14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwmRXUeGPtQkWYsN-p4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
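A coding response like the one above can be checked against the codebook before it is stored. The sketch below is a minimal validator; the allowed values are only those inferred from this sample (the real coding scheme may define additional categories), and `validate_coding` is a hypothetical helper, not part of the pipeline shown here.

```python
import json

# Allowed values per dimension, inferred from the coding result table and the
# raw response in this sample; the actual codebook may include more categories.
SCHEMA = {
    "responsibility": {"ai_itself", "user", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "approval"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject rows with unknown values."""
    parsed = json.loads(raw)
    for row in parsed:
        # Comment IDs in this sample start with "ytc_" (comments) or "ytr_" (replies).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return parsed

raw = ('[{"id":"ytc_UgzygqCafbRLsp9Xr194AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
rows = validate_coding(raw)
print(len(rows))  # 1
```

Validating before storage means a model that drifts off-schema (e.g. inventing a new emotion label) fails loudly instead of silently polluting the coded dataset.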