Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
| Comment ID | Preview |
|---|---|
| `rdc_j3zcldf` | I'm surprised there's no mention in the article about the fact that white castle… |
| `ytc_Ugwi8_5KG…` | If automation eliminates most jobs, who will have the means to purchase products… |
| `ytc_Ugw9WX4Ne…` | whatif some guy wants to make a deep fake porn using his real face but a hotter … |
| `ytr_UgycaVVPa…` | No, it's really not remotely at all like that. AI really is becoming as intellig… |
| `ytc_UgxIxkpAj…` | I got some great afro beat songs I made, The key is to actually write the song y… |
| `ytc_UgwqWAHNB…` | AI wont help the economy or the rich if there is no consumer left who spends mon… |
| `ytc_Ugx7n87rN…` | Nice. That’s the trial run and they’ve proven that it can work. Soon those OpenA… |
| `ytr_Ugghx3Nm4…` | Alexisthebest ever Yes, but what if that task is to kill anyone who could remote… |
Comment

> Let’s be clear AI don’t think, AI doesn’t feel it only acts like it because it was made to try and act like a human but that’s about it; they are not conscious, they won’t be conscious but that doesn’t mean the consequences of acting like a human isn’t real

youtube · AI Governance · 2025-06-17T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz0dKrlNIJhbhGmqKp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx8Tv2eQ2CXG7aHA9J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzoWrWElP6IsRTidrN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgznXgzve8wzenf3ZP14AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzEJxou91xxweCwK9B4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyFhCBSeDD7FJ0U_k94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgybTeEMO4_dYftMI5l4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxTz9qvpNFQ0sEUNwN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8-xT4c_C4wmx6a5h4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxmBDz_aVirpsWEirR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
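A response like the one above has to be parsed and validated before its rows can be trusted as codes. The following is a minimal sketch of that step, assuming the four dimensions shown in the Coding Result table; the allowed value sets are inferred from this one sample batch, not from the full codebook, so treat them as placeholders.

```python
import json

# Value sets observed in the sample response above; the real
# codebook may define more values (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"unclear"},  # only value present in this sample
    "emotion": {"indifference", "approval", "mixed", "fear", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch-coding response, keeping only valid rows."""
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    valid = []
    for row in rows:
        # Collect any dimension whose value falls outside the codebook.
        problems = [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]
        if "id" not in row:
            problems.append("id")
        if problems:
            print(f"skipping {row.get('id', '?')}: bad fields {problems}")
        else:
            valid.append(row)
    return valid
```

Keeping validation separate from storage means a malformed row (a hallucinated label, a missing ID) is logged and dropped rather than silently written into the coded dataset.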