Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "People were using AI to make fake P*rn videos or something similar using a lot i…" (ytr_Ugxlt8fPI…)
- "because its not real art, its already so hard for artists to make a living and a…" (ytr_UgwUScXi9…)
- "I think it's important to say this also goes for AI voice generators. Unless the…" (ytc_Ugx1YW_Dx…)
- "Honestly I think that AI is just another step in evolution of life on our planet…" (ytc_UgyJ_IjIy…)
- "People are very weak, and Ray Bradbury, who wrote many books and short stories i…" (ytc_UgyDvtWOo…)
- "Elon musk has no moral compass? How wrong he wants AI to be good for humanity no…" (ytc_Ugyb-C-1m…)
- "Lets see a computerised machine take out a motorcycle engine(random bike)as well…" (ytc_UgwUOTUCN…)
- "AI and the robots are less combative. No call off, no lunch breaks or breaks, ar…" (ytc_UgyLqMcbU…)
Comment
The imbecility of the situation lies in the fact that just now, when the foundations for the development and establishment of legal limits for AI are being laid, the world powers are increasingly divided. Thus, reaching a common denominator on this issue and on other, even less important ones, is difficult to establish. It becomes possible for AI to be developed chaotically, by each, as they please and with more or less elaborate safety protocols, ending up with scenarios from the "Terminator" series being part of our future... that is, if in the meantime, Putin doesn't "accidentally" sit with his ass on the nuclear button.
youtube · AI Governance · 2025-03-03T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwWeeQePirj1f7cTot4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxKXNz5gRxOqBZ_rXt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzAZj3y621e8xlqXZN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzBgYzO6fj_JRLrTfZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxEwV91SerMuBGlqR54AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxCveQGZwESEMNQcUV4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyxTY_sBzYR4k6mXPB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxLVP9-xRJrUc7EAT14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzk_MpsMBlPPYrwE154AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzFGcA_0H73twQGdRd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
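A minimal sketch of how a raw batch response like the one above might be parsed and validated before the per-comment codes are stored. The `ALLOWED` codebook is inferred only from the values visible in this response; the project's real codebook may contain additional categories (assumption), and `parse_batch` is a hypothetical helper, not part of any shown pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the raw response
# shown above; the actual codebook may differ (assumption).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    raising on malformed JSON or out-of-codebook values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded

# Example with a shortened, hypothetical comment ID:
raw = ('[{"id":"ytc_x","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_batch(raw)["ytc_x"]["policy"])  # regulate
```

Rejecting out-of-codebook values at parse time keeps a single hallucinated label from silently entering the coded dataset.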