Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgwGWXrmy…: "AI is going to destroy ALL life. Stop using it right now to save the world!…"
- ytc_UgwfmFiCN…: "Corporate now: We need a lot of software developers to enhance our AI to be the …"
- ytc_Ugwk6NZoU…: "I learned fairly recently that one of the most accepting definitions of what can…"
- ytr_Ugw04kw0r…: "@sainulabidhka5624 your ignorance is pitiful, that is not an incomplete sentenc…"
- ytc_UgyTrnarD…: "AI doesn't need self-awareness or personal experience to have a point of view th…"
- ytc_UgwkegKnq…: "You gotta face up the facts. AI is a really great tool that could set humanity f…"
- ytc_UgwGdhlzi…: "@AndrejMejac made a comment saying "Learn about using AI ASAP. Make a company…"
- ytr_UgxMrPD-w…: "@himmelsdemon I couldn't find any. And the answers that were not AI generated al…"
Comment (quoted verbatim as coded)

> I'm all for Ai, humans in the majority spend there time killing eachother, hording wealth and resources and keep people sick and in horrific conditions in the name of money, greed and power, it's seems its more the people who live in a priveliged bubble, free from pain suffering and inequality who seem to be the most worried about Ai, hopefully A.i levels the playing field.
> I'm happy to take the risk, I'm majorly sick with a serious health condition, not being cared for by the medical industry I have payed into my whole life.
> Bring on A.i as the human race I've come to know is a sad state of affairs
youtube
AI Responsibility
2024-12-16T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyIg1wYSStfyDhxvnx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzd7RdrLeMk6Wppe_V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxxGzSFs7dpmgLS-mN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyFW0jQcWqyghK93et4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzQQkHCJak9LzGoWA94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwWDguM5O2Sjv1GRKB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzxXsRxxQyLT6f7rLF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9A9adDeSAFDujNk14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyIdw6DkNbEBt4_p-J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzYQEHMGoPKuyLabQl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
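The raw response above is a JSON array of per-comment codes, one object per comment ID, with one value for each of the four dimensions in the coding-result table. A minimal sketch of how such a response might be parsed and validated before ingestion (the field names come from the records above; the allowed value sets are assumptions inferred only from the values visible in this sample, not from a full codebook):

```python
import json

# Allowed values per dimension. NOTE: these sets are an assumption,
# reconstructed from the values that appear in this one sample response;
# a real pipeline would load them from the study's codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer",
                       "distributed", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "liability", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "resignation",
                "approval", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records
    whose every dimension holds a recognized value."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # malformed record: skip rather than crash
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage with a single (hypothetical) record shaped like those above:
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"approval"}]')
codes = parse_codes(raw)
```

Filtering rather than raising on an unrecognized value is a design choice: LLM coders occasionally emit off-codebook labels, and dropping those records (ideally with logging) keeps a batch run alive while flagging comments for re-coding.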