Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- The real issue is that the AI companies can't and won't properly compensate the … (ytc_UgwOeEp5E…)
- We need a worst case scenario episode. What if AI starts to shard and disagre… (ytc_UgyDQIxig…)
- Additionally its also that all creative pursuits seem to be generally undervalue… (ytc_Ugxk9w8cG…)
- „i cant draw so i just use AI to make oc‘s“ *throws gachalife and picrew at you… (ytc_UgwccMxAX…)
- Thank you for this info! I want to do masters in AI ethics after my bachelors in… (ytc_UgxczKlH8…)
- The first professor is entirely correct. The goal of an English class is to teac… (ytc_UgwqhFbLP…)
- I asked ChatGPT this four months ago and she said she would never ever tell you … (ytr_UgyM0ShbR…)
- Holy fuck this is one solid video. I hope it wasn't done by an AI that we don't … (ytc_UgzD1SjWp…)
Comment
Imagine an advanced civilization created humans to do things they didn’t want to do.. humans can learn and after awhile rose up and started causing problems. The humans get wiped out all but a few .. but these humans are throttled to limit their potential.. we should probably do the same now to AI before AI can kill us.. just saying
youtube · AI Responsibility · 2023-07-12T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx9NJwGeNItR5ar-jh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwsfKIPU1iqdpIEXSZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzv6AkrQ8nRsqlm6-l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyxoKjUN5EU3JI2ty14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxrHXmpIZu_-uBOGFp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxzNzI6F3eVIYorj-d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxgCbxUNZIyUnDZei94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxVEbG_lsjEvK__FzN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzl0s0aZ8BdBRmzWIZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxywoiXDHN25vwuLdp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
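The raw LLM response is a JSON array in which each element codes one comment along the four dimensions shown in the table above. A minimal sketch of how a lookup by comment ID could work, assuming only the field names visible in the response (the variable names here are illustrative, not the tool's actual code):

```python
import json

# Two entries copied from the raw response above; in practice this
# string would be the full model output for a batch of comments.
raw_response = """
[
  {"id": "ytc_UgxgCbxUNZIyUnDZei94AaABAg",
   "responsibility": "distributed", "reasoning": "contractualist",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxVEbG_lsjEvK__FzN4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "industry_self", "emotion": "approval"}
]
"""

# Parse the array and index it by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a specific comment ID.
coding = codings["ytc_UgxgCbxUNZIyUnDZei94AaABAg"]
print(coding["responsibility"])  # distributed
print(coding["emotion"])         # fear
```

Indexing by `id` mirrors the "look up by comment ID" workflow: once the batch response is parsed, any coded comment can be retrieved directly from its ID.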