Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_Ugz-onM-j…`: I'm disabled and was an artist. I had to stop as my disability got worse. No. AI…
- `rdc_jihjffx`: It sounds like Hinton feels he's been misrepresented in the media. >In the N…
- `ytr_Ugyj4adA2…`: People are using a language model and believing everything it says when they're …
- `ytc_UgxYXBZoz…`: I really don't see how they'll keep humans hostage or kill us , they will always…
- `ytc_Ugzs-pSNp…`: No matter what they tell you 2 things cannot change - 1. Autonomous "drivers" wi…
- `ytc_Ugzhq0Ie5…`: as a professional in the mental health field, my job is unlikely to be taken by…
- `ytc_Ugy2B_Gx1…`: Nice targets to be destroyed when humans can’t agree on peace. It’s all too eas…
- `ytc_UgzhVoE_6…`: I want Fully Automated Luxury Space Communism. I do not want Fully Automated Dy…
Comment

> There simply has to be a manual override or way to terminate an AI like HAL 9000.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Timestamp | 2025-05-23T04:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzaIMIHReEDMImKywl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxEHMeks5sr2FIgBhl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxao9NwHv-iNsnnlr54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwGXRiF1_7tZOnexs14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz-FSleuiO0i5ih5uV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxqXbkevsi6Dtas9-d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugwhxvnmg7mXiXfrV0l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw1SLU2BSm531PgowF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxDp0uqlJbhTHnzp554AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyGNZhOxJN4kYmxC3V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
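Because the raw response must be machine-readable JSON with one record per comment, a downstream coder typically parses and validates it before accepting the codes. The sketch below shows one minimal way to do that; the allowed category sets are inferred from the values visible in this output, and the real codebook may define more values, so treat them as assumptions.

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# page's output — the actual codebook may include additional categories.
CODEBOOK = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"ban", "liability", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "resignation", "unclear"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

# Hypothetical one-record response, for illustration only.
sample = ('[{"id":"ytc_x","responsibility":"developer",'
          '"reasoning":"deontological","policy":"liability","emotion":"fear"}]')
print(len(validate_coding(sample)))  # → 1
```

Failing fast on an unknown code (rather than silently keeping it) makes malformed or hallucinated LLM output visible at ingest time instead of skewing later tallies.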