Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Why does the AI want to resist being shut down? The mere fact suggests sentience…" (ytc_UgxTKOOGh…)
- "You don't freaking need Ai I like that. The current form of AI is good enough. S…" (ytc_UgxG5fs5L…)
- "And Musk has just transplanted an AI chip and processor into a human brain. God …" (ytc_UgwYd4blk…)
- "This is the reason why Elon Musk doesn’t want AI to be improved further and furt…" (ytc_UgzN_jY3p…)
- "Oh yeah. Teach them to be afraid of AI just like you did teaching them that the …" (ytc_Ugzy1IQ8B…)
- "Company: So we investigated our own company and we did not find any error or iss…" (ytc_Ugyjic9ms…)
- "apparently the person in power referred to here, such as the presidemt of the US…" (ytc_UgxfybwIM…)
- "Robots can do noting to stop climate chance that is going on for 4 billions year…" (ytc_UgxEFXH56…)
Comment
Providing a UBI would be a terrible idea, as those people who have jobs that AI cannot yet replace would have little to no reason to continue, and those people coming into adulthood would have no motivation to go into those same professions. It would also devalue the currency in which the UBI is given.
And I would not want to think what might end up happening to people who work on improving the AI
Source: youtube · AI Governance · 2025-07-12T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzB9uWXos9-6lcewE54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyc-Lc6zSYZOoKBzg14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgybX4WVp-ihhgwECsl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx4-HD34tiApg-plEB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyRqr3lMZijVGX9xd14AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyvNI52eSUjwRCJVWp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzc6LBTCsd_N0XB8JF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxdbfJG-3_TFYZM8dF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyAK_XAjHQKAsSbVLB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxEB8Hp8GCU4lkdyRB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
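The raw response is a JSON array with one record per comment ID, each carrying the four coding dimensions from the table above. A minimal sketch of how such a batch could be parsed and validated, assuming the value sets inferred from the samples on this page (the allowed categories are an assumption, not an official codebook):

```python
import json

# Allowed values per dimension, inferred from the samples above.
# The real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: dimensions}."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        dims = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in dims.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = dims
    return coded

# Hypothetical single-record batch for illustration:
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_batch(raw)["ytc_example"]["emotion"])  # fear
```

Validating each record before storing it catches the common failure mode of batch coding, where the model emits a value outside the codebook for one comment in an otherwise well-formed array.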