Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
cant they allredy make a robot look anybody ,kind of like a wax museum reound th…
ytc_UgxvSpcXf…
It’s absolutely not the same thing. I’m very interested to see how the Disney vs…
ytr_UgwK7K_6n…
I remember one of your previous videos and how passionate you were about your di…
ytc_UgyeoiN-O…
Ok so, what if we make a "#" to boycott sora ai and other ai things #artistarise…
ytc_Ugzf6786o…
Why they tryna make things complicated. Just look at pictures yourself. Not an A…
ytc_Ugwvz4_1F…
In my opinion The only ai art that would be good (I don't think this is current…
ytc_UgxrTs7qH…
AI is going to cause everything to crash .While big corps are focusing on saving…
ytc_Ugy38QYL_…
How are these fossils going to regulate AI when they don't even know the differe…
ytc_UgxoVIsuO…
Comment
The fear isn’t that AI will disobey us, it’s that it will do everything that we ask it.
How much worse will the already abysmal wealth gap increase because of this technology? How many jobs will be lost? In our current society, where having a job is tied to healthcare, shelter, transportation, and well-being, how is this going to help the working class?
I’m hoping I’m wrong and Neil knows what he’s talking about.
youtube · AI Responsibility · 2025-11-20T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgwwkOrKX0I4sRA41CR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxm7KXDV7-nrm_5Z6t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx_82_QfTn6Ocubhzl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyjmKWTRi30rqyxy3J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyQ0XRkZyiufKB2oIl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz0RJQYGSKSNgCHAZ14AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxXT7PwZfQGZxNsNJd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyeDXD9x-_lVhRFFkV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzCgRbX6txKqEBJvr94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyVUL5IUdPLBXoO7iJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]