Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "Shoutout to people not reading the article and saying 'lol xd le women r dum tha…" (`rdc_e7irmvj`)
- "Someone will do something bad without meaning to. As soon as AI has network acce…" (`ytc_Ugx41ZsTb…`)
- "Why are you putting the burden of this solution on the ARTISTS instead of on the…" (`ytr_UgwPsUM-z…`)
- "Damn, sam. When you were talking about a.i. I felt like you were about two sente…" (`ytc_Ugx82NwLY…`)
- "Geoffrey Hinton has strong views about Musk and others.. And views his own views…" (`ytc_UgzIPetra…`)
- "Humans desire to control AI, and by definition, control each other, is I suspect…" (`ytc_UgzT0uJ_A…`)
- "It definitely looks better than some women since they started blowing up their l…" (`ytc_UgzgSeZrr…`)
- "It's an interesting design choice, isn't it? Sophia's appearance aims to make he…" (`ytr_UgwuY_4en…`)
Comment

> I would reckon AI will replace some jobs, not all. It all has to do with reliability and liability. AI hallucinates too much, that needs to be fixed before we automate jobs with AI. Managers like to have legal cover, with people you can blame an employee for mistakes. If an AI is to be blamed, the manager only has itself to blame, Plus companies need to have their own in-house AI technology bc they can't allow their data to become open source or be used as training data for open source. Because AI companies are not gonna become closed source. This all takes time. 10 years before mass hysteria sets in.

Platform: youtube
Posted: 2026-01-12T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwJMg1KVor17YnYxhV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzXV2DoU8xnCD9NFYh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyBKNh-BJ8WurUWpAh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwyhVHYVmEJ5Jr7Q4B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzdwTFU6nQtIHbuCuZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwR_q7lREvpRWzfMxh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwVxWDrQpTP3LjKpNV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwD0ZCzQXQZ_pk-yMZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwv4dVgrKZQeNXM8P14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugxh8_n_e592HcUT8QR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
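A per-comment coding result like the table above can be recovered from a batch response of this shape by parsing the JSON and validating each record. A minimal sketch in Python, assuming the category values seen in the sample are the complete sets (the real coding scheme may allow more; `parse_codes` and `OBSERVED` are illustrative names, not part of the tool):

```python
import json

# Category values observed in the sample response above; the full
# coding scheme may define more (assumption for this sketch).
OBSERVED = {
    "responsibility": {"none", "user", "government", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "fear", "outrage",
                "mixed", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    rejecting records with missing ids or unknown category values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]  # KeyError here means the model dropped the id
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in OBSERVED}
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["policy"])  # industry_self
```

Validating before storing is what makes the "look up by coded comment" view reliable: a hallucinated category or a dropped id fails loudly at ingest rather than surfacing as a blank dimension later.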