Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
That last bit there. It's just a subversion of democracy what he proposes. Companies are not developing AI for the social benefit. They are developing it for a mega payday. And the plutocrat notion of what is "right" and "good" is usually what is right and good for plutocrats. Not pretty for the average joe. They will always find a way to justify however little benefit is extended to their fellow man or woman. If any.
Source: youtube · AI Responsibility · 2023-05-17T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgymHOjzA0iWIMHYcKt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugyj_HCQmpGyite4kPB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyBsgAPjSKLUrfKqXF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyFGbvMtIw0r4hReeJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgysSWXtnABzF8z1-994AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyWN92hQ4lfgAoIb7F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwLyPOsTlDcWkXefMp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy59YBMlqJ5ROM7Xbl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwwlq6rM7YxpxXgPS94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyPB1Gd027U70Enya94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
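The raw response above is a JSON array of per-comment codes across the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and indexed by comment ID, assuming only the field names visible above (the `index_codes` helper is illustrative, not part of the tool):

```python
import json

# Two rows excerpted from the raw LLM response shown above.
raw_response = '''[
{"id":"ytc_UgyBsgAPjSKLUrfKqXF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy59YBMlqJ5ROM7Xbl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"}
]'''

def index_codes(raw: str) -> dict:
    """Parse a raw batch response and index the coded rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_codes(raw_response)
# Look up the coded dimensions for the inspected comment.
print(codes["ytc_UgyBsgAPjSKLUrfKqXF4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one pass over the batch, then constant-time retrieval of any comment's codes.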