Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "Got several ads for use cases for AI during this video, clearly AI thought this …" (ytc_UgyKnvLij…)
- "Imagine if the students work together to hack the AI and make the head an always…" (ytc_Ugw3LZsNB…)
- "humans didn’t stop playing chess when computers past us by far in skill. That wa…" (ytc_Ugxuail3K…)
- "I've met Andrew. I talked to him about Silicon Valley culture and big tech. He k…" (ytc_UgwgyUiLg…)
- "People using that stupid “cruise control “ marketed as autonomous driving are ho…" (ytc_UgxC4VrzZ…)
- "Ai will rewrite the past / We are heading for a reset like the great flood . / We …" (ytc_Ugzb1YzGU…)
- "They are far less united because tbh the west have been exploiting them for deca…" (rdc_jy1m3sr)
- "People:he got 0 right / Me:he switched places of ai and real / We got ai generated s…" (ytc_UgwZyTgiW…)
Comment
@tullochgorum6323 Yeah that's the fundamental issue with AI. The only good ones are the ones that were specifically designed for a particular purpose in what could be considered hard fields. Like medicine or law. However, for the more softer fields you start running into problems. And AIs designed to do everything will be good at nothing without robbing from human creators. And given just about everyone has been robbed by AI, all AI can do is rob from itself. Which may lead to more problems depending on what that AI is used for. I mean given that Grok had an infamous moment recently before Musk decided to goonerfy it, which paradoxically went against his previous worries about population decline, I'm getting the feeling that all the people praising AI for being reliable will end up eating their own words when AI inevitably ends up suffering from enshittification.
Platform: youtube · AI Responsibility · 2025-10-10T17:2… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgyojNJPB3giiydQBTJ4AaABAg.ANnVaZj6_BOANpK8NeK2cq","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwVNneL9Vt0yjDtiFh4AaABAg.ANn51nyqgr8ANqKI2Nu0dx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgwVNneL9Vt0yjDtiFh4AaABAg.ANn51nyqgr8ANtMYTmeTHp","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwVNneL9Vt0yjDtiFh4AaABAg.ANn51nyqgr8ANzi4btHsye","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgwVNneL9Vt0yjDtiFh4AaABAg.ANn51nyqgr8AO7AWnf91cJ","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugwgg8KvEN5yUMki9_p4AaABAg.ANmkntdR1KZAO1Qplzpk3W","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugwgg8KvEN5yUMki9_p4AaABAg.ANmkntdR1KZAO1RIurk1iN","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugwgg8KvEN5yUMki9_p4AaABAg.ANmkntdR1KZAO6HVgqVNLr","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgyRVGdr_n8bFiUShg94AaABAg.ANm8cb88pmVANod7Kv4UWY","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_Ugx9LtsxGjLOafbYKZ94AaABAg.ANm3pa7pW6LANon-Q7XYMJ","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
```
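The raw response is a JSON array with one record per comment, carrying the four coding dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and sanity-checked before being stored, assuming the allowed values are those observed in this batch (the real codebook may define more categories; the function name `parse_codes` is hypothetical):

```python
import json

# Allowed values per coding dimension. These are ASSUMED from the values
# observed in the sample above, not a definitive codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"unclear"},  # only value observed in this batch
    "emotion": {"mixed", "fear", "outrage", "resignation",
                "indifference", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the codes by comment ID,
    rejecting any record with an unexpected dimension value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a one-record response (hypothetical ID):
raw = ('[{"id":"ytr_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"unclear","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytr_example"]["responsibility"])  # developer
```

Indexing by ID is what makes the "look up by comment ID" view possible: each coded record can be joined back to its source comment.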