Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Isn't it the opposite though? You just said AI takes their knowledge from thousa…" — ytc_Ugzq_MI7s…
- "Since we humans can't get past outdated democracies and such type of social situ…" — ytc_UgzjwGkO0…
- "That an AI corp (Microsoft) is actually ending up getting some users to break ND…" — ytc_UgxE-VV9D…
- "In the US, businesses are allowed to deny service to anyone (so long as it’s not…" — rdc_ohxzikx
- "Glad to see the majority of comments being realistic. AI isn’t going to benefit …" — ytc_UgwipAibj…
- "Imo we are already under control of AI,computers,tech or whatever you want to ca…" — ytc_UgzukDpM8…
- "Not worth it. Instead of people getting excited about AI, it brings more worry. …" — ytc_UgyX4r0fE…
- "NBC reporters do a fine job blurring the line between real and fake every day- w…" — ytc_UgzNXfQFu…
Comment

> AI will not replace anything lol. Let's just differentiate between reality and corporate hype where corporations lie in order to make more profits. AI, for those who have studied it, has a fundamental problem which is hallucination. Also, it is not creative and needs data training to extract patterns to follow. Without that, AI is not meaningful in any sense. All the articles of "AI is becoming sentients" are nonsense, unproven, corporate hype, and deliberate lies.

youtube · AI Responsibility · 2026-03-19T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx_w3gupnaqWxLCw-54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyMTdEYhGDBa1U-i2R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzEyDLZwR3e8NGg02B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgycR16IkMQHLlVk2hl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNHo5KnZeRQRmEvBx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyVyM2sxpbwudFfblh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugykc3RO2ljbzK2eCuZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzzyz0fFqHEZ-g5Trd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxuk6MUPDWc1dWHonN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwaH5y5I8JVz2_Uish4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"resignation"}
]
```
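A raw response like the one above can be consumed by parsing the JSON array, validating each row, and indexing rows by comment ID for lookup. The sketch below is a minimal, hypothetical example of that step: the schema (an `id` plus four coding dimensions) matches the response shown, but the allowed label sets are only the values observed in this one response, not necessarily the codebook's full vocabulary, and the two rows in `raw` are copied from the response above for illustration.

```python
import json

# Allowed values per dimension — assumption: only the labels observed in
# the raw response above; the real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "user", "developer"},
    "reasoning": {"mixed", "consequentialist", "unclear", "deontological"},
    "policy": {"unclear", "regulate", "none", "liability"},
    "emotion": {"indifference", "outrage", "fear", "approval", "mixed", "resignation"},
}

# Two rows copied from the raw LLM response shown above.
raw = '''[
{"id":"ytc_UgyMTdEYhGDBa1U-i2R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgycR16IkMQHLlVk2hl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"}
]'''

def index_codings(raw_json: str) -> dict:
    """Parse a raw response and index rows by comment ID,
    rejecting rows with values outside the allowed sets."""
    codings = {}
    for row in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim!r} value {row.get(dim)!r}")
        codings[row["id"]] = row
    return codings

codings = index_codings(raw)
print(codings["ytc_UgyMTdEYhGDBa1U-i2R4AaABAg"]["emotion"])  # outrage
```

Indexing by ID mirrors the "Look up by comment ID" affordance above; the validation step guards against a model emitting a label outside the coding scheme.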