Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
| Comment ID | Comment (truncated) |
|---|---|
| ytc_UgwxWPJ_6… | No, Ai destroying humanity is sci-fi clickbait, it’s dumber than people who were… |
| ytr_Ugwq_MRtZ… | I'm sorry to hear that you didn't agree with the content of the video. If you ha… |
| ytc_Ugxts0TWn… | I've only ever used AI for art as inspo. I'd have some very specific idea stuck … |
| ytc_Ugzu3Q6gf… | Wikipedia pulled the plug on using AI for content creation because it made too m… |
| ytc_UgyzFdh5p… | i would love to see AI enslaving humans with brutal and wild violence of an auto… |
| ytc_UgxADLyZp… | i asked snapchat ai questions, it gave me stupid answers, i told it i wouldn't u… |
| ytc_UgwHXoczY… | Universal Basic Income. Next trend the Next Big Thing. Ai can make it possible. … |
| ytc_UgyWbXLhC… | AI extinction is not 1:100. According to researchers it's somewhere between 15% … |
Comment

> Consent is a big thing. Humans know when they need consent to use art for inspiration, and know when to ask for it. AI does not. Besides, part of what makes art art, is the emotion and intent behind it. AI has none. Artists know the different between being inspired by something and copying something, AI does not, not can it apply inspiration, it can just copy and paste.

youtube · AI Responsibility · 2023-01-12T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxm0t0ZI0Xc72jEKFx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxp3IfXGj4fB4KI6qB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzlunT35mztAlu9PmR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxJCN7TWWvA5NeGCdJ4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzh2U36vX2JSOTZHgV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgymHrgBHP2PJN374b94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwOcptUilcQFORvy7p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzCO3ES8Gz30Jt4JR94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy59wryd6kPwjXlY6t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxUGP4LObWCiK4C7g94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
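The raw response above is a JSON array with one record per coded comment. A minimal sketch of how such a batch might be parsed and validated before lookup by comment ID — the allowed value sets here are inferred from the visible samples, not taken from the actual codebook, so treat them as placeholders:

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above — the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"resignation", "fear", "outrage", "approval"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the (assumed) allowed set, so bad model output fails loudly.
    """
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {value!r}")
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With the batch indexed this way, "look up by comment ID" is a plain dictionary access on the returned mapping.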