Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):

- "You BETTER Have a KILL SWITCH REMOTE. For ALL A.I. with a WEAPON . or WERE DONE…" (ytc_UgxtkagFH…)
- "@lenonxay2011 So just say it's a deepfake, dumb dumb. It's so easy to disprove. …" (ytr_UgxYU1LYk…)
- "I saw a comment on another video that basically went along the lines "if you use…" (ytc_Ugxm2w06R…)
- "Guys, are you really happy about a small group of multi-zillionares throwing the…" (ytc_UgxSzpRjY…)
- "Both fortunately and unfortunately, AI will not kill my job any time soon. Even …" (ytc_UgwAwHtJ-…)
- "It's unbelievably frustrating to hear these people claim this is "pure evil". Ch…" (ytc_Ugx1gd7fb…)
- "Don't agree ai is just a tool if you against it then stop using brushes or moder…" (ytc_UgxjQfEpM…)
- "Everyone reading George's 1984-"that's so scary and could never happen / Every on…" (ytc_UgyaW3Kfw…)
Comment
To everyone who thinks social media companies shouldn't be held accountable for creating addictive applications: On one side of the screen, there's you. On the other side of the screen are teams of hundreds of algorithmic and UX researchers, studying every interaction you have with the app in order to squeeze out an extra second of screen time from you. If you have self-control to beat them, that's amazing. But what about the younger generation? Should we really bury our head in the sand and just blame their parents for bad-parenting while these companies invest hundreds of millions of dollars into coming up with more addictive technology to get these kids hooked? This can end really badly. It's time to be proactive and hold these companies accountable. We need algorithm transparency.
Platform: youtube | Posted: 2026-04-13T20:3… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwlEpRGghKRQ630H6p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyyFr6hAjdTYTKLBot4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyUCs_HZ5U5Z6r72OJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyxBe2ZFOlJBsrSjkd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyRegkhH0A2xDE2HU94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzsuYvfz3PoQTyQEdF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxa1cJXY0Jh4Hivk294AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzme0S2vh3h_qvr7wN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyEGgHs1DwLhb_J1rt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyusRp2tOmOd6tEl6B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
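Because the raw response is a JSON array of per-comment codes, it can be machine-checked before the values reach the coding table. The sketch below is a minimal validator, assuming the allowed category values are exactly those observed in this sample output (the full codebook may define more labels); the IDs and helper name are hypothetical.

```python
import json

# Category values observed in the sample batch above.
# ASSUMPTION: the real codebook may include labels not seen here.
ALLOWED = {
    "responsibility": {"company", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes are recognized."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row needs a string comment ID plus one value per dimension.
        if not isinstance(row.get("id"), str):
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical batch: the second row uses an unknown responsibility label
# and is dropped by the validator.
raw = """[
  {"id": "ytc_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_example2", "responsibility": "alien",
   "reasoning": "unclear", "policy": "none", "emotion": "fear"}
]"""
print([row["id"] for row in validate_batch(raw)])  # only ytc_example1 survives
```

A row that fails validation could instead be queued for re-coding rather than dropped, which matters when coded counts feed downstream statistics.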