Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- Wait so the predictive policing program was right, he was involved in a shooting… (ytc_UgxGWTGBA…)
- I don't know why they always want to make robots look human. I prefer they make … (ytc_UgwFy6Mvv…)
- I would rather see an AI be used for a third opinion after two human specialists… (ytc_UgxzRz16b…)
- AI art using actual art pieces as references for their generation is in no way t… (ytc_UgwMT23Ca…)
- Bro, if any AI is watching this in the future I have nothing to do with this man… (ytc_Ugx5IhEhw…)
- Would agi survive without humans though? And for how long could they live? Like … (ytc_UgygUNUhu…)
- I don't get it-- since when is "I expected the machine to compensate for my tota… (ytc_UgyJHgm7E…)
- @stevemoeller4522already have called and emailed my local representative and at… (ytr_Ugy4hRA_M…)
Comment
I don't think AI is all bad *BUT* then there is _people_ and what people are using AI for.. Well - as usual.. "That's why we can't have nice things." - AI needs to be taken with a grain of salt not because of its capabilities but rather because of what people abuse it for..
youtube · Viral AI Reaction · 2025-11-30T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxA0R1maaw2d8vKVXV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzS-4v_Wle_q0HwUxN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzFmFHIxzBhT24VCe54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxaRCTgeidWf22UyG14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwkS7sJmoMyKu1WQNl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxnIvRLKtoIAcA2QOh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztFeyXP8mOwLiODNB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgypaCX8jQK49J_UaNx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxjSOYAbOTXnu0WcPV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxcjnOzrc8GKfeYkdV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
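The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a payload could be parsed and validated, assuming field names and allowed values inferred only from the sample output above (the tool's actual schema may differ):

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# This is an assumption for illustration, not the tool's authoritative schema.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"resignation", "approval", "outrage", "fear", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: dimensions},
    skipping entries with missing or out-of-vocabulary values."""
    coded = {}
    for entry in json.loads(raw):
        dims = {k: entry.get(k) for k in ALLOWED}
        if "id" in entry and all(dims[k] in ALLOWED[k] for k in ALLOWED):
            coded[entry["id"]] = dims
    return coded

# Usage with a hypothetical single-entry payload:
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"resignation"}]')
print(parse_coding_response(raw)["ytc_example"]["emotion"])  # resignation
```

Validating against a fixed vocabulary catches the common failure mode where the model invents an off-schema label, so malformed entries are dropped rather than silently stored.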