Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "My biggest problem with podcasters like Ezra Klein (and his buddy Derek Thompso…" (ytc_UgxQzxUlB…)
- "They would have automated you at the first moment they could if they had known h…" (rdc_hkfs07f)
- "I hope John remembers this as an AI-controlled robot is crushing his skull. AI d…" (ytc_UgwRViyy9…)
- "It would be a shame if people started replying to all of his campaign's AI gener…" (rdc_lix6ika)
- "actually AI seems to be a nice tool to challenge artists. come up with something…" (ytc_UgzoO1lmT…)
- "*as an artist myself. I would be mad at you only if you're blaming text based AI…" (ytc_UgxDSblOx…)
- "As a programmer I can't see robots or Ai being harmful to humans. Programming is…" (ytc_UgilP4I0e…)
- "i cant tell if this is a real human made 3d animation or an ai generated animati…" (ytc_UgyU0OOIG…)
Comment
This Eric S. is ridiculously stupid to actually sitting there self confident saying the AI won't take our jobs and also at the same time being the biggest game changer ever in the history
of the human mankind, and also destroying themselves by the AI at the same time! People are already playing out all lonely people in the world, creating AI girlfriends/boyfriends instead of actually go out being social! Playing on every week little strain on the instrument! Believe me, this is going to get ugly within 3-5 years! I hope people are listening and realize this is
Source: youtube · 2026-04-25T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwXetowD9AwIcxCAOZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyuk2VPE2J1cbL7eqF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy81a5z7z19CMiDINJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
  {"id":"ytc_UgxTAWp6cO1CWPEHpsF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxEXmwoThf2ZKjHWBV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzpRUj8MYw3dxo47Al4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxsQJHoXOAMfQAgkBN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzJaD85_2JzkUJxIZF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw5TvlBeCNMD-yBZ1h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx7M6yGkMXpPB9MMN14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
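The "look up by comment ID" step above can be reproduced offline. Below is a minimal sketch, assuming the raw LLM response is a JSON array of records with the fields shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function name `index_by_id` and the embedded two-record sample are illustrative, not part of the tool.

```python
import json

# Illustrative sample mirroring the raw response format shown above
# (two records copied from the batch; not the full response).
raw_response = """
[
  {"id": "ytc_UgzJaD85_2JzkUJxIZF4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw5TvlBeCNMD-yBZ1h4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgzJaD85_2JzkUJxIZF4AaABAg"]["policy"])  # -> regulate
```

Indexing by `id` makes the coded dimensions for any sampled comment a constant-time lookup, which is all the inspection view above needs.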