Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgxhD-BEm…: "Quiz Answer: A - Sophia is the first robot to ever be given a citizenship. B- Ko…"
- ytc_Ugy1FWZ7o…: "I think The inner voice telling him that his neighbor is trying to poison him co…"
- ytc_UgyTv1eVo…: "7:02 making a driver responsible for paying attention odd not the manufacturers …"
- ytc_UgyoASftd…: "There needs to be a required disclaimer for any video that includes AI. Those th…"
- ytc_Ugw3QMxIr…: "I mean most of them wouldnt be harmfull in the right context like asking what me…"
- ytc_UgwfHYti-…: "Ai detectors piss me off more than AI. These things ARE AI and are telling me th…"
- ytc_UgyKRXXS5…: "Its just marketing technique by elon to create hype current ai is not on a level…"
- ytc_Ugy5oVLtw…: "It was the same with car manufacturing, and probably a whole load of other indus…"
Comment
> Humans never found moderation to be satisfying. Life will find a way out, as they say. The one good thing that might come out of this is probably humanity unifying to face AI doing everything efficiently, even if we don’t reach ‘Terminator’ level threats before we destroy ourselves by climate change.

youtube · AI Governance · 2025-08-05T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwybKaa44ASv_3scfJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwyvxRkfhEbVZgN-0F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwyeX0PHz4_pToeLsN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz4QWD1_5wSvPyHOz54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwweEVXyikQt7EkvnF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy9kxgssC7uRpmsQHx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz8WlaqcvYAj7bkDKx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxNBVDmbExrztIZMO54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxLP982bcAr0uk7Gqp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx7GvMoRoRMVgDtRFZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
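The raw response above is a JSON array of per-comment codes along four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such output might be parsed and validated before use; the allowed values are inferred from the responses and table shown above, and the full codebook, the function name, and the `ytc_` ID convention are assumptions:

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the coded output above
# (the actual codebook may define more values; extend as needed).
SCHEMA = {
    "responsibility": {"developer", "company", "government", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop records whose ID doesn't follow the ytc_ convention seen above.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Drop records with any out-of-schema dimension value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# One record from the raw response above, used as a smoke test.
raw = ('[{"id":"ytc_UgxNBVDmbExrztIZMO54AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
codes = validate_codes(raw)
emotions = Counter(rec["emotion"] for rec in codes)
print(len(codes), emotions["resignation"])  # → 1 1
```

Validating against a fixed schema like this catches the common failure mode where the model invents a label outside the codebook; such records can then be re-queued for recoding rather than silently aggregated.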