Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "There was a kid at my high school a few months ago who got suspended for deepfak…" (ytc_Ugw0gV_7v…)
- "You can't tell me he didn't get AI porn of his colleague on accident. That's jus…" (ytc_UgwaPtV6B…)
- "The only reason that companies want AI is to increase profits for humans. It's i…" (ytc_UgyzMevyx…)
- "It was like this before AI in STEM so not much has changed lol. Schools, IVY or …" (ytc_UgxJySar6…)
- "If we don’t talk about deworming to eliminate all the immensely unnecessary medi…" (ytc_UgxczOdfI…)
- "What made us think we are not already living in a universe created by the stage …" (ytc_UgwGXRiF1…)
- "I asked AI… how many school shooters % wise are white- it said the number isn’t …" (ytc_UgyW7JFx8…)
- "If you’re dumb enough to fall for ‘train your AI’ put your halloween suit on ‘ y…" (ytc_UgweCiqF8…)
Comment

> I can't believe people don't understand how wrong this is. AI will never have "Consciousness" so it can't be good or evil. If AI does something bad it was programmed or prompted to. But it will absolutely be used more for fraud than for good.

youtube · AI Governance · 2025-07-31T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwmvKlgMntJmBTJ7Ad4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwqClAY1lhmt-0LObJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgziXugU3Ufv6oMgz2l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzUugiAXM43hbbmJhZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxi4IGBb6wiy8PhQBh4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxctBYCdW_zD-txoEx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwJ051cfcgiPJt2TL94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzjDBInmOTX9h1YGah4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyhYAiAy5pLis3BJPF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxcXQz9eM7b2I4j5Vh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
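Downstream tooling can consume a batch response like the one above by parsing the JSON and rejecting malformed records. A minimal sketch, assuming the dimension names shown in the response; the allowed value sets are inferred only from the values visible here, and the real codebook may define more:

```python
import json

# Value sets inferred from the responses above (assumption: the full
# codebook may contain additional values not seen in this batch).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"liability", "regulate", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop records with no comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical one-record batch for illustration.
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"outrage"}]')
print(parse_codings(raw))
```

Records with an unknown value in any dimension are silently dropped here; a production pipeline would more likely log them for re-coding.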