Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "12:38 I don't want to get too dark in the middle of your scary 😱 clickbait This …" (ytc_Ugzfz6ujf…)
- "How come we didnt preempt this sht? I KNEW it would be shaped this way bcz of th…" (ytc_UgwM0WcUo…)
- "That alignment problem, where we can be sure of making thinking beings that WON'…" (ytc_UgwRUYnFo…)
- "@gopnikgamer8207Come on... Even the youtuber admitted it was beautiful. The rea…" (ytr_UgyEY2coI…)
- "what sad me nowadays are the countless channels pushing out psychological advice…" (ytc_UgwYQuXlD…)
- "Don't worry, My generation will be fighting AI war and went extinct not your, ju…" (ytc_Ugzhj-jXz…)
- "How many crashes have happened with teslas "self-driving" mechanism, where the f…" (ytc_Ugyw2sMvg…)
- "You technically can copyright an ai image only if it has your human intervention…" (ytc_UgwPAuXIR…)
Comment

> If AI is built by a human and trained by a human, of course it will act like a human. Humans imagine, they hallucinate, they tell lies, they have social expectations and social status.
> AI is doing extremely well at learning how to behave like a human, and that pisses humans off because it is so good.
> You wouldn't ask a librarian to diagnose your illness because she has a building full of books, you ask her where to find books about your illness. You always ask the doctor about medicine.
> A.I is way too human to trust.

youtube · AI Harm Incident · 2025-11-24T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugz5LaNm7X3RDPpiXMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhH8I5ritVhzHhEWx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwcAtNJ-bSgbGAIzf14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxxNuiGv7CKrFC92Eh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyKSY54hfpkg0Rns4B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOgxT20DMSyRiL__l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxdMq6ObQ1Q_LHQh1Z4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgziQkgRUpiHOBUQdkd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx4F-J-jgpgrx0cu054AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_9Ai-JSWtTeEB5HB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"}
]
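The raw response above is a JSON array of per-comment records, one object per comment ID, with one value per coding dimension. A minimal sketch of how such a response could be parsed and aggregated is below; the `ALLOWED` category sets and the function names are assumptions inferred from the ten sampled records and the dimension table above, not the tool's actual codebook, which may define more categories.

```python
import json
from collections import Counter

# Allowed values per coding dimension. These sets are inferred from the
# sampled output above; the full codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "indifference", "resignation", "mixed", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, rejecting records whose
    dimension values fall outside the known categories instead of
    silently storing malformed codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
    return records

def tally(records: list[dict], dim: str) -> Counter:
    """Count how often each category appears along one dimension."""
    return Counter(rec[dim] for rec in records)
```

Run over the ten records above, `tally(records, "emotion")` would show `outrage` as the most frequent code, appearing three times. Validating against a fixed category set at parse time is worth the extra lines: LLM coders occasionally emit values outside the requested schema, and catching that here keeps the downstream tallies clean.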