Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Also I think if he didn't come up with others would. Now would him going around …" — ytr_Ugx0j72VP…
- "If Humans drew less, and AI became a norm, that just makes every last piece of h…" — ytc_UgxFZdtv_…
- "The EU was expecting Americans to not vote Trump into office again, and now here…" — rdc_mcqchgb
- "Where are the prophecies of the living God while they speak on the creation and …" — ytc_UgxubFc26…
- "y, they put that AI at yr NUC then doing sabotage you and be time bomb for yr se…" — ytc_UgyL0g9H3…
- "Eric Schmidt, \"they correct it fairly quickly\", is not providing the comfort you…" — ytc_UgzQjbMTM…
- "Google has introduced a new AI model designed to help robots better understand a…" — rdc_ogwlpq1
- "Yall black women the one who supports AI now look they dun took yall job stupidi…" — ytc_Ugwhw8A14…
Comment

> I'm more inclined to believe LLM will be a dead end in the path toward AGI, these are machines of being that lack becoming. Of course they're very useful, and will help automatize a lot of stuff, but this kind of talk is not serious imho. You're projecting your own wishes of living through such an incredible technological revolution while you're young.

youtube · AI Moral Status · 2025-10-31T07:2… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxdXf7QoFmDGGOyNfN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxSjIu2Vl2S4XsDv854AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxxZukTmMl-JceLYTx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz9XpETftOZ7TaCXXt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwaW0zpxwYp_RN1up54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyNHO1SiatOYKKW7IF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyTolRgYrK8D5WL3bN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwYKo1CIjC9FJ_d8jR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugyhnt8LvpTm4dkAqqR4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzpvr7yPMYvQ1Pjdyd4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
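A raw response like the one above has to be parsed and checked before the codes reach the results table. The sketch below shows one way that check could work, as a minimal example: the `SCHEMA` sets are assumptions inferred only from the values visible in this sample output, not a confirmed codebook, and `validate_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the values
# that appear in the sample raw response; the real codebook may differ.
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed", "unclear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with a comment id
    and an allowed value in every coding dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # every row must be keyed to a comment id
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical two-row response: the second row uses an out-of-schema
# responsibility value ("robot") and is dropped.
raw = (
    '[{"id":"ytc_a","responsibility":"developer","reasoning":"mixed",'
    '"policy":"unclear","emotion":"resignation"},'
    '{"id":"ytc_b","responsibility":"robot","reasoning":"mixed",'
    '"policy":"unclear","emotion":"fear"}]'
)
kept = validate_response(raw)
print(len(kept))           # → 1
print(kept[0]["id"])       # → ytc_a
```

In practice a row that fails validation would be queued for re-coding rather than silently dropped, but the filtering step itself is this simple.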