# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Random samples
- "Not necessarily my friend. They're talking about something a lot smarter than us…" (`ytr_UgzSD9DMz…`)
- "If we don't urge the government to make strict policies around the % of human jo…" (`ytc_UgxBsXP7m…`)
- "its a trick question. its both of them, reality itself, and the pancakes in fron…" (`ytc_UgzRVSH3f…`)
- "The difference of A.I and human is life. A.I has no life and therefore will neve…" (`ytc_Ugy9ML5Xi…`)
- "The best part is all these artists are doing is bringing more fame to the AI pie…" (`ytc_UgxoMYui7…`)
- "They can track you online too. A.I can even identify you through your search eng…" (`ytc_UgwH6WE80…`)
- "I think this guy is a bit out of touch. And of course he could say he stopped wo…" (`ytc_Ugx5fA16Z…`)
- "How about instead of ai art, how about ai taxes, ai dmv visit, ai needlessly for…" (`ytc_UgzdMoWhC…`)
## Comment
14:00 Very comforting to hear that, not at all disconcerting.
Edit: It just feels a bit like we have the IT equivalent of a singularity at our hands. We do not know exactly what it *is* but we still try to use it. It's reckless.
Edit: This comment isn't a "OMG AI 2027 is true we're doomed" comment but a "Oh no, we allocate billions of dollars to building a tech that we don't really control or understand." This has also been a fear with genetics and nuclear science back in the day, and while they were often overblown they weren't unwarranted.
This feels worse for me bc the US especially just doesn't care about regulation in this regard. Neither the market dynamics nor the recklessness employed by the researchers illicit hope, and while I don't believe in HAL9000 ruling us there's a plethora of other ways this could fly in our face, even just economically.
youtube · AI Moral Status · 2025-10-30T19:1… · ♥ 29
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
[
{"id":"ytc_UgzXQDoG0C0LquHgCF14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwfjDggoc7slJUJxvN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyQuuXlK4Ljy7WoQCB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzWno767nWZhBfYPcd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy7t7R2SUYJnnO6qUd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyFOX5i3109Sdv6ljZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxVLbha4pRgsIC1gzp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyvOThYlzE_Z8WEtC54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzs4vcCZ_FVBwIsJ194AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxmSgNf0R1NLcYg0HN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"resignation"}
]
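A batch response like the one above can be checked mechanically before the values are written into the coding table. The following is a minimal Python sketch, not part of the tool itself: the `ALLOWED` sets are inferred only from the category values visible in this sample (the actual codebook may define more), and `validate_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above; the tool's real codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check each row against the codebook."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in the sample use the ytc_/ytr_ prefixes.
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment ID: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return rows

# One row from the sample response above, used as a smoke test.
raw = (
    '[{"id":"ytc_UgzWno767nWZhBfYPcd4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)
rows = validate_batch(raw)
```

A row with an off-codebook value (e.g. `"emotion":"anger"`) would raise `ValueError` instead of silently entering the results.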