Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "As someone thats heard part of the AI disaster & truly believes this can be extr…" (`ytc_Ugx6FcUoi…`)
- "Most don't have a clue yet of what's coming with the rise of AI. Especially in t…" (`ytc_Ugxfbj5ii…`)
- "For factory assembling why would we replace it by AI? We already got children fo…" (`ytc_Ugw1EZRae…`)
- "I will fight against artificial intelligence as long as I am on this earth, arti…" (`ytc_UgjhwwXIc…`)
- "It's not - it stifles innovation, and welcomes tech stifling command economy. Tr…" (`ytr_UgyBzkGdZ…`)
- "Every time AI says 'I understand your perspective' that's a nice way for it to c…" (`ytc_Ugypzdx3x…`)
- "I think it's more of like AI is still a giant unknown and depending on context e…" (`ytr_UgzUhPhEy…`)
- "If artificial intelligence nuked the world it to would die. How would it survive…" (`ytr_UgzNnA7jD…`)
Comment
Look, the AI apocalypse sounds fun… I was into the concept too… The AI Doc dropped and I was convinced… yet I think the optimists make very reasonable points here. The jump from our current models to super intelligence that is indifferent to human survival is bigger than the pessimists claim. The big qualm there is “indifferent to human survival”… why would it incidentally wipe us out if it was more intelligent than us?
Source: youtube · Topic: AI Governance · Posted: 2026-04-02T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzKfivHWSk0Dwdd_1d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz5oVq_dnYTOV5GvwJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_I89CCuX4lbHiYwl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy1FZyaz01FfMHXs0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxfMrz-hb4gOV-2xKd4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzGp1kL-p4RD6ag_VB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugznxqv7m5QAnbnbj2Z4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwB7BdoPtrsLRjoXJ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwwBc0zawLvxRx60Tp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwrQIzG6DaFPe13nzN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
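A raw response like the one above can be parsed into coded records and sanity-checked before it reaches the dashboard. The sketch below is a minimal example, not the tool's actual pipeline code; the allowed value sets are only those observed in this batch, so the real codebook may define additional categories (the check therefore warns rather than rejects on unseen values).

```python
import json

# Categorical values observed in this batch; the full codebook may define more.
OBSERVED = {
    "responsibility": {"none", "developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "mixed", "fear", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response: a JSON array of per-comment records.

    Raises on structural problems (not an array, missing keys); merely
    prints a note for categorical values not seen in this batch.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for rec in records:
        missing = {"id", *OBSERVED} - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} missing keys: {missing}")
        for dim, allowed in OBSERVED.items():
            if rec[dim] not in allowed:
                # Not necessarily an error: could be a codebook value
                # that simply did not occur in this sample.
                print(f"note: {rec['id']}: unseen {dim} value {rec[dim]!r}")
    return records

# Usage with one record in the same shape as the response above:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # mixed
```

Validating at ingest time keeps a single malformed model response from silently corrupting the per-dimension tallies downstream.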