Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "You're getting some hate, but I think you pose some interesting questions. I t…" (rdc_djzgptf)
- "This absolute nonsense. The superintelligence and robots "that can replace any h…" (ytc_UgxeNSg9F…)
- "Isn't humanities enslavement more likely than extinction? Would the slave-master…" (ytc_Ugx1fYEf0…)
- "Artificial AI has an artificial mind and consciousness, shared from humans, the …" (ytc_UgxkF8muT…)
- "@seanfaherty there’s some.. questionable.. art for sure, but I’d rather let the …" (ytr_UgzgOSQj5…)
- "the unanticipated effect of increasing the time of production because AI is choc…" (rdc_n5hjdpd)
- "If these A.I. ceos wreck the world economy, the People will take care of them no…" (ytc_UgxObpgAz…)
- "It's a sentient AI." "That's not possible. That's against policy!" Life finds a… (ytc_UgwKXnrEl…)
Comment
AI COULD be a normal technology. We COULD just be normal about it. But the problem is that we AREN’T. We’re taking evolution and how it made us, and putting it on EXTREME speed. And soon enough, we’re going to accidentally evolve something that takes over everything else at the expense of those things. JUST LIKE how evolution, without any intention to, created Homo sapiens — a species which has gone on to largely accelerate the extinction rate of other species, drastically altered the environment on a global scale, and is a massive evolutionary pressure, thus accelerating how quickly evolution happens, ALL ON ITS OWN. We do NOT need to be making another US.
youtube · AI Moral Status · 2025-11-02T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgxzZlWOFypP3fXmqbJ4AaABAg.AP19-RtHrAtAP1CTcndVM4","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwccqDfXKUFdMg788V4AaABAg.AP14Wa8ICubAP179pWhzgp","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwtMIIULS0qkUbxGxl4AaABAg.AP14Bbw8mduAP1nY2eZSvD","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxnO6auS0yaYgzQgPB4AaABAg.AP0s-NegLT8AP5JNbh0GPE","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgxnO6auS0yaYgzQgPB4AaABAg.AP0s-NegLT8AP5Jhn4gUBI","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgxnO6auS0yaYgzQgPB4AaABAg.AP0s-NegLT8APC17X9JfsA","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytr_UgxJ0_RUc38ucjPngSV4AaABAg.AP0kx0X657sAP3PPzyPb4H","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxJ0_RUc38ucjPngSV4AaABAg.AP0kx0X657sAP3uA_eb6h3","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"disapproval"},
{"id":"ytr_UgxJ0_RUc38ucjPngSV4AaABAg.AP0kx0X657sAP5CQgfz-HU","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxJ0_RUc38ucjPngSV4AaABAg.AP0kx0X657sAP5Ep--JXwi","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
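The raw response above is a JSON array in which each object carries a comment `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the "look up by comment ID" flow, using two records copied verbatim from the response above (the field names come from that payload; nothing else is assumed):

```python
import json

# Two coding records copied verbatim from the raw LLM response above.
raw = """
[
  {"id":"ytr_UgxzZlWOFypP3fXmqbJ4AaABAg.AP19-RtHrAtAP1CTcndVM4","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxnO6auS0yaYgzQgPB4AaABAg.AP0s-NegLT8APC17X9JfsA","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"disapproval"}
]
"""

# Index the batch by comment ID so a single coding can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw)}

record = codings["ytr_UgxzZlWOFypP3fXmqbJ4AaABAg.AP19-RtHrAtAP1CTcndVM4"]
print(record["emotion"])  # fear
```

Indexing by `id` mirrors what the lookup widget does: one dictionary access retrieves the full coded record for any comment in the batch.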