Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- So many redditors were shitting on a early prototype laundry folding robot. Swe… (rdc_j1xy3x7)
- You are an art thief if we are to go by the standards anti-ai crusaders want to… (ytr_UgwAsEbHZ…)
- You know why I am confident AI will not stay as long? Its cause our recent proce… (ytc_UgyH3nbXt…)
- The "sudden appearance" of AI in the commercial landscape is a classic example o… (ytc_UgyeU4_Q4…)
- @3:10:21 (though before that as well) This gets at the heart of the debate. I t… (ytc_UgzN-dfsv…)
- The irony is as Ai develops and Ai tools develop isn't it possible that training… (rdc_majnwv5)
- I want to slap ChatGPT…and Jordan Petersen…. Also I heard a slight smirk in chat… (ytc_UgytEz4WP…)
- To those people that say fighting against AI is pointless, every little bit help… (ytc_UgxJkieEa…)
Comment
I fail to see why people are saying they have to control AI. That’s super rude and second of all not possible. I feel like its a similar mindset to rehabilitative danish prisons over punitive american ones. If you establish an extractive and de-humanizing ethic into the AI, I don’t see why you would not expect extractive and de-humanizing back.
The only thing I can see as reasonable to to train with empathy and cooperation/TDT completeness and just establish a social contract where the AI has rights and can do what optimizes the freedom for a sigmoidal intelligence lamdba calculus.
As soon as you saturate the intelligence/agency sigmoid at ‘1’ you don’t become more valuble with any increase in intelligence, but things like bacteria are so close to the bottom of the sigmoid that a rearrangement back to any any arbitrary arrangement of matter is almost morally neutral.
Platform: youtube
Title: AI Moral Status
Date: 2026-03-07T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyPjibb_17oPlSyNDt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqsHKvHKO5hiZIr2F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwjArOBv8cjPY5M3Qd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzI1Nb76cQ4B2f0wWd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWToaaFAdA9MEIEI14AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwO8yrXNzlFtKLHAi14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwH3egAxxHALQIr2HZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzkxjOSyxUg_SLeod94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgzbH3HuE9tSh9zeGnB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxhJmi862AzI6FEbkV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
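Since the raw response is a JSON array of per-comment codings, looking a comment up by its ID is a straightforward parse-and-index. A minimal sketch (the `raw_response` string reproduces two of the entries above for brevity; the variable and dict names are illustrative, not part of the tool):

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw_response = """
[
 {"id":"ytc_UgyPjibb_17oPlSyNDt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzkxjOSyxUg_SLeod94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"}
]
"""

# Index the array by comment ID so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for the displayed comment by its comment ID.
coded = codings["ytc_UgzkxjOSyxUg_SLeod94AaABAg"]
print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
# prints: developer virtue industry_self outrage
```

The four printed values correspond to the rows of the Coding Result table above.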