Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Its so scary with AI, i had seen something like ai audio, someone was using AI a…
ytc_UgxMToVHQ…
It's Apophenia .
In a sense by nature all humans are slightly schizophrenic , li…
ytc_UgxN8IUuD…
Three and a half years later, and Enrico Coeira’s discussion on this topic is on…
ytc_UgwvlLRYC…
Same as Instagram explore. The more I report “not interested” the more content s…
ytr_UgyvjD9EY…
As im educated in both ComSci and visual and music art. I support this, i don't …
ytc_Ugzrkhybb…
No one wonts any agency to form the future. This is totally brainwash. People wo…
ytc_UgxVuarwF…
You're right that current AI has real limitations - it makes mistakes and genera…
ytr_Ugx8ne7XJ…
shad xqc and asmongold have never made anything of any value which is why they l…
ytc_UgyaCPIHk…
Comment
It will be easier than with animals... we created robots - we dominate them. Even if AI gains self-conciousness.
The world is not "perfect kind place". There is no true justice outside of "human space". We just force to reproduce and then butcher and devour animals because we randomly evolved to be required to eat animal meat and wear their skin in order to stay alive and function properly. This will change never while we are bound to your mortal fragile bodies with electro-chemical "wet" consiousness.
You kill animal, eat it, feel pleasure and have no harm for your health? - You are doing it right.
You deny robots becoming higher level kind species, feel great and have no harm for comfort of your life? - You are doing it right.
youtube
AI Moral Status
2017-02-24T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugixaj93h0Q5xXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UghRdXH0RQOp8HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugh-SjVAq9zdx3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UggZCcJUMZuFnXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugg4gAkvZdg7z3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj9wGJPXC_hu3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggIZ1W19SNryngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgicOMwNotsRh3gCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghBXLlsrBabe3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggXJxSl99YodXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
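The raw LLM response is a JSON array with one object per comment, keyed by comment ID. A minimal sketch of how such a response can be parsed and indexed to produce the per-comment lookup shown in the Coding Result table (Python is assumed here; the tool's actual implementation language is not stated, and the excerpted records are copied from the response above):

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array of
# per-comment codes across the four coding dimensions.
raw_response = """
[
  {"id": "ytc_Ugj9wGJPXC_hu3gCoAEC", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UggXJxSl99YodXgCoAEC", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Index the codes by comment ID so a single coded comment can be
# looked up directly, as in the "Look up by comment ID" view.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_Ugj9wGJPXC_hu3gCoAEC"]
print(code["responsibility"], code["emotion"])  # distributed resignation
```

The first record corresponds to the Coding Result table above: responsibility "distributed", reasoning "consequentialist", policy "none", emotion "resignation".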