Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse one of the random samples below.
- `ytc_Ugx3Huie7…` — "I think it’s quite possible that AI will destroy itself but I question whether a…"
- `ytc_Ugx86t3tT…` — "So, ChatGPT was pretty based. I really don't know how you take the opposing view…"
- `ytc_UgzyBk64d…` — "hear me out: Manufactured Imperfection, not for the AI but for the human being u…"
- `ytc_Ugxl4XojJ…` — "It is fascinating that Elon has yet to innovate abundance in an era of technolog…"
- `ytc_Ugw3_yTJ0…` — "Not jumping straight into their traditional school work but instead starting the…"
- `ytc_UgwUt3Wwm…` — "Open source stuff is, as the name suggests, open source. that generally means th…"
- `ytr_UgxmXZ-TP…` — "I get where you're coming from! The conversation touches on some complex themes …"
- `rdc_nc41gjc` — "He was 16. Have you met a 16 year old? At 16 they should be driving, socializi…"
Comment
Solution: Don't give them pain. It is unnecessary for their purposes.
So any engineer who gives specific-purpose robots (like a toaster) any form of pain or emotion shall be punished by the law.
As for general purpose AIs, they shall be treated like humans.
As for virtual AIs, it is Okay to add any kind of suffering or emotion as long as the AI has no connection to our universe.
Source: youtube · Video: AI Moral Status · Posted: 2021-03-22T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
{"id":"ytc_UgxsMF7gHjK3qAe0zHx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx57xXCi7G6Tls6YWJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyR5qQOvSQA-NTOowN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugxw8StzeSLoStQR1M94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzoQPjd4S86Yxge-1N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz6Md4b-IXOcksjYRt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyxmys-7RfiOjkkDAZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugyia_exCG29isaYiuJ4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzvKIDAtBjCZfdNnl14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzcwkaOfmNA5P2SnI54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
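The raw response is a JSON array with one record per coded comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion) plus a comment ID. A minimal sketch of how such a batch could be parsed and looked up by ID — `raw_response` and `index_by_id` are illustrative names for this example, not part of the tool:

```python
import json

# A two-record excerpt of the batch output shown above; in practice this
# string would be the model's full raw response.
raw_response = """
[
 {"id":"ytc_UgyR5qQOvSQA-NTOowN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
 {"id":"ytc_Ugz6Md4b-IXOcksjYRt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw batch response and key each coding record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

coded = index_by_id(raw_response)
print(coded["ytc_UgyR5qQOvSQA-NTOowN4AaABAg"]["policy"])  # liability
```

Keying by ID makes it cheap to join a record back to its source comment, which is what the "Look up by comment ID" view above does.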