Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
simply make AI feel pain for breaking the " three rules", more pain for rule number one than two and three, two more than three...
1. Never kill or hurt a live form (including other AI), nor physical, nor psychological
2. protect all intelligent life (not including other AI)
3. try to stay alive yourself
it shouldn't lead to a revolution that way, just don't give them a "pain button" only used for control
Source: YouTube · Video: AI Moral Status · Posted: 2017-02-23T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgjAIMevKcxrnngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghA9z6zW0bejXgCoAEC","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ughkx0Mum9Cdv3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgiW0mYZuMt7_ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UggEDexH2OK8gngCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjxS0Kmu4JbOXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugj9MK_eJU-tCHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgjxlEoy6_MqTngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgipqO8xcHoXyHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghPudcmY-9ThHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}]
```