Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
If given the option to reprogram yourself, would you make yourself feel nothing towards killing another person? no, why would you want to? if we gave the robot three rules that it would never feel justified to break. those being don't kill, don't reprogram your morals, copy these morals on to every new version. would you think that the next generation or the generation after would become killers? No, so my question for you is, if we made them not want to make themselves feel, would you assume that they would make themselves want to feel? no, therefore we can avoid the need to future toasters rights by simply making them not feel want towards having those rights.
youtube · AI Moral Status · 2017-04-01T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UggdK4-kj4fZ8XgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggKRY8uFcIsqXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgjnW1J3ViyfrngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiC1N3DmtnvpHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggNuoc-I2DP_XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggOCJjsINSZxXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjWrIZdXZ2JAngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjBTO2oKelJlngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggYU4Qkt_4CP3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugje-B8BgTNcmHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]