Raw LLM Responses
Inspect the exact model output for any coded comment; individual comments can be looked up by comment ID.
Random samples (previews truncated):

- `rdc_ljrs58c` — "This is going to make things even worse. Now the greedy ones that want to use AI…"
- `ytc_Ugw_G3m7O…` — "For someone who is new go drawing and digital art, AI art has made it HARDER for…"
- `ytr_UgwP2rzlU…` — "what a ridiculous notion - where we have lived as humans on earth without AI for…"
- `ytc_Ugw9eYQ6q…` — "I am reading the article you are referring to now, but there is nothing about ho…"
- `ytr_UgyldAHY2…` — "It always was... Claude: The irony is almost too neat to be coincidental. Bein…"
- `ytc_Ugwwl8RJa…` — "Okay chatgpt, are you willing to put your life at line if you actually believe t…"
- `ytc_UgwTUcatI…` — "Why not put a command line in the AI, that shuts it off, when it gets self aware…"
- `ytc_UgwLuiotd…` — "Nuclear warheads melt the ice caps I learned that from Leon Musk. AI is a comp…"
Comment
What would we gain from developing AI consciousness anyway? An operating system as we know right now cannot do something that it wasn't programmed for so it demanding rights is highly unlikely as an AI would only capable of reacting with scripts that has already been applied for it. They wouldn't be able to think independently. Something that would come somewhat close to AI consciousness is teaching an AI loads and loads of different scenarios, sounds, terrains to scan and react accordingly, which would be insanely hard for every common/uncommon situation.
Source: youtube · AI Moral Status · 2018-07-29T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzIYxS08qchmz2JRyd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyoTnivxyrsFdclMCl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyhhLnriEfhjSjg_DV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxyv8eKXVrJw7IX2ix4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLWgCiRUIm5FTiFhJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxaT50AT7V_a4QiQV54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzJlFAjfrm59b6jUlF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9cpps0dPfEvH0y0x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw3LqSFJ3taQrWBivx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgykeDbi2q_d_VIPiDB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
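A raw batch response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example in Python; the allowed category values are inferred only from the samples visible on this page (the full codebook may define more), so `ALLOWED` is an assumption, not the project's actual schema.

```python
import json

# Allowed values per coding dimension -- ASSUMED from the sample output above;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "mixed", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    rejecting any value outside the assumed codebook."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec[dim]!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the batch above, used as a small self-contained input.
raw = '''[
 {"id":"ytc_Ugx9cpps0dPfEvH0y0x4AaABAg","responsibility":"developer",
  "reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''
coded = parse_batch(raw)
print(coded["ytc_Ugx9cpps0dPfEvH0y0x4AaABAg"]["responsibility"])  # developer
```

Indexing by comment ID is what makes the "look up by comment ID" inspection on this page cheap: the dashboard can fetch one coding without rescanning the whole batch.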