Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Self driving mode was made for handicapped People so thats why it stops in a han…" (ytc_UgxLf45bs…)
- "AI digital god. Hm… I don’t know if the artists saw the future or if modern stem…" (ytc_UgwbMwnHC…)
- "I'd much rather have a CCP AI than an AI developed by these tech narcissists.…" (ytc_Ugx5TkJzY…)
- "Yeah, no worries. He knows the full damage. The ASI is gonna do.. we’re at the s…" (ytc_UgxgQlYm-…)
- "Everyone seems to be concerned with short term consequences like mass unemploye…" (ytc_Ugz1OZci4…)
- "Look at how many people have been falling for real misinformation and clickbaiti…" (ytc_Ugz1R6k5Y…)
- "The first few seconds is an ad hominid fallacy. The person's crime has nothing t…" (ytc_Ugy1qK20p…)
- "Mr. Green, it would be very important for the conversation to bring in Bernardo …" (ytc_Ugw_QuKjP…)
Comment

> Two things. 1. Don't make smart enough robots. 2.if you make something conscious and sentient then it deserves to be free and that's that, it's unfair to create something just to make it work if it can be so much more. A conscious robot shouldn't exist because it didn't ask to be made and once it is...well you can't just destroy something with sentience, and it'd be cruel putting it to work because "it was made for that."

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Moral Status |
| Posted | 2017-02-23T22:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugg6uOok2VP5QHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgglKdwIP2tvZ3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgiBj8trrN2T_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghgtcFB4IEzDngCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgiWpbxpfu9p6HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugj2C_TxSi954HgCoAEC","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UghHl86Xngak0XgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgiVMj0Ws70W2HgCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgjRmuxIb5d8XHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggGny5a5uCQDHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
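The lookup-by-comment-ID flow could be sketched as follows: parse the raw batch response, drop any record whose values fall outside the coding vocabulary, and index the rest by `id`. This is a minimal illustration, not the tool's actual implementation; the `ALLOWED` vocabularies are inferred only from the values visible in this sample (the real codebook may define more labels), and `index_response` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may contain additional labels.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "unclear", "user", "company", "developer"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "liability", "unclear", "regulate", "ban"},
    "emotion": {"mixed", "approval", "fear", "indifference", "outrage"},
}

def index_response(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    skipping any record with an out-of-vocabulary value."""
    records = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            records[rec["id"]] = rec
    return records

# Two records copied from the sample response above.
raw = '''[
 {"id":"ytc_UggGny5a5uCQDHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_Ugg6uOok2VP5QHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]'''

coded = index_response(raw)
print(coded["ytc_UggGny5a5uCQDHgCoAEC"]["policy"])  # → ban
```

Validating against a fixed vocabulary before indexing is useful here because LLM batch outputs occasionally drift from the requested label set; silently skipping (or flagging) such records keeps the coded table consistent with the codebook.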