Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgwzXrqxp…`: I just had to share this with grok especially after I had a roast battle going b…
- `ytr_UgxAZYOwr…`: @duluozah you can own information... that's what copyright law Is and ai violat…
- `ytc_Ugyuj26H7…`: Google doesn’t care about AI ethics because it endangers their bottom line, plai…
- `ytc_Ugysw4Xr7…`: I was told art can be anything an artist wants to do. I'm assuming that also mea…
- `ytc_Ugy_9_RHw…`: 15:24 not going to lie, self driving cars I’ll be happy being able to have where…
- `ytc_UgwwsuOpP…`: Just asked chatgpt why people are so polite while interacting with it, here’s th…
- `ytr_Ugx72UQjh…`: They pretty much are unless they throw out UV light or have Infrared sensors or …
- `ytc_Ugw3a-pEF…`: now the real purpose of Ai is revealed. Install Ai onto everything and then hack…
Comment
I believe everything has a conscious. A pillow has one, but since it has no external senses or "brain" it has the lowest kind of conscious. a robot has an okay level of consciousness. it (probably) has external senses and has a "brain" but not as good as ours. Humans have the highest level of consciousness as we know what mirror does unlike most animals and know about our surroundings, rights and wrongs, and about ourselves. If robots ever achieve our level of consciousness and hurt someone, they have to be punished even if they don't feel pain. They know what they did. unless they have the mind of a toddler.
youtube · AI Moral Status · 2017-03-19T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgxOz32Mqir4mx9Q7ep4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxHnrQw-5aECr2Zvt54AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwr60lU1uhM2pDe5bp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiDEONyjhXpZiAX214AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzSJnuzoFnHlr7OVHB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghQiW0rduVSOXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugh6hVu_9ssjf3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugh91X5m-k7M6XgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UghfBFxixrIHDXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjPKZV0GM_N-HgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}]
```
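The raw response is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such a payload might be parsed and indexed for lookup, assuming the allowed values per dimension are exactly those seen in the samples above (the real codebook may define more categories, so `ALLOWED` here is an inferred, hypothetical schema):

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above -- hypothetical; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "mixed"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response and index records by comment ID.

    Rejects records with unknown dimension values so malformed codes
    never reach the stored coding results.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["emotion"])  # indifference
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each inspected comment maps directly to its stored dimension values.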