Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Conscious robots deserve rights because they are programmed to feel things like people are. When people feel pain, it triggers the brain to react accordingly. This is the exact same thing with robots, they have feelings, we just can't understand robot emotions the way we understand human emotions.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted | 2017-02-24T16:2… |
| Likes | 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UggJIup0iIlZVXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugiqorz5t1QhRHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UghZ5Le5QNo9W3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj2YPylz7gmH3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiIQ5CNwZV0VXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggW5A_hvTuZv3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugj0GWYELnqn_HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_Ugi37YvVMkNA3ngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjFDOQXOgm_-HgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjVqIuTCm8kfngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
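A batch response of this shape can be parsed and indexed by comment ID for lookup. The sketch below is illustrative, not the tool's actual implementation; `index_by_id` and `EXPECTED_KEYS` are hypothetical names, and the sample records are copied verbatim from the response above.

```python
import json

# Two records copied from the raw LLM response shown above (shortened for brevity).
raw_response = '''
[
 {"id":"ytc_UggJIup0iIlZVXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgjVqIuTCm8kfngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
'''

# The four coding dimensions plus the comment ID, as seen in the response.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index records by comment ID,
    skipping any record that is missing a coding dimension."""
    records = json.loads(raw)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

codings = index_by_id(raw_response)
print(codings["ytc_UgjVqIuTCm8kfngCoAEC"]["policy"])  # regulate
```

Indexing by ID makes the "look up by comment ID" step a constant-time dictionary access, and the key check drops malformed records instead of failing mid-batch.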