Raw LLM Responses
Inspect the exact model output for any coded comment; responses can be looked up by comment ID.
Random samples (previews truncated):

- `ytr_UgwuvxGum…` — "It seems like you're asking about something specific, but I might need a bit mor…"
- `ytc_Ugwc1ba3q…` — "Only reason why company's are pushing for self driving trucks. Because they can'…"
- `ytc_Ugy0X3sl1…` — "Neural Network Researcher here. Nightshade doesnt work. It can work, but you wo…"
- `ytc_UgyvVITcT…` — "What was he doing to get shot twice tho, im not about to jump on the troglodyte …"
- `ytr_UgxWYEf4u…` — "@Mary-o9o12 Sure. But we have automation in 'customer service' systems, and hav…"
- `ytr_Ugws0mrGK…` — "@Blue_Nadesand the crossbow required less, you've completely missed my point. T…"
- `rdc_esr91da` — "Because governments can only use force which is at the expense of freedom. Priva…"
- `ytc_Ugx6eR117…` — "The job market is so bad, I dress up as a robot and speak and robot voice and u…"
Comment
> I love how this video falls into the SAME fucking trap EVERY discussion on this topic falls into. The "Humans are special and different than computers! We have free will and a soul!" No... no you don't. The brain is just a VERY VERY *VERY* complex "AI". When it comes to man-made ideals like "rights", our brains are literally no different than an AI given enough complexity and computational power.
>
> Meanwhile everyone in this comments section keeps talking about how "superior" humans are to robots. Yeah. Take a look at our history, actually *know* something about neurology. Until you have done both those things, shut the hell up.

Platform: youtube
Video: AI Moral Status
Posted: 2017-02-23T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UghUUTPKDH88uXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgieKUMMYUHrjHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggNXuR9Uuu-dHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj0k6LggtSpPngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghIMGtVMpeoXngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UghsHvsEZa7QpHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugh-wxpuQ7IOdngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjx2gfLE92JJXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugi_RzdM3NNBsngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Uggfa3awuUzm_3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
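A raw response like the one above can be parsed and checked programmatically: load the JSON, index the records by comment ID for lookup, and flag any value outside the coding scheme. This is a minimal sketch; the allowed value sets below are inferred only from the values visible on this page, not from a published codebook, and may be incomplete.

```python
import json

# Two records copied from the raw LLM response above (truncated for brevity).
raw = """[
  {"id":"ytc_UghUUTPKDH88uXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghsHvsEZa7QpHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]"""

# Allowed values per dimension, inferred from this page; the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "user", "distributed"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"mixed", "indifference", "approval", "outrage", "fear"},
}

def validate(records):
    """Return (comment id, dimension) pairs whose value falls outside the scheme."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim))
    return errors

records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}  # lookup by comment ID
print(validate(records))  # → []
```

An empty error list means every coded value is in scheme; a non-empty list pinpoints which comment and dimension to re-inspect.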