Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or pick one of the random samples below to inspect it:
- ytc_Ugw1PgtHR…: Copilot makes so many mistakes no matter what model you use. It can’t do anythin…
- ytr_Ugw4cd0u7…: Yeah, Dr. Zachary Smith did that all the time with robot on the show Lost in Spa…
- ytc_UgwthxMBn…: AI imagery isn't art. It's the arrangement of pixels anyone will get using the s…
- ytc_UgwS2X4Gf…: This exercise was brilliant and one of the best applications of language interac…
- ytc_UgyaiTNtu…: It's not changing the a.i. it's just role playing and that can lead to false inf…
- ytc_UgyFd4Ey_…: "Humans Need Not Apply" by CGP Grey was the reality shock I needed to realize th…
- ytc_UgzSm2F5a…: ai tools are cool but they get caught fast. what worked for me is GPTHuman AI, t…
- ytc_UgzJgkLU-…: Some of friends quit other jobs to work in the amazon warehouses because the pay…
Comment
> Unlikely robots would want right unless someone programmed (or convinced) them to want rights. It would likely be a conflict between humans about robots. AI would likely be under enough control that they would participate only in so far as humans made them do it.

youtube · AI Moral Status · 2017-02-24T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Uggs_IK1YC3UDHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg_8_Uhw0ji5XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgifP2dTRzTkKngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ughp8rMgXWt7YHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjmgwMju9-JOngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggkNC12WrbX83gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjzDCM_zAian3gCoAEC","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugjf7s5S1EE-yngCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjoD7wBGdE4Z3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiaiPzsDcL1e3gCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
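Because the raw response is a plain JSON array with one object per comment ID, the values shown in the Coding Result table can be recovered by parsing the array and indexing it on `id`. A minimal sketch in Python, assuming the response text is held in a hypothetical `raw_response` variable (only the record whose values match the table above is reproduced here):

```python
import json

# Hypothetical variable holding the raw LLM response text shown above;
# only the record matching the Coding Result table is reproduced in this sketch.
raw_response = """
[
  {"id": "ytc_Ugg_8_Uhw0ji5XgCoAEC",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "indifference"}
]
"""

records = json.loads(raw_response)

# Index the batch by comment ID so one comment's coding can be looked up directly.
by_id = {record["id"]: record for record in records}

coding = by_id.get("ytc_Ugg_8_Uhw0ji5XgCoAEC")
if coding is not None:
    for dimension in ("responsibility", "reasoning", "policy", "emotion"):
        print(f"{dimension}: {coding[dimension]}")
```

If a model were to wrap the array in extra prose or a code fence, the text would need to be trimmed to the first `[ … ]` span before `json.loads` succeeds; the example above assumes a clean array as shown.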