Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "What I'd like to know is, once the AI has taken over and humans are extinct, wha…" — ytc_UgwnK0xnD…
- "> an instructor for a coding course and need some bad code for your students …" — rdc_jts1or8
- "I wonder why don't they just up the workloads, and give people insane amount of …" — rdc_ofl38ik
- "Well for one, AI won't go away. At best, it'll be just regulated to big box comp…" — ytc_UgwPx1wZ6…
- "takes one strong swift bad code used to hack AI - programmed to destroy all huma…" — ytc_UghVP7t4I…
- "The idea of it becoming self conscious is nonsense, as was proven by Sir Roger P…" — ytc_UgztE_des…
- "Why need programmers if it's someone asking AI to program, anybody can do this w…" — ytc_UgyoOr4yt…
- "I'm so glad I came across this video, because I was LITTERALLY thinking the exac…" — ytc_Ugyv-Ysr1…
Comment
I don't think that we have to wait until robots demand rights to think about and develop an approach to this potential issue. Because if any kind of robot that may be capable to perceive a lack of rights would most likely be eons over any human in capability to asses a situation and provide himself with a way to deal with it. We wouldn't even be able to see it until it's already over. So if it is possible for a robot to reach that state then we should be prepared to make it our friend or stop developing robots entirely.
Source: youtube · AI Moral Status · 2017-02-24T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UghJbDj8OwiAyngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggHUChjfY0yk3gCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjwB_p5V3e0u3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Uggfx-CZ2cENtHgCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugi_gGDWU3jtRXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugi6hGv2Ze4Q2ngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgikPCAEUdTxB3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiwOiGsCZNGongCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UggX0c_3N_LNNHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggpDuf-OeISQngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
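The raw response above is a plain JSON array, one object per coded comment, so looking up a coding by comment ID reduces to parsing the array and indexing it. A minimal sketch of that lookup is below; the function name `parse_codings`, the dimension list, and the skip-incomplete-entries rule are illustrative assumptions, not part of the actual pipeline. The two sample entries are taken verbatim from the response above.

```python
import json

# Abbreviated copy of the model output shown above: a JSON array where
# each object carries the comment ID plus one value per coding dimension.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugi_gGDWU3jtRXgCoAEC", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UghJbDj8OwiAyngCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

# The four coding dimensions from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse the model output and index codings by comment ID,
    skipping any entry missing the ID or a coding dimension."""
    index = {}
    for entry in json.loads(raw):
        if "id" in entry and all(d in entry for d in DIMENSIONS):
            index[entry["id"]] = {d: entry[d] for d in DIMENSIONS}
    return index

codings = parse_codings(RAW_RESPONSE)
print(codings["ytc_Ugi_gGDWU3jtRXgCoAEC"]["policy"])  # → regulate
```

Indexing by ID makes the "Look up by comment ID" view a dictionary access; validating every entry before indexing guards against the model omitting a dimension.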