Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "A lot of the disagreement here comes from collapsing very different questions in…" (ytc_Ugz1Gf2yG…)
- "Everyone has their opinion on this... Mine is this. Ai is here to stay and it wi…" (ytc_Ugzqb9lt3…)
- "Yet. Give it a few years. AI is advancing by leaps and bounds. It is the futu…" (ytr_UgyFbWSan…)
- "The robot is about to burn out her circuits. It's frying in that hot seat.. 😂🔥🔥🔥…" (ytc_UgxG1iJp2…)
- "Fun fact. I don't consider what your poisoning to be real AI. that a large langu…" (ytc_Ugy6SAGcC…)
- "plagerisim on who? copyright needs a claim not just a blanket argument. if someo…" (ytr_Ugw1yMG44…)
- "Scientists developed the Atomic bomb. Then warned against it. Now AI develope…" (ytc_Ugyoqshcy…)
- "He probably tricked the AI into getting that info and plus he could have easily …" (ytr_Ugzu1Hy7x…)
Comment
> to those idiots who are thinking of programming emotions into a toaster: "DONT!"
> Emotions evolved to help humans cooperate, care for themselves and force them to do stuff they are not smart enough to figure out themselves. Let's face it, if we felt no pain there would be people who dismember themselves. Since the robots would not need emotions to do self-preserve, cooperate and avoid danger we can safely tell the AI to not to program any emotion into another AI.

Source: youtube · AI Moral Status · 2017-02-23T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UggnesQFnwQcjXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugh3O8gYOmh4IngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugg_qn7yjcSiXngCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugh11W__arA2kngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgiQjjWj_b99DHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ughveb6eGDShtHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugjpwhkj423haHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgilPTg4mFW0BHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugjvd0kteprn43gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgiUhbriGLkzRHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"}]
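The raw response above is a JSON array with one object of dimension codes per comment, so a "look up by comment ID" view can be built by parsing the array and indexing it by `id`. A minimal sketch in Python (the field names mirror the response above; the variable names and the single-row sample are illustrative, not the tool's actual code):

```python
import json

# Hypothetical one-row sample shaped like the raw LLM response above:
# a JSON array of per-comment coding objects.
raw_response = """[
  {"id": "ytc_UgiUhbriGLkzRHgCoAEC", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "approval"}
]"""

# Parse once, then index by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up a coded comment by its ID.
print(codes_by_id["ytc_UgiUhbriGLkzRHgCoAEC"]["emotion"])  # approval
```

Note that model output is not guaranteed to be valid JSON (the response above originally ended with a stray `)` instead of `]`), so in practice the `json.loads` call should sit inside a `try`/`except json.JSONDecodeError` that flags the comment batch for re-coding.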