Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Why do we need robots? Are we just too lazy to do things for our selves? Robots in factories are just programmed to do a particular job. They are bolted to the floor so don’t represent too much of a threat. The guy in the hat is almost the archetype of the crazy scientist. Perhaps this is deliberate, but I think it failed. Judging by the reaction to the top comment most people are not at all keen on the idea of having lots of robots around. If we want to benefit from AI, why do we have to have them look like a human and have the ability to walk around. What is really terrifying is the prospect of military robots. Apparently these have already been developed. If the are given autonomy to decide why to kill, we (humanity) are in big trouble. I hope humanity will reject this technology.
Source: youtube · AI Moral Status · 2023-02-27T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzFWqB4f5gRaYAPm3p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx5DrV3IGcOq6uE0wh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxflvUW9y-vTCjv0xx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwgY9WsEXpK8k2bceF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyAv7YKaDo-Xv8bqWh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzlMTNITbgksHdl_Xl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzQIeN_wQ1dr2YdHzp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzsLvLoV3SLtidN8_x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxnJSbYJSstKLGn9Kd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxhvkLhvqCdM1JM0v14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
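A raw response like the one above can be turned into coded records with a few lines of Python. The sketch below is a minimal example, not the pipeline's actual code; the allowed value sets are assumptions inferred only from the values observed in this batch (the full codebook may define more).

```python
import json

# Value sets observed in this batch's output.
# Assumption: illustrative, not the exhaustive codebook.
RESPONSIBILITY = {"developer", "ai_itself", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "virtue", "unclear"}
POLICY = {"liability", "ban", "regulate", "none", "unclear"}
EMOTION = {"fear", "indifference", "outrage", "approval", "mixed"}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record's fields."""
    records = json.loads(raw)
    for rec in records:
        # Every record must carry an ID and one value per coding dimension.
        assert rec["id"], rec
        assert rec["responsibility"] in RESPONSIBILITY, rec
        assert rec["reasoning"] in REASONING, rec
        assert rec["policy"] in POLICY, rec
        assert rec["emotion"] in EMOTION, rec
    return records

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
coded = parse_coding_response(raw)
print(coded[0]["reasoning"])  # deontological
```

Validating at parse time keeps malformed or off-codebook model output from silently entering the coded dataset.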