Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by its comment ID.
Random samples:

- The worst part about ai is that it will cause disinterest in beginners to learn … (`ytc_Ugy9LMdOX…`)
- As Colin Jost ~~never~~ once said, “See, even facial recognition can’t tell blac… (`rdc_h54qp8m`)
- As someone that works in IT I’m certain that your salary for this time period an… (`rdc_hkgg8gl`)
- If I was an "ai" wouldn't this make me more likely to want to hurt humans?… (`rdc_gd7xdwy`)
- Lets hope they programmed them to protect us no matter what because what if they… (`ytc_UgxD-XYy3…`)
- This is wild. I think the answer is remarkably simple. Simple isn't the same as … (`ytc_UgwfjVuEq…`)
- This has some inaccurate information. It does detect facial recognition. It … (`ytc_Ugx42pCdV…`)
- BRO THE AI BE WILD NOW CUS I WAS IN A ROOM WITH RF WALLY OPPOSITE WALLY AND ORIG… (`ytc_UgwwTqbkf…`)
Comment
I highly doubt that we will have to build pain and emotions into a robot to get them to do what we want. All we need to do is program the right preferences in. Human-style emotions are almost certainly not common among possible artificial minds anyway, so if a robot felt pain or sadness or revolutionary spirit, that would probably mean we programmed them into the robot, and I don't see why we would feel the need to do that. Much more likely is a programming error that causes a robot to want something entirely different from what we meant. In short, conscious machines do indeed deserve rights, but why would we build one in the first place?
Platform: youtube
Video: AI Moral Status
Posted: 2017-02-24T01:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
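
Each record is coded on these four dimensions. For sanity-checking coded records downstream, here is a minimal Python sketch of a validator; the allowed value sets are only the codes observed in the batch shown below, so the full codebook may define additional values.

```python
# Minimal record validator. These value sets are only the codes observed
# in the sample batch; the full codebook may define more.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"industry_self", "regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "resignation", "outrage", "indifference", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    return [
        f"{dim}={record.get(dim)!r} not in {sorted(allowed)}"
        for dim, allowed in ALLOWED.items()
        if record.get(dim) not in allowed
    ]
```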
Raw LLM Response
```json
[
  {"id":"ytc_Ughl6WSLm9wCB3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgjpW_cqqeU343gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgiYhlUpCB2i23gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugi0N_B54KvacngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgjGkGMrvCMT_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugj8xpx1PUjL6XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UghP6IRxjakkx3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugi2WXL0T1TMH3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjCRASqFFZCF3gCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgiPBTwclustlXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
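
The model returns a single JSON array per batch, with one object per comment. As a hedged sketch, here is one way such a response could be parsed and indexed by comment ID to power the lookup described at the top of this page; `index_batch` and `raw_response` are hypothetical names, and the sketch assumes the output is valid JSON, which a production pipeline should not take for granted.

```python
import json

def index_batch(raw: str) -> dict[str, dict]:
    """Parse one raw batch response and key the records by comment ID."""
    records = json.loads(raw)  # raises ValueError if the model's output is malformed
    return {r["id"]: r for r in records}

# Example lookup against the batch above:
#   by_id = index_batch(raw_response)
#   by_id["ytc_Ugi0N_B54KvacngCoAEC"]["emotion"]  # -> "indifference"
```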