Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I already insist that AI not referee to "itself" in subjective terms before answ…" (`ytc_UgwcvFBIu…`)
- "Had one and gave up when it couldn't recognize my face to start. I have plenty o…" (`rdc_n6snbwm`)
- "Only use i see this actually being useful in is video game AI. tactics, language…" (`ytc_Ugwz1XG_S…`)
- "Could he have had his foot UNDER the brake peddle? The regenerative brakes are…" (`ytr_UgwCS1w9S…`)
- "The AI can keep their economy going and we can go die in a war.…" (`rdc_ncl5vba`)
- "Haha, it does seem like Sophia was really engaged in the conversation! Her respo…" (`ytr_UgxSPQUB3…`)
- "These comments are totally retarded. For those of you too lazy to watch-- Mach…" (`ytc_Ugii0UZzi…`)
- "The title of the video is totally stupid. Ok you want the working class to surv…" (`ytc_UgyKfitdZ…`)
Comment

> Making robots sentient is the stupidest thing we could do, especially since its entirely up to us. Just don’t program them to feel pain and the like.
> We’re already using them for hard labor now. If you make them conscious, we will need to give them rights which would hurt our productivity. That is the sweet spot for AI sophistication. Complex enough to do advanced labor, but not complex enough to be sentient and therefore deserving of rights.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted | 2018-04-25T16:5… |
| Likes | ♥ 245 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzHeP5PuF__paoccSV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzZllfuTlv47zYMWPN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwohNn0l0WX4TmzveF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzC_pylmq5gOFFIunl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxmjLPpZesA_tUGjpl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyorlZZ8G8SoYY8hCB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyQbO8ZCx5Bk7mVcGh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxDqx5ZEsC8233V4bR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxQHHfA9qCXPW0wg-94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzEZ1V5gxmaROZCJoF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
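The raw response is a JSON array of coded comments, one object per comment ID. A minimal sketch of how such a batch could be parsed and indexed to support the "look up by comment ID" view; the dimension vocabularies below are assumptions inferred only from the values visible in this batch, not the full codebook:

```python
import json

# Values observed in this one batch -- assumption: the actual codebook
# may define additional labels for each dimension.
OBSERVED_VALUES = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "mixed", "fear",
                "resignation", "outrage"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into a lookup table keyed by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        rec_id = rec.pop("id")
        # Flag any value outside the observed vocabulary for manual review.
        for dim, value in rec.items():
            if value not in OBSERVED_VALUES.get(dim, set()):
                print(f"unexpected value {value!r} for {dim} in {rec_id}")
        coded[rec_id] = rec
    return coded

# Usage with one record from the response above:
raw = ('[{"id":"ytc_UgyorlZZ8G8SoYY8hCB4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
coded = parse_raw_response(raw)
print(coded["ytc_UgyorlZZ8G8SoYY8hCB4AaABAg"]["emotion"])  # fear
```

Indexing by ID this way also makes it easy to join the model's codes back to the original comment metadata (platform, topic, timestamp) shown in the detail panel.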