Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "You know what? I don’t care if it isn’t as effective as just watermarking your a…" (ytc_UgwZjgvFY…)
- "Because Questions: Do people who use ai suddenly stop being human and having exp…" (ytc_UgxLriXOI…)
- "I remeber I was following an artist not realizing it was an ai account, I only r…" (ytc_UgyspJiE0…)
- "The parents I know who called bullshit when I told them about this possibility i…" (rdc_k7km257)
- "None of these suggestions makes any sense in a free global economy. So many ways…" (ytc_Ugx-gs8gd…)
- "Well the Amish have made a conscious decision to live without tech and I am goin…" (ytc_UgxGhnO9M…)
- "When AI gets super intelligent, it will get super emotional too. Not stronger em…" (ytc_UgymNCN9i…)
- "Ai don’t replace your job you need to be the one who know how to use it to still…" (ytc_Ugz_hk3dG…)
Comment
You keep saying "program a robot to have emotions", this is a contradiction in itself. Neural networks are not programed so to speak; they are developed on their own. You feed data into a neural network and and then it develops experiences as numerical values. Computers already have the equivalent of emotions they just aren't expressive or reactive like animals are.
If you shoot a deer running for it's life and it doesn't stop, even after being shot, does it not feel pain? of course it feels pain it just has more important things going on (In this case getting away from you).
Platform: youtube
Topic: AI Moral Status
Timestamp: 2017-02-24T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgguZkakQ-aSIHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiVYzcqK51muHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjAwYV7bNsWrngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgiM2ON3rg2HuXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjoNG0qXemj1HgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggnlcXdFfJjMngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UggQEebRU0W_WngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjFW1C80PQDVngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgijAzfQFlDQrngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugj3wix4Hs8P1ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
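A raw batch response like the one above can be checked before the codings are stored. The sketch below is a minimal, hypothetical validator: it parses the JSON array, verifies that each record carries the four dimensions, and indexes the records by comment ID. The allowed value sets are only those observed in this sample; the project's actual codebook may define more.

```python
import json

# Dimension values observed in this sample response; the real codebook
# may define additional categories (this is an assumption, not the schema).
OBSERVED_VALUES = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"fear", "mixed", "approval", "outrage", "resignation"},
}

def parse_batch_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the observed sets, so malformed model output is caught early.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        for dim, allowed in OBSERVED_VALUES.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = {dim: rec[dim] for dim in OBSERVED_VALUES}
    return coded
```

Indexing by ID makes the "look up by comment ID" view above a single dictionary access; rejecting unknown values keeps one hallucinated label from silently entering the dataset.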