Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by its comment ID.
Random samples:

- "This was what we saw in the bay area in the 90s. Salute to the Pirates of Silico…" (ytc_UgxRBAWY1…)
- "not long ago i bought a bass. im really good at it but i still need practice, bu…" (ytc_UgzBhsXOU…)
- "The goal is to generate so much confusion that when a news not liked by the peop…" (ytc_Ugzotul2i…)
- "In this short exchange I can see our entire world changing. I’m old, I’m not sur…" (ytc_UgxBqqULz…)
- "Charlie's just upset that AI will enable people to make better content than him,…" (ytc_Ugw2LxbcF…)
- "Plot twist: humans actually began as A.I. before we were given bodies (or made o…" (ytc_UgxwF42fE…)
- "a serious problem with that is most people drive at the same times. 7-9am and 4-…" (rdc_cymm2ig)
- "Loud explosions near parliament now on Institutska St. Clashes beginning, accord…" (rdc_cfkycz1)
Comment (youtube, AI Moral Status, 2019-08-12T03:0…):

> If you think about it. In the beginning we knew almost nothing at all. Then we learned. We figured out that we exist. Now every time we make a choice its because of what le learned. AI works the same way. If its advanced enough I definitely think it could be considered alive. Even if it isnt organic it can still make decisions. It may be made out of different stuff then we are but so are alot of animals and it does not make them any less alive. If they are free thinking then absolutely they deserve rights because they would be alive.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwPxg6X6eYcSf3MNVl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw_Do7TEPRmKjfC_IJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwu475XbyBNRR7DQzZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzei0lpJr7UXRTL8nx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxcTSss5Zr3t6EVoz94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyOwpvMLnjdaWRhlVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgztoeP-SEHVnbKHfWh4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyvkIO6mc0x5c0jxsp4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzTJh8Z_amz7izsx8t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzj6R1otD4Ujgt1hMd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
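A raw response like the one above is a JSON array of per-comment code records, so lookup by comment ID reduces to parsing the array and indexing it on the `id` field. The sketch below illustrates this; the helper name and the shortened example IDs are hypothetical, but the record schema (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) matches the dump above.

```python
import json

# Illustrative raw LLM response: same schema as the dump above,
# with shortened, made-up IDs for brevity.
raw = '''
[
  {"id": "ytc_abc", "responsibility": "unclear", "reasoning": "virtue",
   "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_def", "responsibility": "developer", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]
'''

def index_by_id(raw_json: str) -> dict:
    """Parse a raw response and index the coded records by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw_json)}

codes = index_by_id(raw)
print(codes["ytc_abc"]["reasoning"])  # -> virtue
```

With the records keyed by ID, any coded comment's dimensions (as shown in the Coding Result table) can be pulled directly from the dictionary.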