Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its comment ID.
Random samples — click to inspect:

- "Disabled artist here. AI art may be easier to create, but that's because you're …" (ytc_UgwZiUNVm…)
- "Obviously it is sad that a life was lost, but with that said some animals do dar…" (ytc_UgyMsJ2H_…)
- "Nah, not going to happen. I use the latest, most cutting edge AI for engineering…" (ytc_Ugx8_rmMu…)
- "I'm a software dev and I'm still puzzled by the claims that AI is outright repla…" (ytc_UgzHDChe7…)
- "The killing part won’t be the AI but the oligarchs when it is time to pull the s…" (rdc_nck5eay)
- "The speaker is like cool more robots means I can’t smoke pot and not do nothing.…" (ytc_UgxiaIXtr…)
- "From what I've heard, the ai bros are saying that some method that I'm pretty su…" (ytc_UgyNBVqPG…)
- "who's gonna put your drink on the robot if you live alone? that means you have t…" (ytc_Ugxedloc_…)
Comment (verbatim, typos preserved — this is the raw text that was coded):

> I see no problem here. I don't care about animals cause they can't think.
> As long as something can learn and do stuff based on its previous experience it's alive and deserves rights.
> But the problem about meaning of rights is interesting. Even if robot can think and learn it probably would not be able to suffer so... Being a slave would be ok for it. Hah.
> And... Unplugging a toster isn't killing. You can easily plugg it in again. Its memory, expirience, minds are completely safe. The problem with humans is our bodies start to rot if they stop operating. As I know, brain is alive only for 2 minutes if the body is dead already.
Source: youtube · Video: AI Moral Status · Posted: 2020-07-28T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxWl51xd66j3p-3hs54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy5ZPH15izuvOSYYYB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"sadness"},
  {"id":"ytc_Ugw2v4yU19slb-zXDqt4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxULQRXtDBc_CnGqNJ4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxIMp0mytvW_5Xnr-V4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxzT2gFdtbIGZVs4xF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyDLUjEeYcUG5ildLh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxV3EVVQP7Ej0nA0Np4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxvv_uYMZbCQIdNG7h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugys1h8J4WxOiIjihHd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
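The lookup-by-comment-ID view above can be reproduced from the raw response alone: parse the JSON array, validate each row against the codebook, and index the rows by `id`. A minimal sketch in Python — the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response above, but the allowed value sets are only those observed in this one batch; the actual codebook may define more categories.

```python
import json

# Two rows copied from the raw LLM response shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgxV3EVVQP7Ej0nA0Np4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxzT2gFdtbIGZVs4xF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

# Value sets observed in this batch only (assumption: the real codebook
# may contain additional categories not seen here).
DIMENSIONS = {
    "responsibility": {"ai_itself", "user", "developer", "company", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "sadness", "approval", "indifference", "mixed", "unclear"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    rejecting any value outside the known category sets."""
    codings = {}
    for row in json.loads(raw):
        for dim, allowed in DIMENSIONS.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim!r} value {row.get(dim)!r}")
        codings[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return codings

by_id = index_codings(RAW_RESPONSE)
print(by_id["ytc_UgxV3EVVQP7Ej0nA0Np4AaABAg"]["emotion"])  # -> indifference
```

Indexing by ID also surfaces duplicate or missing IDs early, which is the usual failure mode when an LLM is asked to code comments in batches.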