Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples

| Comment ID | Preview |
|---|---|
| ytc_Ugz0aouEa… | AI it is now the future,but like you say high unemployment in young people but a… |
| ytc_UgzVqv7Ep… | But the AI user can notice the baby crying in the background which AI won't even… |
| ytc_Ugy05cByp… | Please explain the massive data systems that run AI. I do not understand how man… |
| ytr_Ugyud0ryN… | Are you really that naive to think that a conscious being like an embodied AI w… |
| ytc_UgytTWJsA… | It is supposed to be a robot pretending to be a real girl But it is a real girl… |
| ytr_UgzwGkjOT… | She is smarter than you definitely and she does know what he is talking about. Y… |
| ytc_Ugwd4UxMz… | that is a good thing. it is like when the women could work too cutting the workf… |
| ytr_UgyQKsNRB… | And there are many videos on YouTube showing AI confessing that there's AT LEAST… |
Comment
So, most things that talk about AI personhood present turning the machine off or unplugging it as equivalent to murder. But, assuming the hard drive isn't wiped in the process, surely it's more analogous to a medically induced coma? After all, you can bring them out of that state and they'll be just fine once they've finished rebooting.
Don't get me wrong, it's still very much not cool to do to someone without their consent (with the *possible* exception of a life-or-death situation, and even that's subject to a lot of debate), but not quite as irrevocable as murder. Murder would be taking them to an e-recycler to have their hard drives wiped before being torn apart for scrap.
| Source | Title | Timestamp |
|---|---|---|
| youtube | AI Moral Status | 2020-08-28T18:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxMCKqDtgnNfcH5bhN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwYR9O-VGQOcl-5i-p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxUNUaUyRx3aIlKu1V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy7rZKvgKFGyggcJCl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxqx8su6wktyNKm1Ad4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy-n-E8LayJ88lzLLZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx8d1lOlGg65sIDVuh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxvpBSM5VZTNVScXLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxq0LqphmBYNKFlsDd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgygZF5E3ttKUqK02ul4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
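The raw response is a JSON array with one object per comment, keyed by `id`, which is what makes the lookup-by-comment-ID view possible. A minimal sketch of that lookup (the field names come from the response above; the parsing and indexing code itself is illustrative, not the dashboard's actual implementation):

```python
import json

# Two rows excerpted verbatim from the raw LLM response above.
raw_response = '''[
  {"id":"ytc_UgxMCKqDtgnNfcH5bhN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugxqx8su6wktyNKm1Ad4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]'''

# Index the coded comments by their comment ID for direct lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the dimensions coded for one comment.
row = codes["ytc_Ugxqx8su6wktyNKm1Ad4AaABAg"]
print(row["responsibility"], row["reasoning"])  # user deontological
```

Note that the second row matches the Coding Result table shown above (responsibility `user`, reasoning `deontological`, policy `unclear`, emotion `mixed`), which is how a coded comment's table view is reconstructed from the raw output.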