Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
- The Art community is kinda dumb, by memeing on the piece trying to invalidate th… (ytc_UgwMt9c-7…)
- Speaking as a driving instructor who clocks in 1200km per week, I can't wait unt… (ytc_UgxmYISUB…)
- I believe major companies will try to push AI "movies" or "Shows" once or twice … (ytc_UgxNbxp4V…)
- Most LLM AI's now give disclaimers that they aren't lawyers, doctor, etc, if you… (ytr_Ugw3RUB16…)
- Destroy, destroy destroy. I don't think it'll be that easy to destroy all humans… (ytc_Ugx8APfoG…)
- The idea is that AI is a tool that allows you to do something, "just like a whee… (ytc_UgxW7zn_8…)
- so the way the mechahitler incident is covered in this video is really, really i… (ytc_UgzMPPR-7…)
- ⸻ This was both unsettling and important. The fact that people are developing … (ytc_UgyZ7L5cF…)
Comment
Unplugging it wouldn't be murder any more than knocking someone unconscious would be, because when you unplug something or turn it off, you can still plug it back in or turn it back on, and it'll continue to function just fine. Murder would be destroying an object to the point of it no longer being salvageable-- in other words, a robot can only be considered "dead" if it is incapable of any sort of function at all, and there is literally nothing you can do to fix it.
youtube · AI Moral Status · 2017-10-03T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzPEGu4HGHNUfKVL5p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXDHgqGs3BAdW7QV94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxWM3z1SFDAhvfggJx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzRnz8y6arWUxgk3pV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyRqu7OqGqzkVLvCP14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwAL9THQl5YGNvKej94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgwxHLorRKIR9x98dfV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz3SDO2ms_3YSL_DbJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzOkWxCifF6GgfExbt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwzWE98yeVe5AJptm54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
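A raw response like the one above can be turned into the per-comment coding shown in the table by parsing the JSON array and indexing it by comment ID. The sketch below does exactly that; note that the allowed value sets in `SCHEMA` are inferred from the samples on this page, not a documented schema, so treat them as an assumption.

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# This controlled vocabulary is an assumption, not a documented schema.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "government"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "industry_self", "liability"},
    "emotion": {"approval", "indifference", "fear", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a lookup table keyed by comment ID, validating every dimension."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-row batch for illustration (not a real comment ID).
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"deontological",'
       '"policy":"none","emotion":"indifference"}]')
print(parse_batch(raw)["ytc_x"]["reasoning"])  # deontological
```

Keying by ID is what makes the "look up a comment by its ID" view possible: a malformed or out-of-vocabulary value fails loudly at parse time instead of silently entering the coded dataset.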