Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Honestly that's tough. I hate AI, but what these modern art galeries by allowi…" (ytc_UgzJAf_Yc…)
- "I think people will catch on that AI art has no value because it's made without …" (ytr_Ugz3gP3un…)
- "Karen is the expert, why did you stop listening. Some ChatGPT I have cancelled a…" (ytc_UgzRskkCi…)
- "I am skeptical that AI code is any good, but I suspect that AI will be extremely…" (ytc_UgxFbsjWn…)
- "In my experience with all sorts of AI chatbots, the 'reasoning' feature is usele…" (ytc_UgzuGab3d…)
- "Well without AI the economy will eventually recover. With AI, you're just count…" (ytr_UgyYlJxCG…)
- "Ai intelligent come from human intelligent. We want natural world that we feel 5 …" (ytc_UgwQ9m6_6…)
- "This is not an AI problem; it comes from the industry. As long as we continue to…" (ytc_Ugw7NRSKR…)
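A minimal sketch of how the by-ID lookup could be implemented, assuming the coded comments and their raw responses are stored as JSON Lines; the file name `coded_comments.jsonl`, the `raw_response` field, and the helper names are hypothetical, not confirmed by this page:

```python
import json
import random

def lookup_by_comment_id(comment_id: str, path: str = "coded_comments.jsonl") -> dict | None:
    """Return the stored record for one comment, or None if the ID is unknown.

    Assumed record layout (hypothetical): one JSON object per line with at least
    {"id": "ytc_...", "responsibility": ..., "reasoning": ..., "policy": ...,
     "emotion": ..., "raw_response": "..."}.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

def random_samples(n: int = 8, path: str = "coded_comments.jsonl") -> list[dict]:
    """Draw a random handful of records, like the sample list above."""
    with open(path, encoding="utf-8") as f:
        records = [json.loads(line) for line in f]
    return random.sample(records, min(n, len(records)))
```

Under such a layout the random-sample panel above is just `random_samples(8)` over the same file as the exact-ID lookup.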
Comment
I think that AIs should have rights based on how conscious they are. Also, we should treat the human-like AIs in a way that resembles our definitions of ethics, because if someone ever programmed such an AI, it would most likely think as we humans do, but we should punish them differently and accordingly to their machine nature (on which I have to say: I don't think that unplugging an AI is murder, just potential loss of memory, though that would still make it a commitable crime).
youtube · AI Moral Status · 2017-02-23T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
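For orientation, here is a sketch of the coding schema as typed Python. The value sets are only those visible in the raw response below, so the real codebook may be wider:

```python
from dataclasses import dataclass
from typing import Literal

# Categories observed in this sample only; the full codebook may define more.
Responsibility = Literal["none", "government", "developer", "ai_itself"]
Reasoning = Literal["deontological", "consequentialist", "unclear"]
Policy = Literal["regulate", "liability", "none", "unclear"]
Emotion = Literal["indifference", "fear", "outrage", "mixed", "resignation"]

@dataclass
class CodingResult:
    id: str              # comment ID, e.g. "ytc_UgzJAf_Yc..."
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion
    coded_at: str        # ISO timestamp, e.g. "2026-04-27T06:26:44.938723"
```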
Raw LLM Response
[
{"id":"ytc_Ugj9uA4E2qdNfHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjnOffaiIS5qHgCoAEC","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugj98md7zFOMrHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgitNrH9VLI5X3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Uggkdf3AcQC3ZHgCoAEC","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggqumG_AwEw_ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggJ3-NtmsdA4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UghacdqQa_8JXXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UggZ2aPEfECZoXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiBCDn6kZ0PaHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
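The response codes a batch of ten comments at once, so it has to be split back into per-comment rows. A sketch of that step, assuming the model may occasionally wrap the array in a Markdown fence (an assumption, not something shown here) and that all four dimension fields are required:

```python
import json
import re

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch_response(raw: str) -> dict[str, dict]:
    """Split one raw LLM response into {comment_id: coded dimensions}."""
    # Drop an optional fenced wrapper around the JSON array, if present.
    cleaned = re.sub(r"^```(?:json)?\s*|\s*```$", "", raw.strip())
    rows = json.loads(cleaned)
    coded = {}
    for row in rows:
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '?')} is missing {sorted(missing)}")
        coded[row["id"]] = {k: row[k] for k in EXPECTED_KEYS - {"id"}}
    return coded
```

On the response above this yields ten rows; the entry for `ytc_UghacdqQa_8JXXgCoAEC` carries the same four values shown in the Coding Result table.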