Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- rdc_ohvjhww: A cheap smart watch can monitor hydration and track how much you've slept. There…
- ytc_Ugwk1T7Ss…: So the 15-year-old had a “minor run-in with police” but he wasn't a criminal? Co…
- ytr_UgwNkpskW…: In some time, high school and junior high students will have this AI thing in th…
- ytc_Ugy0qV3Sh…: Generative AI questions the whole idea of individualism and creativity. It was t…
- ytc_Ugzj3RDht…: The war between AIs / Some time ago, I argued that technological innovation in na…
- ytc_UgxPIuChy…: I'm going to find all of those phrases that trigger the AI, and call them, shout…
- ytr_UgyFLgyPT…: Wrong. I don't pay AI because I don't have to. I don't commission artist because…
- ytr_UgwON4C9I…: "my AI prompts require skills too, you know" / Dawg I write stories for fun, you d…
Comment

> If machines become conscious and display clear emotions such as fear happiness or sadness, are able to have complex thoughts and conversate then of course they should have rights like other sentient beings on our planet. We have animal rights and tbh AI would technically be more advanced than us so it would kinda be reversed in that AI would be wondering whether US HUMANS deserve rights.

youtube · AI Moral Status · 2019-08-02T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyHR7IwStNuKCjgMv54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw1fo522eSKg6YgxgR4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx4AjCI8S1aXjFbUqF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw5qTvuEHKhXT9SzQF4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwVf_lP3KsV8N9bK1t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz6pd-c5KDNpVLkLBN4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxv453EEEqY0mUqfZZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzzjbQH1pLUZWyOSH94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgzPQrj-9T_1DhV03PB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxqeWwxOApAk2012JB4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
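The per-comment coding table shown above can be recovered from a batched JSON response like this one by parsing the array and indexing on the `id` field. A minimal sketch (field names taken from the response above; the function names and the truncated two-record sample are hypothetical, for illustration only):

```python
import json

# A batched coding response in the shape shown above (two records for brevity).
raw_response = """[
  {"id":"ytc_UgyHR7IwStNuKCjgMv54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw1fo522eSKg6YgxgR4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"approval"}
]"""

def index_by_id(response_text):
    """Parse a batched coding response and index its records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

def coding_table(record):
    """Render one record as the Dimension / Value table shown in the UI."""
    lines = ["| Dimension | Value |", "|---|---|"]
    for key, value in record.items():
        if key != "id":  # the ID is the lookup key, not a coded dimension
            lines.append(f"| {key.capitalize()} | {value} |")
    return "\n".join(lines)

by_id = index_by_id(raw_response)
print(coding_table(by_id["ytc_UgyHR7IwStNuKCjgMv54AaABAg"]))
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: one parse per batch, then constant-time lookup per comment.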