Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Ethical responsibility hinges on the idea of suffering. If AI comes to a point where it suffers, or we have a good reason to believe it suffers, then we will have ethical responsibility toward that AI. Does that equate to "humans rights"? I'm not sure. But it means that we have a responsibility to not treat it like shit.
Source: youtube · Video: AI Moral Status · Posted: 2017-02-24T01:2… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgjntJCtmsi_wHgCoAEC.8PLObkpQrKn8PLZVGyKF5j","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgihLx0jf6a3WXgCoAEC.8PLNjdgEASS8PLYIUu16Gc","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgihLx0jf6a3WXgCoAEC.8PLNjdgEASS8PLcMpgbgnK","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghCaM6d7GAEr3gCoAEC.8PLN7YbI0QS8PLhYxG5hFS","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UggLshsEzXkadHgCoAEC.8PLMgF14Tpo8PLSzKcKCE_","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UghOBOAmhtRLmngCoAEC.8PLMcGRhI7t8PLRQ7AgxGX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UghOBOAmhtRLmngCoAEC.8PLMcGRhI7t8PLUo1NKx7h","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UggnjeyPzPMAnHgCoAEC.8PLLgi-1mHJ8PLc7OrvKsT","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UggWtTsvmDUhMHgCoAEC.8PLLSmpTGIb8PLMBFIeFTO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgiORVzs3ZYA-XgCoAEC.8PLKxT91UgG8PLQaTYn5Ai","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
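A response in this shape can be turned back into per-comment codings by indexing on the `id` field. The sketch below is a minimal, hypothetical example (the `raw` payload and IDs are made up for illustration); it assumes only that the model returns a JSON array of objects with the field names shown above.

```python
import json

# Hypothetical raw batch response in the same shape as the one above.
raw = """[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_example2", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]"""

# Index each coding by its comment ID so a single comment's
# dimensions (as shown in the "Coding Result" table) can be looked up.
codings = {row["id"]: row for row in json.loads(raw)}

print(codings["ytr_example1"]["reasoning"])  # consequentialist
print(codings["ytr_example2"]["emotion"])    # fear
```

In practice the parse step would also want to validate that every dimension takes one of the expected values (e.g. `responsibility` in `none` / `developer` / `ai_itself`), since the model output is not guaranteed to be well-formed.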