Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "So tech bro's replace humans with AI robots because its cheaper.. The obvious qu…" (ytc_UgwK5QjXt…)
- "Exxon is the worst of the worst. They reject climate science and hire PR to spre…" (rdc_d0fdd86)
- "Since AI is likely reading this as I type, I hope it understands that I welcome …" (ytc_Ugx25MEcG…)
- "I wonder whether p-doom, the probability that AI will somehow end humanity, is s…" (ytc_UgxQp-Ck-…)
- "Just make the AI *arrive* at the conclusion. And then display that conclusion in…" (rdc_gd7bez8)
- "@FireFalcon0art is not about the end product, ai crap could be insanely good and…" (ytr_Ugw9Hrv00…)
- "Lets assumed ai chip is not embedded in the human brain in 100 years from now. A…" (ytc_UgyYzYiRH…)
- "I genuinely want to see someone hack the police ai program and make the program …" (ytc_Ugy3AjEg5…)
Comment

> All those involved in the realm of creating different systems for A.I. are now suddenly very concerned? Where was ANY forethought while developing such? As if they're actually surprised this wouldn't happen? It's a mockery. The genie is out of the bottle…and he won't willingly go back in.

youtube · AI Moral Status · 2025-08-25T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyOHEvjahHIEdYR07F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxEcr3ybqxgZRbNbHF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyVaO2MxcnJ4pzinn14AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxkVst3QxbMKRhKqXt4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwSiZfA7vLUeSKoxkh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxRvZkSCZiEACwxwCZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxJVnVSOo5iUOJQREJ4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyvPsegSytXN4d1rnl4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzg5Hdxje-Tx7vPyLZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxnD7MNqaHC4j6SGoB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
```
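Because the raw response is a JSON array with one object per comment, looking up a coded comment by its ID reduces to parsing the array and indexing on the `id` field. A minimal sketch (the function name and the two-row sample are illustrative, not part of the tool; the field names match the response above):

```python
import json

# Illustrative two-row excerpt of a raw LLM response: a JSON array of
# coded comments, one object per comment, keyed by the "id" field.
raw_response = '''
[{"id": "ytc_UgyOHEvjahHIEdYR07F4AaABAg",
  "responsibility": "ai_itself", "reasoning": "consequentialist",
  "policy": "none", "emotion": "indifference"},
 {"id": "ytc_UgxEcr3ybqxgZRbNbHF4AaABAg",
  "responsibility": "developer", "reasoning": "deontological",
  "policy": "liability", "emotion": "outrage"}]
'''

def index_by_comment_id(response_text):
    """Parse the JSON array and index the coded rows by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_by_comment_id(raw_response)
row = codes["ytc_UgxEcr3ybqxgZRbNbHF4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer outrage
```

With the index in hand, the four coding dimensions (responsibility, reasoning, policy, emotion) for any sampled comment can be read off its row directly.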