Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "ai is a lie. they are using AI as a lie to fire as many humans as possible so th…" (ytc_UgyIEIUAz…)
- "Same issue with self-driving cars. The major problem is accountability, combined…" (ytc_UgyNl2H9A…)
- "I really believe there’s a “crazy Epstein-class” person who wants to build the m…" (ytc_UgzhcJvBK…)
- "The war against ai has now lead into chemical warfare. And im so for using this …" (ytc_UgyLDXBQr…)
- "Standards... what standards? According to their little list I should be at 0 att…" (ytc_UgyZ0ObF0…)
- "oh no, the people producing garbage and slop will have another tool to produce m…" (ytc_UgxjzH_lP…)
- "You’re hypotheticals don’t work because most people will opt into AI and Robotic…" (ytc_UgyPzcOAR…)
- "20 years they wanted replace us with Indians, now with AI. These Managers are id…" (ytc_UgzoOTvdT…)
Comment

> The current state of ai dev and the industry as a whole is indicative of a profound refusal to understand the substrate necessary to carry the complexity they’re trying to create and the constraints of the physical systems from which these sorts of complexity tend to emerge. It’s like putting a leaf into a vending machine and getting confused when you get your leaf back instead of the soda you wanted (stripped of some nuance, of course).

Source: youtube · AI Moral Status · 2025-11-07T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxn5ipi2RXqS-OCfyN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwVMuNj0Ht7jJHamMN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzfjVjUN5_VvtuQxI94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzrS-iMyGDbBbhq9Wl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxAIPAJip9IRsErZkZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgxjzvqUWJicorFntxt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzGAGGZIz-5DspMn4J4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy5l42IHAIaY-kUg_V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzEqJedDVi3v8AbWw94AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyJ4kM03PZcC6RdIrV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
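The raw response is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such output could be validated before it is recorded as a coding result; the allowed value sets below are inferred only from the samples shown on this page, not from a documented codebook, and the function name is illustrative:

```python
import json

# Value sets inferred from the sample output above (assumption: the real
# codebook may define more categories than appear in this batch).
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def validate_coding(raw_response: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema.

    Raises ValueError on malformed output, which is how truncated or
    off-schema model replies would be caught before coding is stored.
    """
    records = json.loads(raw_response)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        if not str(rec.get("id", "")).startswith("ytc_"):
            raise ValueError(f"bad or missing comment id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_Ugxn5ipi2RXqS-OCfyN4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
coded = validate_coding(raw)
print(coded[0]["policy"])  # -> regulate
```

A check like this makes the "Coded at" record trustworthy: anything the model emits outside the expected label set fails loudly instead of silently entering the table.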