Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- It's going to be just like westworld tv show they have to show us what there pla… (`ytc_UgxXM6udQ…`)
- Watching the butthurt reactions of artists because of AI art has been one of the… (`ytc_UgzMwQJ08…`)
- "Why make art if u can use ai" Bcuz its my escape from reality. Its my comfort. … (`ytc_UgzZAuHxx…`)
- When you put AI in a Capitalist Society, the highest priority for development wo… (`ytc_Ugzdj0Usb…`)
- 0:18 Better code! Have you seen the mess it's made of MS Word's Shapes and draw… (`ytc_UgwL-oqiL…`)
- Ai 'art' genuinely makes me feel a negative emotion that I cannot express other … (`ytc_UgzqdEEJa…`)
- There was a BBC TV tech show called 'Click' a few years ago. They did a report o… (`ytc_UgxLPQndL…`)
- wouldnt raw supply also trend to 0 as well? Assuming that farms are mostly autom… (`ytc_UgzhhwiMh…`)
Comment

> Neil, I agree — science gave us some of the greatest breakthroughs in history. But after we solved the basics — clean water, antibiotics, emergency medicine — the focus shifted. Technology, big pharma, and the scientific machine started prioritizing profit over purpose. If we’d frozen technology and medicine at the 1980s level and focused purely on making those tools cleaner, safer, and more equitable, we’d likely have lower chronic illness, higher mental resilience, better community ties, fewer addictions, and a slower, more grounded pace of life. The potential of science is still incredible… but the direction it’s been steered in has too often moved us away from a truly healthier, happier human experience.

youtube · AI Moral Status · 2025-08-11T23:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgwPz9ygWheepYrI5894AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxuxgtj9e9x7vcrBi94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxejfLC2zM6RmEhgG54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxJJvnngYkWlGzFLUd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxgBUrTpJ764j8G8CV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy8QcdoXyKxj20YAFR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgztAxPqB9JLunLtlm94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwoeGnkvMaNMA5Xhg14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgztlpjZFpEjSgwLnF54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxcSMkd4Ww9c1ZqFT94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}]
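The raw response is a JSON array of per-comment codings keyed by comment ID, with the four dimensions shown in the result table. A minimal sketch of how such output might be indexed for lookup by ID (the field names come from the response above; the IDs here are shortened placeholders, and the default-to-"unclear" behavior for missing comments is an assumption matching the coding-result table, not a confirmed detail of the pipeline):

```python
import json

# Example raw LLM output: a JSON array of per-comment codings.
# Shape taken from the response above; IDs are hypothetical placeholders.
raw = '''[
 {"id": "ytc_example1", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
 {"id": "ytc_example2", "responsibility": "company",
  "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Build an id -> coding dict from the model's JSON array."""
    return {row["id"]: {d: row.get(d, "unclear") for d in DIMENSIONS}
            for row in json.loads(raw_json)}

def lookup(codings: dict, comment_id: str) -> dict:
    # Assumption: a comment absent from the response is coded "unclear"
    # on every dimension, as in the coding-result table above.
    return codings.get(comment_id, {d: "unclear" for d in DIMENSIONS})

codings = index_codings(raw)
print(lookup(codings, "ytc_example2")["policy"])   # regulate
print(lookup(codings, "ytc_missing")["emotion"])   # unclear
```

Indexing once and looking up by ID also makes it easy to spot comments the model skipped, since those fall through to the "unclear" default.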