Raw LLM Responses
Inspect the exact model output for any coded comment.
Responses can be looked up by comment ID, or opened from the random samples listed below (comment ID plus a truncated preview); a minimal lookup sketch follows the list.
- `ytr_Ugwnoci-w…`: "@JamesEddy-im1ye That's the problem, anything that requires a user input isn't …"
- `ytc_UgyNhUFFy…`: "AI induced psychosis. Your "conscious" AI does not exist between prompts. It on…"
- `ytc_Ugxt1X-Jv…`: "Instead of destruction , war and chaos , why the AI doesn't search for the truth…"
- `ytc_UgwAnZPFj…`: "Well... Actually, robots have rights. A robot must serve his owner if there is n…"
- `ytc_Ugyu5x88R…`: "No, I'm scared about AI because I spent my entire life doing art and I'm scared …"
- `ytc_UgxF74jMA…`: "Agentic AI sounds so cool! Pneumatic Workflow kinda gives a taste of that with i…"
- `ytc_UgzLkIMEn…`: "Intelligence and consciousness are separate things. I have no doubt we will soon…"
- `ytc_UgxXQRaKr…`: "need to write failsafe / let data = "weapons system online" / for{let weapons sy…"
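As a minimal sketch of the lookup, assuming the coded records are kept in a JSONL store with one JSON object per comment (the file name `raw_responses.jsonl` and the function name are hypothetical, not taken from this page):

```python
import json
from pathlib import Path


def lookup_raw_response(comment_id: str,
                        store: Path = Path("raw_responses.jsonl")) -> dict | None:
    """Return the coded record for one comment ID, or None if it is not found.

    Assumes each line of the store is a JSON object carrying at least an "id"
    key, like the objects shown under "Raw LLM Response" below.
    """
    with store.open(encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if not line:
                continue
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None


# Example: look up one of the IDs seen in the raw response below.
# print(lookup_raw_response("ytc_UgzCCAyZ5U7RVa7KXDZ4AaABAg"))
```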
Comment
"In all honestly, I don't really care"
This single line sums up exactly what is wrong with AI (and made me laugh out loud).
Machine learning = regurgitating a reduced amalgam of data.
AI = expecting a machine based on that mashed-up data to have the same moral compass, empathy, and consideration as an adult human without any real-world human experiences-- and then giving that machine the keys to important systems and technology that could crash society or kill us all (that in some cases is only given specific guardrails based on the entity that controls it).
This is going to be a really fun global experiment (where none of us were consulted before we became experiment participants).
Source: youtube · Timestamp: 2025-12-01T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
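The table maps onto a small per-comment record. The dataclass below is only an illustrative sketch of that shape: the class and field names are mine, and the example label values in the comments are just the ones visible on this page, so the real codebook may contain more categories.

```python
from dataclasses import dataclass


@dataclass
class CodingResult:
    """One coded comment, mirroring the dimensions in the table above."""
    comment_id: str       # YouTube comment ID, e.g. "ytc_UgzCCAyZ5U7RVa7KXDZ4AaABAg"
    responsibility: str   # who is held responsible, e.g. "developer", "ai_itself", "distributed"
    reasoning: str        # moral-reasoning style, e.g. "consequentialist", "deontological", "mixed"
    policy: str           # policy preference, e.g. "regulate", "ban", "liability", "none"
    emotion: str          # dominant emotion, e.g. "outrage", "fear", "approval", "indifference"
    coded_at: str         # ISO-8601 timestamp of when the label was assigned
```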
Raw LLM Response
[
{"id":"ytc_UgxTwC6gxFWolraCJUd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyesCM0OGZM-2-UAtt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxpgmtJr-FpuIP68ZV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz11OkXymUhV-p8czR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxEFORGe_VM2FzO_6h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxj35kIHBRL8iWqjZd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzQKm0-3-HRWhQQPy94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxYLJiruJkL9fDcfHl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxOmneNPjO5IiMk3F54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzCCAyZ5U7RVa7KXDZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
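A downstream step presumably parses this array and checks each label against the codebook. The sketch below does so under the assumption that the allowed values are exactly the ones visible in this example; the real label sets may be larger, and the function name is hypothetical.

```python
import json

# Label sets inferred only from the values visible on this page; the actual
# codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}


def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and report any out-of-vocabulary labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                print(f"{rec.get('id')}: unexpected {dim}={value!r}")
    return records
```

Records that pass this check could then be written to the JSONL store assumed by the lookup sketch above.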