Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgwKM6lrL…: "Pluh you know AI images are poisoning themselves and it seems you’re a fan of on…"
- ytc_Ugx3avlY3…: "Humans need purpose and meaning. I assume the power elites will be replacing us …"
- ytr_Ugy5KlxOE…: "They do exist just not legal per say but if the government wanted to allow it th…"
- ytc_UgwGBwV5B…: "That feel when AI bruhs don't actually understand how image diffusion works <w>.…"
- ytc_UgwSIJ9BQ…: "Can you provide a link to the FOSS Image LLM you were talking about? Sounds amaz…"
- ytc_UgzCvv4NZ…: "It's a pedagogic problem: "Do as I say, not do as I do." doesn't work with child…"
- ytr_UgxRPo6mq…: "They will use it for all nd new types of scam. What irritates me is that all med…"
- ytc_Ugz5VEZ6w…: "in my dealership we have gotten reports of amazon employees getting fired due to…"
Comment
The main reason is because there’s no way to end development at this point. That answer is simple and likely “best” but unrealistic.
The only way to stop AI development is if collectively the entire planet agrees not to, and that’s not going to happen because there are enough people who want the money and power that might come from developing/owning the models. Not only that but there are plenty of people who believe the long term good of AI will outweigh the negatives.
Thus all we can do is focus on solving the negatives rather than eliminating the tech entirely.
youtube · AI Responsibility · 2025-12-20T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgwUUU0FJr1qb7YP1-l4AaABAg.AR2_VjTelVKARB0v3DUPdn","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzG7dbDUeGOQZdHAJV4AaABAg.AQppNN1Uj-BATym_qs_cHa","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxEI3fKyCOLXnd-3a14AaABAg.AQ2GcB6PKP9AQ2HdgSDy6r","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwGMvDU00_X8Tfk2794AaABAg.AOrqT_6G3_JAQyX4AIA5aF","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzLD8dm2UO2ax5PMUp4AaABAg.ALFzRlAhKL6AOBv9u3DbGX","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgzLD8dm2UO2ax5PMUp4AaABAg.ALFzRlAhKL6AOCSQoU3yC8","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugyu9-hAWphi7g35oUR4AaABAg.AKFoAwqTAFQATa7uPmgfVy","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytr_UgwxULmQdzOA0lqwB9B4AaABAg.AIth_F0MizHALYmIcI-uiS","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyxYCyu1kR3N4-Hip94AaABAg.AIli_xiOogkAJ6kwRXkQ2B","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxx1QxRAsLE9FI4mkt4AaABAg.AI0HEwE5S0xAIRPWpURcFA","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
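The raw response above is a JSON array with one object per coded comment, carrying the comment `id` plus the four coding dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be indexed for lookup by comment ID follows; the function name `parse_codes` is hypothetical, and the example record is copied from the batch above (the one matching the Coding Result table).

```python
import json

# One record copied verbatim from the raw LLM response above,
# wrapped in a JSON array as the model returns it.
RAW_RESPONSE = """[
  {"id": "ytr_UgwGMvDU00_X8Tfk2794AaABAg.AOrqT_6G3_JAQyX4AIA5aF",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]"""

# The four coding dimensions used in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Index the model's JSON array by comment ID, keeping only the
    known dimensions and defaulting any missing one to "unclear"."""
    records = json.loads(raw)
    return {r["id"]: {d: r.get(d, "unclear") for d in DIMENSIONS}
            for r in records}

codes = parse_codes(RAW_RESPONSE)
cid = "ytr_UgwGMvDU00_X8Tfk2794AaABAg.AOrqT_6G3_JAQyX4AIA5aF"
print(codes[cid]["emotion"])  # resignation
```

Keying the parsed records by ID is what makes the "look up by comment ID" view above cheap: one parse per batch, then constant-time lookups.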