Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "I can't understand why anyone would be on purely social media sites these days. …" (ytc_UgytKak7n…)
- "The flaw with AI is that human programmers subconsciously imprint personal biase…" (ytc_UgzEhHKHL…)
- "American army / navy ect. Is far past this AI lol other armys are starting to ge…" (ytc_UgyZVqFlf…)
- "AI is such a fucking problem. I think it could be helpful (I’m a stickler for a …" (ytc_UgxnOlwQx…)
- "The key is how to share the pie and the prosperity that AI brings.... If it can…" (ytc_Ugwgg5gjl…)
- "I'd just trace tbh. BECAUSE ITS AI NO ONE WILL CARE IF YOU TRACE AI…" (ytc_UgxZ5ffwk…)
- "Karl Marx saw machines replacing workers (in the industrial revolution) and said…" (rdc_kiji6kq)
- "You’re someone who taught many of us a lot of key principles that sent us into t…" (ytc_UgxYNiQ3T…)
Comment (youtube, AI Governance, 2026-03-11T02:0…):

> Is it really possible to stop any maniacs trying to do great evil and severe damage through the use any high tech (AI included) ? Until one day somebody could effectively and precisely control any maniacs and gangsters (whether human or otherwise) from committing evil, the likelihood is that all shall eventually be doomed, even if slowly and painfully. So the key is to shape and steer human thinking to lead towards good instead of evil, collectively, not keep producing potentially lethal tools while consistently ignoring what anybody would use them for.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwXa9a7d8-whjS4hGF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzG5DuRj9ommFuXxA14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwdcBHoHpgOTbXJZAh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxd3AsmvxSTtk976E54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw7usZPzgsnVlVenoh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy7V3HDJc2fieWcwpJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzTawOf9Y_hqnX6A3V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxlGoxIFx1XSqQldYx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxpQeYvVGQZmps25UN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxSOPGd-lAhPn0pThZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
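A raw response in this shape can be consumed programmatically: parse the JSON array, check that every record carries the four coding dimensions plus the comment ID, then index by ID to retrieve the code for a given comment. The sketch below is illustrative, not the tool's own pipeline; it truncates the batch to two records for brevity and assumes nothing about the allowed label values beyond what appears in this response.

```python
import json

# Two records copied verbatim from the raw LLM response above (batch truncated for brevity)
raw = '''[
{"id":"ytc_UgwXa9a7d8-whjS4hGF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwdcBHoHpgOTbXJZAh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

# Every record must have exactly the comment id plus the four coding dimensions
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)
assert all(set(r) == EXPECTED_KEYS for r in records), "malformed record in LLM response"

# Index by comment id for direct lookup
by_id = {r["id"]: r for r in records}

# The second record is the comment inspected above: consequentialist reasoning, fear
code = by_id["ytc_UgwdcBHoHpgOTbXJZAh4AaABAg"]
print(code["reasoning"], code["emotion"])  # consequentialist fear
```

A structural check like this is worth running before ingesting a batch, since a model can drop a key or emit an unexpected label even when asked for strict JSON.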