Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I only started drawing almost 3 years ago, but AI art becoming more and more of …" (ytc_Ugz-hQ4Dv…)
- "No rainbow dressed teachers, no gender confusion, and no litter boxes / That's a …" (ytc_UgwCSCxbQ…)
- "Can we just send F-16s to unplug all AI data centers? Save the Terminator the tr…" (ytc_UgxqhVMbn…)
- "Nice cope but AI can handle perspective and depth of field just fine. Any flaws …" (ytc_UgwQ1yztI…)
- "As a disabled person myself and a professional artist, I really appreciated Phoe…" (ytc_UgzIiysD8…)
- "Very good interview. Don't know him that well before, now realized he is a good…" (ytc_UgzNlJcsh…)
- "I am glad that we can agree on the human aspect being important and not replacea…" (ytc_Ugz7Gg_wN…)
- "AI can help, but can't logically think observe the mistake like a human, like a …" (ytc_Ugy6qTjct…)
Comment
"AGI-powered humanoid robots by 2030" LOL! Sorry to burst you bubble, but robots need *power* and battery technology has more or less stagnated (due to basic chemistry) and a humanoid robot might run for a _few minutes_ before it runs out of power. Also, the _actual_ physical dexterity of humanoid robots is utterly abysmal. Far too much completely insane handwavium in this diatribe.
youtube · AI Governance · 2025-10-10T14:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwTqBc-rvlgqqAMswh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwinKYDZrP_zFDuLbh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx131qhAOvunwTB5PB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyKTR2mTJCDCFcp3tB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxdWn2khtfdYkj3yJt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxplo_UczieHTydJWV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx6WRuiulRB4f2bFzJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwERfPT5ky8Tu1q5v94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzlKKIJ7hGE_OsWze54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgycJR1PvuXzkQv_8a94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
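The raw response above is a JSON array with one row per comment, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response can be parsed and indexed for the "look up by comment ID" view — the function name, the abbreviated two-row sample, and the idea that the pipeline keeps the raw output as a string are all assumptions for illustration, not the tool's actual code:

```python
import json

# Hypothetical raw model output: same row shape as the response shown
# above, abbreviated to two rows for the example.
RAW_RESPONSE = """[
{"id":"ytc_UgxdWn2khtfdYkj3yJt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyKTR2mTJCDCFcp3tB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# The four coded dimensions every row must carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index the rows by comment ID,
    rejecting rows that are missing the ID or any dimension."""
    by_id = {}
    for row in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in row]
        if "id" not in row or missing:
            raise ValueError(f"malformed row {row!r}: missing {missing}")
        by_id[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return by_id

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgxdWn2khtfdYkj3yJt4AaABAg"]["emotion"])  # mixed
```

With an index like this, the "Coding Result" table for any comment is just a dictionary lookup on its ID.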