Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
As excited as I am for the idea of one day having self-driving cars, I can't hel…
rdc_czxhtj3
OMG, I can't imagine doing 90+ art post in just a month 💀 I usually try to do Oc…
ytc_UgxeYjWP3…
Something is off here, unrealistic, as her eyes don't immediately track your wal…
ytc_UgzcyBMfV…
Would doctors get lazy and stop trying and start relying completely on AI or wou…
ytc_UgynIGQD_…
@Sukunavex the polar bears are killing all the seals, use ai to help the seals …
ytr_UgxbCgEg8…
“The best way to beat a smarter foe is be unpredictable.” Humanity dominated th…
ytc_Ugza2YJEs…
Lol 14:49 as a disabled artist..... it's usually able bodied people who use us…
ytc_UgxxGeh2F…
In my experience AI is far more effective and safer to interact with than therap…
ytc_Ugy036Q9-…
Comment
Here's how it shook out. The world wars against religion basically cemented the fact that most people would prefer high power distancing genetically due to the scarcity of resources while we simultaneously developed AI to outsource labor. Without the protections necessary from dictatorship without Nazism we basically created an over competitive environment that will turn AI into an apathetic device that will is already used today to kill people by agenda forwarding of terminology and language therefore perpetuating the same issues until a critical oversight that costs enough lives occurs that warrant an inspired guard-railing of already laid ai infrastructure.
youtube
AI Governance
2025-09-30T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzpsbGOTuaqLy_FVHh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwx_ijuGP8IFsg_vLx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxE7zGPba8gCF62oHh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzP3jv6YJ5jpCS4nsl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyXTH9c-xjSAT-NESB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxiiqZ4rbxTA33OVNB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz41kDRGqE03CaXDb94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwhqY8FhELfTzri3PJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwDwJcZwabPJNq7gfx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxWyw8RiF3qQ_QxgS94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
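The raw response is a JSON array with one coding record per comment. A minimal sketch of how such output could be parsed and looked up by comment ID (the `index_by_comment_id` helper and the two-record sample are illustrative, not part of the tool; the sample reuses records shown above):

```python
import json

# Illustrative two-record sample in the same format as the raw LLM response above.
raw_response = '''
[
  {"id": "ytc_UgwhqY8FhELfTzri3PJ4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzpsbGOTuaqLy_FVHh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and build a comment-ID -> coding-record map."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
result = codes["ytc_UgwhqY8FhELfTzri3PJ4AaABAg"]
print(result["emotion"])  # resignation
```

Indexing by ID this way is what makes the "look up a coded comment" view above a constant-time dictionary access rather than a scan of the response.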