Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This here assumes ai will be perfect. Ai will make mistakes and it will maybe no…" (ytc_Ugxnu8izr…)
- "Artificial intelligence CAN'T tell EMERGENCY LIGHTS and legally pull over? If AN…" (ytc_Ugzq7toNy…)
- "You are actually mistaken she is actually just like Google assistant and Siri ad…" (ytr_Ugy9vWCDp…)
- "If google use 100% of his power it will be hard / But will you lose / Nah id win / …" (ytc_Ugz31HsKx…)
- "So here’s something I haven’t heard anyone bring up about ai stans. I don’t th…" (ytc_Ugw5cWrAO…)
- "How would environmental disasters impact on AI. Would a major solar flare wipe i…" (ytc_UgzXA29Vp…)
- "Im really tired of of this Ai boom / The big giant companies of Ai just wanted to …" (ytc_UgwkHaiXY…)
- "@FactsOverFeelings23 You people lack critical thinking, plain and simple. It's …" (ytr_UgzYXAAsl…)
Comment
The last time unemployment got to 30% the economy crashed - the 1930s. Once that happens, unemployed people will not be able to buy the products and services the AI provides. Someone must pay for all that electricity, but where does the money come from?
youtube · AI Governance · 2025-09-07T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxbPXBs24PHAub1dQZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgydzkdUFJ18Ks5MkLx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzyE89NDRBKR4VwIl94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxmVnEBP9_5juJOyyd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzn3yK3ykP0L6FOORx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8QBP4LSUTZe8lVb94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwRAQAh_SBW5t7B7xZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzmKKzuKe8VOIzyfux4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz-SaLxTREgt-bDV_R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgySDKpwc4MOWF5vNjh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
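A batch like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator, not the tool's actual ingestion code: it parses the raw LLM response and rejects records whose values fall outside the coding dimensions. The allowed value sets are inferred only from the sample output shown here; the real codebook may define additional categories, and the `ytc_`/`ytr_` ID-prefix check is likewise an assumption based on the IDs visible on this page.

```python
import json

# Allowed values per dimension, inferred from this sample batch alone;
# the project's full codebook may include more categories.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-codebook records."""
    records = json.loads(raw)
    for rec in records:
        # Assumed ID convention: ytc_ = top-level comment, ytr_ = reply.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} "
                                 "is not in the codebook")
    return records

# One record copied from the batch above.
raw = ('[{"id":"ytc_UgxbPXBs24PHAub1dQZ4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
records = validate_batch(raw)
```

A failed check raises `ValueError` with the offending comment ID, which makes it easy to flag a single bad record for re-coding rather than discarding the whole batch.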