Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Assume Very clever people will apply AI to most pressing problems: - The enviro…" (ytc_UgzjHhZP_…)
- "Hinton’s hidden grief is not just that we might lose control of AI. It’s that we…" (ytc_UgzRtJ-z9…)
- "I don't need AI to edit videos for me... My editing style is a part of my craft.…" (ytc_UgxjVWhFj…)
- "I dont understand why they are so obsessed with making \"AI\" that replaces human …" (ytc_UgxoC4IXf…)
- "These CEOs just want to be Emperors of their empires of Slave AI working armies …" (ytc_UgzNy8gu-…)
- "Lmao facts, auntie Jackie taught me how to block people for the slightest reason…" (rdc_o6dehne)
- "Good lord. Yes they are. I’m not surprised by this. How is it being received? I …" (ytr_UgwwEkrTU…)
- "@gamecashers2472 stealing? Why don't artists actually draw good art instead of …" (ytr_UgwB2npAk…)
Comment
Governments will tax corps that replace humans with A.I., to give us UBI, which will lower everyone’s wages to catastrophic levels and we’d all be subjects of the government. We’d then have to use our UBI to give corps money (or they’d take it for all the subscriptions we’d have to have for things like food, water, elec, inet, etc.) and it would just be a game of corps vs government and human lives as nothing but pawns. Way worse than we have now.
youtube
Cross-Cultural
2025-09-30T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugw7OVaf4LHfsVltt014AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy-E4csypqu8TGFnLN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyq0MzFBwWyEivYp1J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxgLeG1mYTrwrAzlnd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxmdjXKr2jaiaf9_wt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_tJsETigEC9T8L9Z4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyWbw0C6fu6YGwUvZ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw1tIGPA_afMPoOTMR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugya1EU2ZJbxFIofJKV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzTagzQyAmjy38pGBZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
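The raw response above is a JSON array with one record per comment, each keyed by its comment ID and carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch can be parsed and looked up by ID, assuming this array shape; the `index_by_id` helper and the two-record sample are illustrative, not the tool's actual code:

```python
import json

# Illustrative two-record batch in the same shape as the raw response above.
raw_response = '''[
  {"id": "ytc_Ugyq0MzFBwWyEivYp1J4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugw7OVaf4LHfsVltt014AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]'''

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse a batch response and key each record by its comment ID."""
    coded = {}
    for rec in json.loads(response_text):
        # Skip malformed records that are missing an ID or a dimension.
        if "id" not in rec or not all(dim in rec for dim in DIMENSIONS):
            continue
        coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

coded = index_by_id(raw_response)
print(coded["ytc_Ugyq0MzFBwWyEivYp1J4AaABAg"]["policy"])  # -> liability
```

Keying the parsed records by comment ID turns the linear batch output into constant-time lookup, which is all an "inspect by comment ID" view needs.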