Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I quit teaching because of it. All my colleagues were doing to copying Google sl…" (rdc_nsg5305)
- "This is why embodying AI is dangerous. If we put it into metal and allow it to p…" (ytc_UgxUCotDw…)
- "Damn yall do us like that will guess what ai at least I am a real person and you…" (ytc_UgwdPk1eq…)
- "I'm an actual artist and don't think I'm the greatest but I think my art has far…" (ytc_UgyZ78zs6…)
- "As a retiree that was a workaholic, once retired I now dance everyday, go to the…" (ytc_UgzAirvco…)
- ""Calling someone an A.I. artist is like calling someone a chef for putting a rea…" (ytc_UgwtuGJsX…)
- "Is Roman saying super AI will make decisions in a nanosecond? Meaning it will th…" (ytc_UgzMBU_H6…)
- "ai as a concept is inherently harmful depending on the concept (ai for accessbil…" (ytr_UgzJVOa3E…)
Comment
UBI isn't a panacea, but without something practical like it, society will collapse. 7 years ago, research was showing that 1/3 of workers will be displaced by AI and automation by 2030, and the estimates just keep getting more grim. If the 1% use AI to leave most of us behind, expect collapse and total chaos. There are too many compounding factors like climate change, tensions between and within countries, corruption, etc. Things are about to get very bad. Yang may have been wrong about some things, but he was right about this. We need to act, and we need to act fast.
Source: youtube · Topic: AI Jobs · Posted: 2025-11-30T00:1… · ♥ 16
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzGY5TmluyQiJ4_LgN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzv4jLTeBfNtvYi_6F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyBft0IWs8AJFvBGcp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwbPOCZQJw11hL8ppd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgzbBiAck0bLnAwsDI54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwsyv8iZcifQ4qcrit4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyXu-aVmMKV-HavbzV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy3imUCVPo3dTYQiFZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugw7PAUUUPm0JAMoHMV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwLhL6uv6BThDR8Wmx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
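A raw response in this shape can be parsed and validated before the codes are stored. The sketch below is a minimal, hypothetical example: the allowed values are inferred only from the codes visible in this sample (the full codebook may contain additional categories), and `parse_batch` is an illustrative helper name, not part of any tool shown here.

```python
import json

# Allowed values inferred from the codes observed in this sample.
# ASSUMPTION: the real codebook may define additional categories.
CODEBOOK = {
    "responsibility": {"company", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"liability", "regulate", "industry_self", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only fully valid coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Each row must be an object with an id and one value per dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid
```

Rows with an unknown value in any dimension are dropped rather than repaired, so a single malformed coding does not contaminate the stored results.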