Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- If you're using AI to generate images, don't call it art. It's not art. Art is s… (ytc_UgyaX-jMW…)
- Ai’ own letter to governments : 🛑 Open Letter: Why Excluding Military AI from … (ytc_UgxfB672S…)
- @idwtgymn Again, that was just an example of how far it has come. The experts, p… (ytr_UgyWljfmr…)
- im not "disabled" in the conventional way, but i do have a lot of mental disabil… (ytc_Ugw8Ka_Su…)
- Nope, assets won’t save you. Redistribution of wealth will be needed if they are… (ytr_Ugwrpo2W9…)
- I think what Steven said about the attitude of some of the billionaire people at… (ytc_UgyDpfT0V…)
- 'But think of the children' is mocked because it is frequently a cynical way to … (rdc_ohs3fow)
- if nobody has a job in the future, then nobody has money, then nobody buys anyth… (ytc_UgxWYRNe9…)
Comment
It feels as though the people driving AI forward never stopped to think about the human cost. They talk about innovation and progress, but they ignore the reality that if machines take over most jobs, countless people will be left without work, without income, and without dignity. An economy can’t survive when ordinary people no longer have money to spend, and a society can’t thrive when its culture is hollowed out by displacement and fear.
What hurts most is the sense that some of these creators don’t seem to care. They push ahead, celebrating breakthroughs while overlooking the lives that could be upended in the process. It’s hard not to feel that this kind of indifference is dangerous, almost cruel, because it treats society as an afterthought instead of the very thing technology is supposed to serve.
youtube · AI Governance · 2026-01-15T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz7ETaWqWjlIncQ5Ep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyclreT4p18ZhIBo2d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwqKOFtcS3Yewn10Xl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugx2pjWFkfz1D3wUj_R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyCSH1DPBhyPtcsX7l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxVmYAKdui-SMQPO6t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz_965ciBsFVlw2VcJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxuMKaVlOtutoav00h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxC2aUsQXzVV4o58a94AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzkCPZB3vg17IvsZQx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
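A raw response like the one above can be parsed into a per-comment lookup and sanity-checked before the codings are stored. The sketch below is a minimal example, not the tool's actual implementation; the allowed values per dimension are inferred only from the responses shown here, so the real codebook may include categories not listed.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# The actual codebook may define additional categories (assumption).
SCHEMA = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "resignation", "mixed", "approval"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}.

    Rows missing an id or containing a value outside SCHEMA are
    dropped rather than stored, so bad model output is easy to spot.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # no comment ID to key on
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded
```

Keying the result by comment ID mirrors the "look up by comment ID" workflow in the inspector: `parse_codings(raw)["ytc_Ugz7ETaWqWjlIncQ5Ep4AaABAg"]` returns that comment's four coded dimensions.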