Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "i would set up ai at home and home school my kid instead cause schools have fai…" (ytc_UgxwajkCR…)
- "honestly? i don't even have a problem with most labor jobs being done by machine…" (ytc_UgxQHWgu_…)
- "Glad I have a job that could never be taken over by machines. AI will never be a…" (ytc_UgwFiCDuK…)
- "@Mtmonaghan Plain fact, whatever you or I think, and we are all entitled to …" (ytr_Ugxbq1iDa…)
- "They will be spying on you through the robot just like your phone tv computer th…" (ytc_UgySYA35g…)
- "*I took one look at this man and I can automatically tell you that he’s telling …" (ytc_UgzF7iUgM…)
- "In order to solve our own social, political and economic problems AI will have t…" (ytc_UgxlMrTgK…)
- "I've used it a fair bit recently. It sounds so authoritative but is often so wro…" (ytc_UgzalYTR5…)
Comment
I honestly think AI is better programmed for socialist outcomes rather than capitalist ones. In capitalism, everything is assigned value and tokens can be exchanged for ownership if they represent that value. There are a finite number of things in the world and always will be. Capitalism has to grow by the virtue of its survival, and that means greater control of these tokens that represent all material things in perpetuity. This can only eventuate in utmost polarization between the rich and poor. If there is a common condition that only benefits everyone by advancing, you don't run the same risk of optimised poverty and the self interest of AI won't make it 'compete' with you.
youtube · AI Governance · 2025-06-23T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxJ8iIkmz_geeNLzv94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxcKPdNKK2mbFbTvct4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_BZMMJL1SsldaUbZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwwwVTk8KnnPwbItu94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxiL9CiYkTV7Uegzfx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxHFPoqbhKZ2t9xeU94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw87EK6Y1Ng84FaSFV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwMnypmi64mzfWuhAZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1Snx7lVgMuhXs4sR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzVdgoWEuYWng7jJ9N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
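The raw response is a JSON array of per-comment codings, one object per comment with the four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) keyed to a comment `id`. A minimal sketch of how such a response can be parsed and indexed for lookup by comment ID (the variable names and the two-row sample are illustrative, not part of the pipeline; the field names are those in the JSON above):

```python
import json

# Hypothetical raw model output, truncated to two coded comments for brevity.
raw_response = """
[
  {"id": "ytc_Ugz1Snx7lVgMuhXs4sR4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzVdgoWEuYWng7jJ9N4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

# Index the coded rows by comment ID so any coding can be retrieved directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for one comment.
coding = codings["ytc_Ugz1Snx7lVgMuhXs4sR4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # approval
```

Indexing into a dict keyed by `id` mirrors the "look up by comment ID" workflow: after one pass over the response, every subsequent inspection is a constant-time lookup.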