Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_j6e9d3u` — Also the intern is probably going to be the one actually working the AI because …
- `rdc_oh3knlp` — Sure, you could sit down right now and use A.I. to start any project you want. B…
- `ytc_UgzYlTSKX…` — Ai never can experience the world like we do. Ai always learn from the info we h…
- `ytc_UgzpyTyeO…` — artists wont have to get a new job because of AI theyre not employed anyway…
- `rdc_nlzro1r` — Yeeaaappp… and people misunderstand the value behind these AI tools. For exampl…
- `rdc_kj0xqci` — Try it again, but ask ChatGPT to write it down in the chat, first. Reassure it y…
- `ytc_UgyxzjaqF…` — During industrial revolution humans lost strength to machines now, In AI revolut…
- `ytc_Ugwhd7rnJ…` — Ho-ly.... This was the most sobering video I think I've seen in a LONG TIME...IF…
Comment
Think:
If companies manage to get AI that replaces humans in each of the existing work spheres... how are people going to earn money to sustain the capitalist system?
In a certain period of time, the domination of AI in knowledge and work will force humanity to enter a new system. It is up to us whether we begin to prefigure it as a humanistic system where everyone would fit, or a dystopian authoritarianism.
youtube · AI Governance · 2024-01-03T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
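The coding result assigns one value per dimension. A minimal sketch of how a coded record could be validated, using only value sets observed in this page's sample output (the function name and the completeness of the value sets are assumptions, not the tool's actual schema):

```python
# Illustrative: check one coded record against the dimension values
# observed in this page's sample output. Not an exhaustive schema.
ALLOWED = {
    "responsibility": {"user", "company", "developer", "government",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "approval", "unclear"},
}

def invalid_fields(record: dict) -> list:
    """Return dimension names whose value falls outside the observed sets."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

record = {"id": "ytc_Ugx4txTp5ZnpGYfAost4AaABAg",
          "responsibility": "company", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "fear"}
print(invalid_fields(record))  # []
```

A non-empty return flags a record where the model drifted outside the codebook, which is worth surfacing before aggregating results.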
Raw LLM Response
```json
[
{"id":"ytc_Ugzbn6a_dwlmjolXoL94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZbTEQqncUU7eMDiZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy3nFHU5vnBwMHx7Oh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwDmteD6MITnX0p-Vp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyq94my2nvvbl6S89V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgytGcExoFcvXVFwLVl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4txTp5ZnpGYfAost4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzZ7y2YwaLVeycwMBV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4gMFXd6ZPfLWnLjh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzqV8LylRk_ZCLlB-R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
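The model returns one JSON array per batch, with each record keyed by a comment ID. A minimal sketch of the lookup this page performs, assuming the raw response is valid JSON (the function name and sample string here are illustrative):

```python
import json

# Illustrative: index a batch coding response by comment ID, so a single
# comment's codes can be retrieved as in the "Look up by comment ID" view.
def index_by_id(raw_response: str) -> dict:
    """Parse a JSON array of coded records into an {id: record} mapping."""
    return {rec["id"]: rec for rec in json.loads(raw_response)}

raw = '''[
  {"id": "ytc_Ugx4txTp5ZnpGYfAost4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

coded = index_by_id(raw)
print(coded["ytc_Ugx4txTp5ZnpGYfAost4AaABAg"]["emotion"])  # fear
```

Indexing by ID rather than list position keeps the lookup robust when the model returns records out of order or drops one from the batch.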