Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "If AI is supposed to replace workers, who then don't have money to fund the econ…" (ytc_UgyZlVkuX…)
- "We're in 2019 and we are still at the same point with the driverless trucks, car…" (ytc_UgxWH_Ht4…)
- "I graduated with an applied mathematics degree somewhat recently, bottom line is…" (rdc_nntfw1d)
- "We appreciate your opinion. Sophia, the AI robot, is designed to showcase advanc…" (ytr_UgzzQ5M8L…)
- "How about we just don't program super-intelligent AIs at all? Would probably be …" (ytc_UghD0PHvZ…)
- "Ai is an extreme echo chamber and that, is the central tool they use,...it becom…" (ytc_UgxuoN2DG…)
- "That's OK. Ai replaced my job so Amazon isn't getting any money from me. I'll be…" (ytc_Ugywwv9jl…)
- "They're always more believable when they don't have their lips slightly apart wh…" (ytc_UgwC2FCgg…)
Comment
Sounds like AI will lead to hyper deflation. Unless a business caters only to the ultra wealthy there will be no customers if most people don’t have jobs. Stock valuations will plummet and it will be a race to the bottom to keep what customers are left. It doesn’t seem that the current system of capitalism will survive long term. If we survive then hopefully we can just live our lives and not work all the time.
youtube · AI Governance · 2025-09-04T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzG8qxBH8Jn1J5vjal4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxL7wjCnPov_0F2e_x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzYRJbsg_XI7_975s14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzG1hGPP6vTfWohphJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzxSidap1D5PSLQXfx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxA8KsdJhJbLgpyjn54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzXFf67CpKwRc58r8V4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgylJ8UaXTpp7H4Ef7V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwcgEHAq3RlGR3bMIB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyjSfFzUCwSxsxCdqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
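The "look up by comment ID" view above implies one processing step: the raw model output is parsed as a JSON array of coded records and indexed by comment ID. A minimal sketch of that step in Python, using two records from the response above (the function name and the validation logic are assumptions for illustration, not the tool's actual code):

```python
import json

# Raw model output in the format shown above: a JSON array of coded
# records, one per comment, each carrying the four coding dimensions.
raw_response = """
[
  {"id": "ytc_UgzG8qxBH8Jn1J5vjal4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxL7wjCnPov_0F2e_x4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coded records by comment ID.

    Raises ValueError if the output is not a JSON array of records
    that each contain the expected coding dimensions.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    index = {}
    for rec in records:
        if not EXPECTED_KEYS <= rec.keys():
            raise ValueError(f"record missing coding dimensions: {rec}")
        index[rec["id"]] = rec
    return index

coded = index_by_comment_id(raw_response)
print(coded["ytc_UgxL7wjCnPov_0F2e_x4AaABAg"]["emotion"])  # prints "fear"
```

Validating keys up front matters here because LLM output is not guaranteed to follow the schema; a malformed record fails loudly at parse time rather than surfacing later as a missing dimension in the coding table.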