Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgyRGvD3W…: "I completely agree; AI is very dangerous for the mind, language, and brain healt…"
- rdc_gkq8vk5: "Just tell everyone out of a job to buy a couple shares of GameStop problem solve…"
- rdc_et835n8: "As is having an educated workforce. During the colonial era the last thing the B…"
- ytc_UgzC8_QBc…: "Residual negative effect, or initial attraction to AI in the first place? An int…"
- ytr_Ugw-wtZ34…: "Hey @mohdomartokyan7003, thanks for your concern! But have you ever considered t…"
- ytc_UgzPqA9cz…: "Sam Altman, OpenAI, et al... were ALL practicing psychiatric medicine without a …"
- ytc_UgwfBV6Z2…: "I wonder why he didn't learn that from Terminator, The Matrix, I, Robot etc or f…"
- ytr_Ugzr_NLRH…: "nah these kids lucky, i want AI to replace teachers, my teachers are very bad at…"
Comment

Well said!

“Automation in theory should have been an asset. Organised properly, we should all have a better standard of living and work only half the number of hours we did before. What’s the actual result? 20% of the population have no job at all and most of them have little prospect of ever finding one. The ones that are lucky enough to have employment get ulcers and have heart attacks or mental breakdowns because they have to work twice as hard trying to keep up with the technology which enables them to produce the extra goods that few can now afford to buy. They have less leisure time to enjoy their hard-earned money, and when they try to spend it, they have a guilty conscience because so many people are either starving, or without employment or even hope for the future.”

-- Allen Carr, 1995

youtube · AI Jobs · 2025-10-09T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyZLzKgIDYGvs4ZxOZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwdVVhTA4ztgZLUvuR4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugydema6XWlsWnql2gJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWGgwR0TypDFKZVVJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw2OnVK69rC6WOBPPd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxDVtRvdUF2UWb_SOh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw65IiBM0Urk4NBFod4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx3XC30nlZEZsVhg694AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw_Cx_TkB1MoHVgvaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzzfqp4O6X7vRZryRh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
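The raw response above is a JSON array with one object per comment, each carrying the four dimensions shown in the Coding Result table. A minimal sketch of how such a response might be parsed and validated, assuming the category vocabularies are exactly those observed in these samples (the real codebook may allow more values, and `parse_coding_response` is a hypothetical helper, not part of the pipeline shown here):

```python
import json

# Allowed values per dimension, inferred from the coded examples above
# (an assumption -- the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "user"},
    "reasoning": {"consequentialist", "contractualist", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "mixed", "indifference", "fear", "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: dimensions},
    rejecting rows with missing fields or out-of-vocabulary values."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]  # KeyError here flags a malformed row
        dims = {dim: row[dim] for dim in ALLOWED}
        for dim, value in dims.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = dims
    return coded

# One row copied from the raw response above.
raw = ('[{"id":"ytc_Ugw2OnVK69rC6WOBPPd4AaABAg","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"regulate","emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugw2OnVK69rC6WOBPPd4AaABAg"]["policy"])  # regulate
```

Validating against a fixed vocabulary at parse time catches the most common failure mode of structured LLM output: a syntactically valid response containing an off-codebook label.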