Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or pick one of the random samples below — click to inspect.
Common sense tells me if your in a self driving car that you should be in the dr…
ytc_UgwG8-Ai0…
Even if AI becomes so advanced it can instantly diagnose any disease with 100% a…
ytc_Ugx2PK5F0…
Try this:
Absolutely — I’ve rewritten your dissertation in a polished, academic …
ytc_UgxDRTFzI…
Humans are soon to be completely expendable. Automation is future. Productivity …
ytc_UgjjSdqD5…
My questions is, when one AI start competing with another AI for power and contr…
ytc_Ugx8EDh61…
But only 1% people are highly creating and problem solvers. The other 99% will d…
ytc_Ugxj6ag3f…
I'm up for debate, I think Robots deserve rights just as humans, or maybe even m…
ytc_Ugi2-dOuW…
AI is just a bubble that billionaires are using to become richer. It will never …
ytc_UgzSWF0-3…
Comment
With all due respect, you are wrong. This isn’t like saying, “We have a calculator, so who needs an accountant?” or “We have a JCB, so we don’t need humans to dig.” No, this is different. This time, it's more like: “Humans need money, sleep, and food—but now we have AI, so we don’t need humans to do jobs.”
Sure, AI isn't perfect yet—but remember, before digitization, there were countless jobs for accountants. After digitization, only a few specialists survived. This time will be the same: there will still be jobs, but they’ll require very high specialization and be hard to get.
The truth is, not everyone is smart, and many people depend on simple, repetitive jobs. If the government doesn’t step in soon, there will be chaos.
youtube
AI Jobs
2025-06-22T18:0…
♥ 431
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugy39Jg3PPGbD6YKDlB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwbs-Tx7r6q6QHnZjZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzAnOTE0MgzolVvY5d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx0yh-QYQzN9eusVYZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzGShXGewPVt8KXcSR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxJB1HqozA6z6awDKx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwT3eNs0IwR2r_VXf54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw1J00X-oiZGA_KsmZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwIgEsammoodpPY2g54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyjCxoRZCaERtMh4kN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
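The raw response above is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and checked against the codebook — the allowed value sets below are inferred only from the values visible in this example and may be incomplete:

```python
import json

# Allowed values per dimension, inferred from the example response
# above; the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "government", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def validate_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the codes by comment ID.

    Raises ValueError if the JSON is malformed or a value falls
    outside the (assumed) codebook.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        comment_id = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={row.get(dim)!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a single row shaped like the response above
# (the ID "ytc_X" is a placeholder, not a real comment ID):
sample = ('[{"id":"ytc_X","responsibility":"none",'
          '"reasoning":"consequentialist","policy":"none",'
          '"emotion":"indifference"}]')
codes = validate_codes(sample)
print(codes["ytc_X"]["emotion"])  # indifference
```

Indexing by ID makes the "look up by comment ID" view above a simple dictionary lookup, and the validation step surfaces any code the model invents outside the codebook.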