Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I only need AI to help me cancel my service without talking to an agent…" (ytc_Ugwi5ANVv…)
- "Hand over nuclear and military decissions to AI, 2nd risk will be realised very …" (ytc_UgyWDXRGQ…)
- "i am pro-ai, but if it doesn't work, why continue using it? broken stuff shouldn…" (ytc_UgybaP9nC…)
- "Help with homework? You mean, cheat on exams as well? Gen A will be late in the …" (ytc_Ugx0nMlpc…)
- "8:20 \"if purchasing AI tools from specialist vendors and building partnerships s…" (ytc_UgyqA5j4e…)
- "There is no scientific consensus or evidence that a \"singularity\" for AI is inev…" (ytr_Ugx1bYYI-…)
- "I'm so with you on this. I feel genuinely tired. It amuses me that there's peopl…" (ytc_Ugye6rI2E…)
- "> But people are also heavily using AI at work... Mandated from the top lol …" (rdc_oby1bh7)
Comment
If you destabilize society by replacing the human work force with robots you are basically shooting yourself in the face. If, for example Amazon replaces a great majority of its work force with robots, and more and more companies follow suit then eventually nobody will have the earning power to buy Amazon or any other products that companies are dishing out. The result is both us the people and the fat cats that are hoarding wealth will eventually crash and burn. Is that the future we are building for future generations? AI is the anathema that will bring poverty like nobody has ever seen before, and with it crime rate will expand exponentially. We are on the wrong path to a more sustainable future, and the amenities that AI is furnishing at the present time will return to haunt us in the not so distant future.
youtube · AI Jobs · 2025-11-30T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxvKLVNRl-zP2UKC_F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxe1Px51xE8x_yNIzp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx9gL7PqIJKY86t7Ll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgynWLC4GC8SKVc0AmB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxGLIwDrCSWLvqXDpp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzoEEyEwu7hDB6_moJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyRkqztmjWp2fC1lN54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxb6o1vTop8SmgaMiF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxijG2opGcSbsERyXl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxBDwmm9DXtxr1TyWJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
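A batch response like the one above has to be parsed and sanity-checked before the codes are stored. Below is a minimal sketch of that step, assuming the dimension vocabularies are exactly the values seen in the sample output (the real codebook may allow additional values, and the function name `parse_llm_batch` is hypothetical):

```python
import json

# Dimension vocabularies inferred from the sample response above (assumption:
# the actual codebook may permit other values, e.g. additional emotions).
ALLOWED = {
    "responsibility": {"company", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "neutral"},
}


def parse_llm_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    raising on any value outside the assumed vocabularies."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded


raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_llm_batch(raw))
```

Rejecting out-of-vocabulary values (rather than silently storing them) makes it obvious when the model drifts from the codebook, which is the main failure mode when coding free-form comments at scale.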