Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I disagree with you saying this is different to a human way of learning things. …" (ID: `ytc_UgxjSzqrf…`)
- "Ngl, 90% of those AI slop defenders are less tech savvy than me. I've been stuck…" (ID: `ytc_UgznI8xZJ…`)
- "Honestly how many of you wants to talk or chat with an AI bot instead of a real …" (ID: `ytc_Ugx-aCfZ6…`)
- "In my opinion, this isnt primarily a problem of facial recognition software, but…" (ID: `ytc_UgxyUmxNr…`)
- "Will we reach AGI was the most important question and he just said yes we will. …" (ID: `ytc_UgxENsF04…`)
- "Seriously tho, it's never the AI stealing your art, it's the AI companies steali…" (ID: `ytc_UgyTqPL1k…`)
- "I'm a little bit shocked that it is possible to have a conversation like this wi…" (ID: `ytc_UgzZ2JRlJ…`)
- "The average American seems to have a hugely over-inflated perception of what com…" (ID: `rdc_fvy53ir`)
Comment
This is one of quite few things I disagree with Bernie. He brought up, for example, the lost of drivers due to driverless AI trucks. I ask, why is that wrong? As long as it is proven safe (which many reports says AI-driven cars have less accident rate than human-driven cars), I honestly don't see the issue. When the motorcar was introduced by Ford, millions of jobs were lost from horse breeders to horse riders to horse poop cleaners. When the lightbulb was introduced, millions of jobs in whaling (which was the largest industry in the world) was lost. We mustn't reject techonological advancements to protect jobs. Next, on Bernie's policies such as mandatory profit sharing -- we have to be realistic, it's not possible. It is done in certain cases under motivation theory for C-suite executives, but not a universal solution.
Source: youtube · Topic: AI Jobs · Posted: 2025-10-08T08:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwqNC-Dpz5RKQhzvoF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwvkt5Cw-wViSMfB054AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxfhQYjisc4HpWhg794AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgykCqM41aTcC-_79pV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxjX9Ykr-ZEzofaH314AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugynq6TuKZjW1xayshp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw9KrVOPU-9JJOgZzx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzIdgAGLoAYH076dlh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzH53xeEXYBib_YW-x4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxYc_hUMCIB7wk7VCN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
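A response in this shape can be checked programmatically before the coded dimensions are accepted into the dataset. The sketch below is a minimal validator, assuming the value sets are exactly those seen in the samples above (the full codebook may define more categories), and the record ID `ytc_x` is a made-up placeholder:

```python
import json

# Allowed values per dimension, inferred from the samples shown
# above; a real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "government", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "fear", "resignation", "indifference"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any record with a
    missing ID or an out-of-vocabulary dimension value."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record payload for illustration.
raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]'
records = validate_response(raw)
print(records[0]["emotion"])  # approval
```

Rejecting the whole batch on one bad record is a deliberately strict choice; a gentler variant could instead drop invalid records and log them for re-coding.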