Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
It's trivial to spot AI generated stuff: look out for:
1. mispronounced numbers …
ytc_UgwQ3irkV…
The first conversation with an AI that owns agency equal to us, is fundamentally…
ytc_Ugy3hqFw1…
Social media platforms: "Bot accounts are bad, they fool people into thinking th…
rdc_m5litmy
AI will replace jobs. Then companies will realize it isn't nearly as good as hum…
ytc_UgzDB9JJy…
Every video that criticises the shortcomings of "AI coding tools" will be used t…
ytc_UgzyjBbrF…
I suggest searching up fair use laws and reading them word by word. What has hap…
ytr_Ugy6TAmqS…
Just a thought: what would the world look like if every country (with nukes) gav…
rdc_dkzxr7c
There's no "gift" or "talent" or "blue blood"??
Sure, some people learn faster …
ytc_UgxRE1CYe…
Comment
Honestly I think this is a poor argument. Presumably the technology will eventually get good enough to handle trucking reliably. Maybe not this year, or even this decade, but some day. As such, regulating an industry entirely to preserve jobs is a losing proposition. The same applies to all industries exposed to automation. Do you try to halt technological progress, or do you fight for a system which won't leave people completely screwed when their profession gets automated?
It appears to me a lot of Americans chose the former, because to them it seems impossible to change the current system in which people are treated as interchangeable and expendable machines. Easier to preserve their meagre role within the same old system. I just don't think this Luddite path will lead anywhere good. Historically it has not been successful for long.
youtube
AI Jobs
2025-05-30T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxAowOMFeWOsj32u614AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwotJKqgsBkvi6nfDV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxknmZeBLmY5MUgdFt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwihutyi6Lx46liCbt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwX55gOPCo-4veEjSp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwvQWwoLBN_pv3NfGd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwNcHQH_1ps7ytWFux4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzJm9GU-xoPi3grlcx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz2zvlTag2jp4dkLeZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzEgooKL1jn_n1xxVB4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
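Since the raw model output is a plain JSON array of records keyed by comment ID, looking up the coding for a particular comment (as the lookup-by-ID feature does) reduces to parsing the array and indexing it. A minimal sketch, assuming only the standard `json` module; the two sample records are copied from the array above:

```python
import json

# Two records copied from the raw LLM response above.
raw = """
[
  {"id":"ytc_UgzJm9GU-xoPi3grlcx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz2zvlTag2jp4dkLeZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
"""

# Index the coded records by comment ID for constant-time lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw)}

record = codes_by_id["ytc_UgzJm9GU-xoPi3grlcx4AaABAg"]
print(record["responsibility"], record["emotion"])  # distributed resignation
```

The same index supports the "Coding Result" view: fetching one record by ID yields exactly the dimension/value pairs shown in the table.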