Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "This is probably the best argument I've seen on the issue, surprisingly. I just …" — ytr_UgyFLgyPT…
- "I dunno man, Google has a few cars on public roads with a 100% success rate, and…" — ytr_UggQOfN1-…
- "Trains. For the love of all that is holy, feed us American autists and GIVE US T…" — ytc_Ugw-V0JAm…
- "The danger is not in AI but the liberal leftist agenda that the democrats have u…" — ytc_UgwRVQdjw…
- "So what they're saying is The Ai is gonna change the callers voice and then when…" — ytc_Ugx6Dv11V…
- "Like you I have... err... been around the block a few times lol. Software will c…" — ytc_UgyA3kT8G…
- "Actual people put time and effort into their art. And then those AI people use t…" — ytc_UgxstF2gX…
- "I think that the truly safe jobs are ones that are not worth the investment to r…" — ytc_Ugyebh25I…
Comment
Key points:
- Corporations have blamed job cuts they wanted to make anyway on AI, using trendy AI adoption to mask poor performance.
- AI companies are here to sell you a subscription to a powerful tool, but will structure it so that they receive all the benefits and none of the liability.
- Other reporting has pointed to programmers being hired, not to create new code, but to fix the mess of AI-generated code. This is far less efficient than letting people create code in the first place, and leads to worse products.
youtube · AI Jobs · 2026-03-23T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxF8PTu5iiBjH-UfQp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy75JTazYsEGn2J8Mt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyrq7NHtzV9Qbje9G54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxBLJhFPcSBkxxzUcZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwQe4KTUu4F4opia6h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw_WQ-6s7VZ4GiLvBF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxOaIK_2maAFfo1-JR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxM9tUrZ4s9XmI21DB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzEFRcuSBNsTq_tiW54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz4L_0OtZatQyNbmbt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
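A raw response like the one above can be parsed and validated before the codes are attached to comments. A minimal sketch, assuming the dimension values seen in these samples (`company`, `deontological`, `outrage`, etc.) are the allowed codebook categories — the actual codebook may define more:

```python
import json

# Allowed values per coding dimension, inferred from the samples above
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping rows
    whose values fall outside the allowed categories."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[row["id"]] = codes
    return coded

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
print(parse_codes(raw)["ytc_example"]["emotion"])  # → outrage
```

Validating against a fixed set catches the common failure mode where the model invents a label outside the codebook; such rows are dropped rather than silently stored.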