Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- I enjoy your interviews, usually. :) One thing I would like to point out, is B… (ytc_UgxXsLVuB…)
- funny stuff is ive seen that happening without ai already.just took some esoteri… (ytc_UgzgUe6Zw…)
- I had to re-read and re-re-read that 2nd comment to try to understand what this … (ytc_Ugziwvz4B…)
- @davidharrow9025 these have nothing to do with AI, but with the problems associa… (ytr_UgwKtRuB9…)
- Every instance of generative AI I have seen is giving incorrect answers to basic… (ytc_Ugx4E3fVb…)
- You know, dare i say it, AI is not outright bad. Its these idiots who use it wro… (ytc_Ugytkv3k_…)
- Ah yes, the classic "We built the Cube of Apocalypse from the novel "DON'T BUILD… (ytc_UgwtYa7kH…)
- I think using Asmongold and XQC to argue against may have lowered the bar too lo… (ytc_UgyMI1Ahx…)
Comment
I get that automation like Aurora's self-driving trucks will impact a lot of jobs, but I think it's ultimately a good thing for the country. Lower shipping costs mean cheaper goods for everyone, and autonomous trucks can run 24/7, improving efficiency and safety. Yeah, it's disruptive just like when we replaced lamplighters or telephone operators but over time, new industries and opportunities emerged. The real issue isn’t the technology, it’s how we help people adapt. We should focus on smart retraining and support, not fighting progress.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Jobs |
| Posted | 2025-05-29T02:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugw1-0D4hdDrheMCS3x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxdlzE_ekv24EBAA0l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxG-V4eFsj0eFx35Np4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgzbGfVgh-EmNA6yjAR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwwn-UxDjb0v5oL7cZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxEHvDJMz85wK5sPOB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx8PM84lo6Kr_OkqnB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz8MTOHngSKZNnCjSh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugxdb-ONHKai7tvmwXF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy4VMaSZkbr_nxjNCp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```
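The lookup-by-ID step can be sketched as below. This is a minimal illustration, not the tool's actual implementation: the `lookup` function and the two-row `raw_response` are hypothetical, the dimension names are taken from the Coding Result table, and the fall-back-to-"unclear" behavior is an assumption consistent with a comment whose ID is absent from the model's batch response.

```python
import json

# Illustrative raw batch response, truncated to two rows; the IDs and
# values are copied from the sample response above.
raw_response = '''
[{"id":"ytc_Ugw1-0D4hdDrheMCS3x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgxdlzE_ekv24EBAA0l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]
'''

# Dimension names as they appear in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID.

    Assumed behavior: when the ID is missing from the batch, every
    dimension falls back to "unclear".
    """
    rows = {row["id"]: row for row in json.loads(raw)}
    row = rows.get(comment_id, {})
    return {dim: row.get(dim, "unclear") for dim in DIMENSIONS}

print(lookup(raw_response, "ytc_Ugw1-0D4hdDrheMCS3x4AaABAg"))
# An ID not present in the batch yields all-unclear:
print(lookup(raw_response, "ytc_missing"))
```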