Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The whole concept is doing more harm than good in multiple ways. It's a perfect…" (`ytc_Ugx3FskHT…`)
- "I like that video actually points out that ai art isn't as good as human art. B…" (`ytc_UgwykXD5Q…`)
- "What I think is crazy is how we all just look at this and go "at least it's not …" (`ytc_Ugz1AziX9…`)
- "Humanity as a whole is incapable of not building dangerous things. We've got a …" (`ytc_UgyFCLLMu…`)
- "With there beeing no real cyber security crime investigations AI is a huge probl…" (`ytc_Ugw1VI0qM…`)
- "The solution is pretty simple too. Only issue is that it doesn't make money for …" (`rdc_fwizfet`)
- "Tell me can Ai deal with an emergency situation where the patient is losing a lo…" (`ytc_UgzvgRLJ9…`)
- "As a fanfic writer who wants to make this a full career, I am 100% against gener…" (`ytc_Ugzb4vdQ_…`)
Comment
It’s probably gonna take at least 15-30 years before truck drivers really start losing their jobs to AI. A lot of folks don’t really get what’s going on… Aurora is just working on building and training an AI to get better at driving vehicles. They’ve still got a long way to go on that. But yeah, eventually it will happen. On another note, a ton of our roads and transport systems would need some serious upgrades to handle AI vehicles, especially big rigs, before people start feeling any real pressure. These infrastructure upgrades are still decades off! These people are still just trying to sort out server space and power issues to keep these AI systems running! 😂
youtube · AI Jobs · 2025-05-28T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyJAwt9bj_gbxGFDOt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxmhpewKSINH91CejF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyS7c7Tjg2hSZYpaY54AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwFV-IFOI4YmOKjkVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz8ad5fz7s8Awyr6Zd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwnaFbbgEJw4x7uPRV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxtJ6JRrcJab9_gKnN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwVmxL0f5XHPE4QEH54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzYF0D9IjSjCJ6eD_F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyiaRuzeA9oZQIV3Q94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
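The raw response is a flat JSON array, one object per comment, with the four coding dimensions as keys. Before writing codes like these to a database, it is worth parsing and validating them against the codebook. The sketch below is a minimal example of that check; the allowed value sets are only those observed in the response above, and the real codebook likely defines more categories, so treat them as placeholders.

```python
import json

# Allowed values per dimension, inferred from the raw response above.
# Assumption: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "government", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every code against the codebook."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: invalid {dim!r} value {row.get(dim)!r}"
                )
    return rows

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"resignation"}]')
codes = validate_codes(raw)
```

A check like this catches the common failure mode where the model invents an off-codebook label (e.g. "anger" instead of "outrage") in one row of an otherwise well-formed batch.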