Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Saggy keep saying "too good to be true".
How is effectively maligning black peop…
ytc_UgypnaQB1…
Haha, that would be an interesting concept! While having a robot president might…
ytr_UgyskzGgE…
AI being more intelligent than humans is the thing that scares people the most b…
ytc_UgzSRcWOs…
_And it's not just that AI will spare you when the apocalypse comes._ That's goo…
ytc_UgyTkVNw1…
Given a huge codebase (like tens of thousands of source code files) of an app, s…
ytc_Ugzw6RNLd…
Poor lex doesn't even comprehend the negative psychological and behavioral impac…
ytc_Ugz8ICTkL…
Thank you for sharing your opinion! It's interesting how AI in movies like "Eagl…
ytr_UgxBsKLGp…
I’m so sick of seeing AI clips on Facebook, it’s 99% of stuff uploaded, and all …
ytc_Ugw0E2yq7…
Comment
A self driving truck can't do a VI there aren't censors on ball joints and a few other components of a truck. and fueling isn't AI yet either. You would have to build truck terminals every 50 or so miles apart where a human can do these things. Flatbed loads require a load check about ever 150 mi. and there to many variables when it comes to Over sized loads Airlines are mostly AI but they still have pilots. A 747 can fly and land it's self but the pilots are still there as safety backup. The Idea of fully AI trucks putting drivers out of a job is more then a decade if not 2 away.
youtube
AI Jobs
2020-01-01T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgzKobM3NiED2mhCyEd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy--17hGw1fiZLoN3t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4sE9jXXw55cH25o94AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwt8xfdCnW4I2bocPB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwEnERSTKOnNO8QXGx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx-anXLywjIuJisHJN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwZZh6gxmn7zsBZ6md4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw0Otd5fmJG3ZJVkgp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwjEaZ_McaiKcDQwoV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxypTIRhca2oslZhKR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
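A raw response like the one above can be parsed and validated before the per-comment codes are stored. The sketch below is a minimal example, not the tool's actual ingestion code; the set of allowed values per dimension is inferred only from the responses shown on this page and is likely incomplete.

```python
import json

# Allowed values per coding dimension, inferred from the sample responses
# on this page (assumption -- the real codebook may define more values).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "approval", "outrage", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError on malformed JSON or out-of-vocabulary codes, so a
    bad model response can be flagged (e.g. shown as "unclear") instead
    of being stored as if it were a valid code.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"response is not valid JSON: {exc}") from exc
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim}: {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_UgzKobM3NiED2mhCyEd4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
coded = parse_coding_response(raw)
```

Indexing by ID this way also supports the "look up by comment ID" view: `coded["ytc_..."]` returns the four dimension values for that comment.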