Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “my sister likes ai art a lot, she took pics of herself, generate it into ai and …” (ytc_UgwOC-As0…)
- “Im definitely gonna have to look more into this. Im slowly on my journey to beco…” (ytc_UgzSPn8k2…)
- “❤ bring on AI robot companions! I’m all for it! Screw traditional marriage, dati…” (ytc_UgxcCOwA1…)
- “Until ai learns how to sell Crack my job is safe from being taking over…” (ytc_UgxOqGfGt…)
- “yep just ask AI what it will do to humans and you won't like its answer😉 and the…” (ytc_UgzgIYkZ1…)
- “The issue is that art is being taken and used to train and test models without t…” (ytr_UgzCdHWyb…)
- “I'm totally in favor of driver-optional trucks and cars! I'd love to get in my…” (ytc_UgwGwBNZK…)
- “I remember getting into an argument with some loser who claimed that “although c…” (ytc_UgzND99xz…)
Comment

> If you ask any AI about fully automated trucks, the AI points out that AI cant drive trucks in many situations. This includes heavy rain, fog and snow. AI cant mimic human intuition. Anyone who drives a Cascadia or Pete 579 with ACC (Adaptive Cruise Control) knows how much if fails. In a little rain, your ACC will cut out and will only work on speed. If safety was really the concern, instead of making automated trucks, they would be working on an AI co-pilot that could work with the driver. Another tool for the driver. As for Automated trucks.....it's a disaster waiting to happen.

youtube · AI Jobs · 2025-05-29T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyu3uor8MRcGBMjFsh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzFpJjgJQCSZQIVIUx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwPijGwIQf4mMEwW4F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxjaz8T4LgSsgP3y3l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_EjQswA5Oi88vU-p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzsYtIYSFdRibDeS6V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyLQ4NV-hcyNdCW9xZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyg_uNLiRKLtIdVODd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyz1qfjFywK4-wvrkt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz8DRJV2xUpbaAaIDh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
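Before rows from a raw response like this are trusted downstream, they should be parsed and checked against the codebook. A minimal sketch in Python, assuming the value sets observed in this batch (the real codebook may define more categories than appear here) and a hypothetical `validate_batch` helper:

```python
import json

# Value sets observed in the batch above -- an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "approval", "resignation", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes are all known."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Collect any dimension whose value is missing or outside the allowed set.
        bad = [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]
        if bad:
            print(f"{row.get('id', '?')}: unexpected value for {bad}")
        else:
            valid.append(row)
    return valid
```

Rows that fail validation are worth re-prompting or hand-coding rather than silently dropping, since systematic off-codebook values can signal prompt drift.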