Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "If you are a construction worker you will survive this for awhile anyway till th…" (ytc_UgwqlOGGx…)
- "Eliezer, at the least, has been advocating towards provably aligned AI well befo…" (ytr_UgzsZXVqH…)
- "@nike5428 Formally? CS: Programming language, Discrete Math -> Algorithms. Mat…" (ytr_UgzO2PPmK…)
- "The basic error - \"I guess it is who is controlling it right?\". That is not how…" (ytc_Ugx9sZllA…)
- "It's already too late, we have drones that kill human beings, the intelligence…" [French, translated] (ytc_UgyK0v4_t…)
- "Israel used an AI system called \"The Gospel\" in Gaza to automate strikes. They w…" (ytc_Ugys8tDze…)
- "I say don’t separate it from us. I say give each of us AI. But each of us is res…" (ytc_UgwROJ_o8…)
- "If AI wipes out the working class, who’s gonna buy all the products and services…" (ytc_Ugyj7xPRt…)
Comment
As a security engineer I can guarantee Aurora and automated driving has or will be compromised resulting in a total disaster. Simply crawling the DW I found multiple accounts for Aurora employees and executives. That tells me they have been compromised and they have very poor security. If they were compromised and these trucks are on the roads I can see this turning into a terrorist act situation because it takes very little work on the attacker/bad actor side to slide a root code giving the attacker(s) remote access/execution abilities then close the back door so no one notices. If Aurora plans to push this automated driving thing they better invest "$20+ million" in real cyber security and hourly source code audits. That should be a min requirement from the US government but you know the name of the game in good ole corrupt broken America. Its not what you do its who you buy and politicians are for sale!
youtube · AI Jobs · 2025-05-29T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugyu3uor8MRcGBMjFsh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzFpJjgJQCSZQIVIUx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwPijGwIQf4mMEwW4F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxjaz8T4LgSsgP3y3l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_EjQswA5Oi88vU-p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzsYtIYSFdRibDeS6V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyLQ4NV-hcyNdCW9xZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyg_uNLiRKLtIdVODd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyz1qfjFywK4-wvrkt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz8DRJV2xUpbaAaIDh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
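The raw response above is a JSON array in which each row codes one comment along four fixed dimensions. Before accepting such a batch, it is worth parsing it and checking every row against the allowed category values. The sketch below does this; the `ALLOWED` sets are assumptions inferred from the sample rows above, and the real codebook may include additional labels.

```python
import json

# Allowed labels per coding dimension. These are inferred from the sample
# output shown above and are an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "approval", "resignation", "outrage"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response and keep only well-formed rows.

    A row is kept when it is a dict with an "id" field and every coding
    dimension carries one of the allowed labels.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid

# One well-formed row (hypothetical id), matching the schema above.
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # → 1
```

Rows that fail validation can be logged and re-queued for the model rather than silently coerced, which keeps the coded dataset restricted to labels the downstream analysis expects.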