Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_UgykdydlC…: Yes jailbroken AI is real - we will be covering it on the next episode! Thanks …
- ytc_UgyxNYcmj…: I’m the kind that just wants to feel a connection so gos on character ai but whe…
- ytc_UgzXpITc2…: In an argument about this Ghibli slop, someone unironically said to me "Blanket …
- ytr_UgxeZfjvM…: *AI has no true intelligence to it.* To be pedantic, isn't that what the "artifi…
- rdc_mywuwu5: Bingo. Fictional AI being tested in the ways it can recognize is in it's trainin…
- ytc_Ugzoc9bPO…: I think (HOPE) that when the dust settles AI will become an art form like the …
- ytc_UgwzLlXz_…: Engineering Manager here. I agree with a lot of what's in this video. I've been …
- ytc_Ugw6aX11q…: Musk has no moral compass….but he can’t say if Altman does because he doesn’t kn…
Comment
I have worked in the industry for 28 years. I can tell you this will never work / get regulated; the potential for HIGH DOLLAR lawsuits WHEN (not if) one of these trucks kills someone is too much for any Company that might want to use these trucks in large numbers. And any politician(s) who ever does vote to regulate FULLY driverless vehicles for MASS usage, will have their career(s) ended too WHEN those deaths occur. In short: any money saved by not paying drivers will be more than lost from either paying the premiums to insure these trucks and/or from liability lawsuits. This would only ever work if the trucks could run on their own private roadways all the time, which is impossible.
Source: youtube · AI Jobs · 2025-05-29T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzu5Afq5SVce0Jxue14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwLJ4ZK11xhl1gGVSN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz-ix4c1eTKYt_FVsl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwUMx1khVr_HWaq3Et4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyymPSNPc9wgaW8Y8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgylZ1BSplB2jLSfmgd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx9Npq2gAzWU9vE1114AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyONbweqXFAq6dqPUh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxcyrnSDhAmJhCgiqp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxgympzrf5mBFIqRwJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
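The raw response is a JSON array of per-comment codes across four dimensions. A minimal sketch of validating such a response before storing it; the allowed value sets below are inferred from the codes visible on this page and may be incomplete relative to the full codebook:

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# codes shown above and are assumptions, not the authoritative codebook.
SCHEMA = {
    "responsibility": {"company", "government", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with an id and known values."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical ids for illustration; the second row has an unknown value
# ("alien") and is dropped by the validator.
raw = (
    '[{"id":"ytc_x","responsibility":"company","reasoning":"deontological",'
    '"policy":"liability","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"mixed",'
    '"policy":"none","emotion":"approval"}]'
)
print(validate_codes(raw))  # only the first row survives
```

Rejecting rows rather than raising keeps a single malformed code from discarding the whole batch; rejected ids can then be re-queued for recoding.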