Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples:

- `ytc_UgwpAl3ia…` — "AI is a tool, not meant to replace people. If companies decide to replace people…"
- `ytc_UgypmE2AX…` — "The end…AI wins, majority of humans will be death and a small portion Humans wil…"
- `ytc_Ugx2HhmWz…` — "So he agreed to fighting a robot with Metal Hands?😂😂 Yeah he had a better chance…"
- `ytr_UgzQmbe9G…` — "I've used ChatGPT for 3 years. If you have, you would know it is not "smart". It…"
- `ytc_UgxTlJyH-…` — "the worst part is, he submitted like 5 pieces of ai "art" to the colorado state …"
- `rdc_ohn4yfe` — "well yea but plumber is just an example. It really is intended to mean "a labor …"
- `ytc_UgxZw9Ult…` — "What about ("Mandatory User Intelligence" Awareness), prior to downloading an AI…"
- `ytr_UgwjPS_x-…` — "@GorgieClarissaI’m saying that mistakes happen. More often from humans than not.…"
Comment
I doubt the AI can see that the driver in the car ahead or beside is agitated and erratic. Therefore likely to make poor driving decisions. Or, has their face buried in a cell phone. A human driver knows to watch that car closely even if the vehicle hasn't made an erratic maneuver yet. I'm highly skeptical that Aurora's claim of driving accidents to zero is realistic.
The AI may be able to react faster and to more simultaneous physical changes. However, it can't predict human behavior, but a human driver can. A human has a sense of prescience because he's been a human forever and can likely predict other humans much more accurately than any AI.
Insurance fraudsters for example will do their best to have a field day with AI trucks. They may or may not even get away with it, but there will inevitably be collateral damage.
youtube · AI Jobs · 2025-05-28T19:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzDOHhxX29omNX39hF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy67qCQ6zk0hyHVB6R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwBydwcpG1KUBRsLTd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxxZ7DDgbQEy_UfEBJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxWd8BqnReOw1LcwuZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyC3jcHwnM34nCzl_94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyssjCGfTpiyM5p_sd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxs6eZziVzCCrUPEyR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyMTzEYR2oUTIkXWkB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy2P69fAfzYOnqz61x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
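The raw response above is a plain JSON array, one object per coded comment, so the "look up by comment ID" workflow reduces to indexing that array by the `id` field. Below is a minimal sketch of that lookup; the variable names are illustrative, and the two entries are copied from the response above (truncated IDs elsewhere on the page would need the full ID to match).

```python
import json

# Raw LLM response: a JSON array of coding objects, as dumped above.
# Only two of the ten entries are reproduced here for brevity.
raw_response = """
[
  {"id": "ytc_UgzDOHhxX29omNX39hF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwBydwcpG1KUBRsLTd4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by its full ID.
coding = codings["ytc_UgwBydwcpG1KUBRsLTd4AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# → developer liability fear
```

The same dict can back the random-sample view: pick keys with `random.sample(list(codings), k)` and render each entry's dimensions next to the comment text.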