Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- One of the members mentioned https://janesdueprocess.org/ - specifically for min… (`rdc_hbnfcwx`)
- But with AI it can't. There's nobody to keep it alive. It takes people for build… (`ytc_Ugw5Zq7UC…`)
- AI not the problem LIBERAL MEDIA IS THE PROBLEM. Y'all been spitting lies for ye… (`ytc_UgyYhdtnJ…`)
- Maybe if AI consistently and reliably become racists and anti Semitic then perha… (`ytc_Ugzr2htxa…`)
- @TheDaringCreatives as one who makes the machines; good. Less loss, less mistake… (`ytr_UgzHPhAzm…`)
- Yes, I agree with you but I’m mean babysitting that’s nuts and crazy and insane.… (`ytr_UgyNDDvRj…`)
- can't say i like AI art to any extent, however i also don't get the wild obsessi… (`ytc_Ugw-FtRQv…`)
- If AI companies ever plan to pay for the training data.. I would imagine a reall… (`ytc_UgzSJedXi…`)
Comment
This feels a bit like a soft hit piece. Certainly I don’t think the word autopilot is that helpful; call it “driving co-pilot” or something. But the warnings and information are that you need to be alert, and it tells you if you aren’t. So old mate who crashed into the truck had 19 prior warnings.
People will die with this technology. Yes, sure. But people will die on the road every year for a multitude of reasons.
Of course investigate, find out what needs to be improved. And yes, full self driving is not full self driving, so change that name as false advertising.
But don’t stop autonomous driving technology because people will get killed. Stop cars then, because people will get killed. The benefits to humanity of having a strong autonomous driving solution are so great, we have to allow society to take some reasonable risks in pursuit of it.
How many aircraft crashes have there been in pursuit of that technology? What about electricity? It’s a series of tradeoffs.
youtube
AI Harm Incident
2024-12-16T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwZ0QMTSLn_IXsWIW14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzlaUE37fErvz1Gfr94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzLBnJ89jmexNedIr54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzxKPN83yn6Rg8kGWx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtBXCZJavN2pIlLaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2-aGI8lOYbF7rDKh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRqCVrDFG1dfi8HMx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwD-BfjOnjAlnzsQ3B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyvFlVbzmeNXX6pOwN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx3s5m15eyDiF4bDqx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
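The lookup flow above — mapping a comment ID to its row in the raw batch response — can be sketched in a few lines. This is a minimal illustration, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the example ID come from the raw response shown above, while the variable names and the abbreviated two-row payload are assumptions for the sketch.

```python
import json

# Abbreviated stand-in for a raw batch response like the one above
# (two of the ten rows, values copied from the source).
raw_response = """
[
  {"id": "ytc_UgwZ0QMTSLn_IXsWIW14AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzLBnJ89jmexNedIr54AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# Index the batch by comment ID so any coded comment can be inspected directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment whose coding result is shown in the table above.
code = codes_by_id["ytc_UgzLBnJ89jmexNedIr54AaABAg"]
print(code["responsibility"], code["emotion"])  # user resignation
```

The printed dimensions match the Coding Result table for that comment (responsibility `user`, emotion `resignation`).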