Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "how does a driverless semi fuel up? What happens with regards to safety, on a sn…" (ytc_UgwHjD77g…)
- "Legislation is needed to require any material or created subject to identify its…" (ytc_Ugwkw1dO0…)
- "GCP provides Anthropic with compute for training and inference. Comes back to th…" (rdc_oi2minv)
- "these measures are rarely aiming for a 100% prevention rate. 95% is enough. I do…" (rdc_mwx47px)
- "So apparently the story says that police department sent a blurry grainy photo t…" (ytc_UgyUoqD4E…)
- "You shouldn't be asking AI for information period. No matter how you prompt it, …" (ytc_UgyZnoBwR…)
- "There are no such thing as self driving cars... Autonomous, yes, but self drivin…" (ytc_Ugw4BYkLv…)
- "Yeah that's how I use ai though it still basically types all the code in the edi…" (ytr_UgxX7Vdhs…)
Comment
Interesting video. But already at the beginning you've established a situation that would NEVER happen if in fact that that car was automated. The car would NEVER get close enough where if something were to happen it would not be able to react as SAFELY as it could have if it had done something differently prior(not tailgate). If an automated car puts itself in a situation where IF the car in front's cargo were to loosen and fall, it's only options were to swerve right and hit a motorist or swerve left and hit a motorist, what a poor control system that would be. As a controls engineer at Ford, I can point out a few things that would not happen if in fact the self-driving car would be made for regular use. First we have to establish whether if at that time the roads allow self-driving cars AND regular manual cars to drive on the road together or if ALL the vehicles are automated. if it's case number 1, the automated car would have to take extreme pillow safety factors to take into account other drivers' "human error" and the unpredictable (tree falls onto the road). If it's case number 2, forget it, nothing will ever happen lol! The only way that accidents would happen in an automated world, would be if people did something wrong and against regulation or law (not properly securing cargo). Although these are great thoughts to think on philosophically and debate on, but in practicality, I guarantee that when car companies roll out automated cars in mass, there will be nothing to debate about. I'm not saying there WON'T be accidents, but the only ones that would be, would be freak accidents that were not taken into account by the controls engineers of that car and those would be far and few.
| Field | Value |
|---|---|
| Source | youtube |
| Category | AI Harm Incident |
| Posted | 2017-06-23T03:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UggttszQdOIT0XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugjeinq77JWQDngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugi9KaGK7Pz36HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjT75c_hYfYXngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghemGmrqvMpTHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugg9VdcEV0AJu3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgiMecIFIV9nLHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugio-4_UVi4xP3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugh2ii7t431fIXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UggBZZ06kKai4ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
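
The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch response might be parsed and validated before storage — note that the `ALLOWED` value sets below are inferred from the values visible in this one sample, not the project's actual codebook, and `parse_coding_response` is a hypothetical helper name:

```python
import json

# Allowed values per dimension, inferred from this sample response only
# (the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"outrage", "indifference", "approval"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw batch-coding response into {comment_id: codes},
    rejecting any value outside the known dimension vocabularies."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded
```

Validating eagerly like this catches malformed or off-vocabulary model output at ingest time, before a bad row silently skews the coded dataset.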