Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a specific comment ID or by opening one of the random samples listed below.

Random samples
- Wait, the cars not self driving? (/s, just tryna help them out with the lawsuit)… (ytc_Ugwq6KiRE…)
- There should be laws put in place to force people who use AI to state it's AI. … (ytc_Ugw_mtg58…)
- This “large startup” let 90% of its staff go, which was 23 people. So total sta… (rdc_jrp0xxz)
- No one will tag it as AI art, since they want to be credited for something they … (ytr_Ugxbv5XZd…)
- Human laws won't matter to AI, it's already past the point, no doubt there are A… (ytr_UgwHXHk9y…)
- This is a sci-fi fantasy video. Nothing you are saying is real or correct. There… (ytc_Ugy-EMqRp…)
- Hi! Disabled artist here— my fingers are constantly subluxating, and creating ar… (ytc_UgxTFo_fx…)
- It is difficult to give the scientific assurance to use AI to our children, in m… (ytc_UgyHDda8m…)
Comment
> Is this journalism in the US nowadays? There seems to be a single viewpoint only. Also no details on what accidents are with fsd and which are on autopilot.
>
> I find it hard to form an opinion on this. The argument of too much confidence on the system is compelling. But I know the cars are abnoxious about checking if the driver is paying attention on the road.
>
> The conclusion of this piece seems to be that the self driving cars in any form should only be allowed on level 5, which do not yet exist. The same argument could be made against just cruise control, which has lead to plenty accidents as well. I know of no comparison on safery between these systems.
>
> This video didn’t help me in any way on learnig more on this subject.
youtube · AI Harm Incident · 2024-12-14T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxC-iDt5UtHNFqW3794AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwV1UdqIxGX1EgM0i54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyX7mh_NtiT0pF9wQx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwC9HvADXR4N3F7zgp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz9Tm7Jb1jmH5KB2PB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzxehnkNb7E6xnNhJB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwvO7ugcavddqY9rvp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw7gjfOOEbNLrZ4Hbd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxYAX32t4lj0wZ_bfN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw2XIfX3uCSzJzQO-t4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
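The raw response is a JSON array with one record per comment, keyed by id and carrying the four coded dimensions (responsibility, reasoning, policy, emotion). As a minimal sketch of the look-up-by-comment-ID step, the snippet below parses a saved copy of such a response and prints the coding for one ID. The file path, function names, and the assumption that the response was saved verbatim to disk are illustrative, not part of the tool.

```python
import json

# Illustrative path only: assumes the raw LLM response above was saved verbatim as JSON.
RESPONSE_PATH = "raw_llm_response.json"

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def load_codings(path: str) -> dict[str, dict]:
    """Parse the raw LLM response (a JSON array of records) into a dict keyed by comment ID."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    return {record["id"]: record for record in records}


def lookup(codings: dict[str, dict], comment_id: str) -> None:
    """Print the coded dimensions for one comment ID, mirroring the Coding Result table."""
    record = codings.get(comment_id)
    if record is None:
        print(f"No coding found for {comment_id}")
        return
    for dimension in DIMENSIONS:
        print(f"{dimension:15s} {record[dimension]}")


if __name__ == "__main__":
    codings = load_codings(RESPONSE_PATH)
    # Example lookup using one of the IDs that appears in the response above.
    lookup(codings, "ytc_UgyX7mh_NtiT0pF9wQx4AaABAg")
```

For the ID used in the example, the printed values (responsibility unclear, reasoning mixed, policy unclear, emotion mixed) match the Coding Result table above.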