Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I watch videos while i eat my dinner and GOD this guy was like one of the worse … (ytr_UgxMN5rcI…)
- Step 1: Learn to read a tape meaure. Step 2: Find anyone with a contracting busi… (ytc_Ugw-eNCSA…)
- You know those AI CEOs are mostly doing PR (they have been singing this song for… (ytc_UgxhOogp7…)
- I've owned a tesla for 3 yrs now. Auto pilot is a glorified lane assist. Full se… (ytc_Ugz9GfauA…)
- It’s kind of karma cause I know everybody around me they had an Amazon Job was t… (ytc_Ugz6qUAaP…)
- @AlexMorellon Notice how "development of an artist" and "what they learned" aren… (ytr_Ugx0Gx49s…)
- It's ok to use ai for fun occasionally. It's not ok to claim that your an artist… (ytr_UgxqJn2CX…)
- I did not like it when the robot in the car said "I'm alive, as he is a human be… (ytc_UgzzgH1Sv…)
Comment
This video failed to address whether the autopilot has lower or higher chance of accident than human driver. They also didn't discuss if the crash would be avoided if it wasn't a Tesla. I doubt that any human would react in time in the case of the wrecked pickup on the road at time 7:06. It is pitch black and the turned over pickup is also black and hard to see. (edit: Humans have better eyes so yes, maybe this crash would be avoided by human in different car. But anyway, it should be discussed in the video in order to report objectively)
There should be an investigation and yes, there should be some regulations of self driving cars. But cherry picking few exaples is not an evidence of large scale issue. There are studies that Tesla is possibly more dangerous than most of regular cars - use this as your argument, not some random crash.
youtube
AI Harm Incident
2024-12-14T12:4…
♥ 737
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgywLyhVwqbkuS6hkm14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyditRcYjWatJsOl_x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw0ZlHkItuL0CykrAd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwOUdMm5cm5I5vtwZd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyf3gwFzSSCcR0p1IZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxm7Z4tGKhQXUhYU1N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwcmeUn6GCDbf_7dh14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyfm9KMwPxRx1G1aRt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzRJBNNrrsP31E_vh14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzjnj4Eu3HO8aBUVfd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
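A raw response like the one above is a JSON array with one record per coded comment. A minimal sketch of parsing and sanity-checking such a response is below; the allowed values per dimension are inferred from the samples shown on this page, and the real codebook may define more categories, so the `SCHEMA` sets here are assumptions:

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# visible samples and are illustrative assumptions, not the full codebook.
SCHEMA = {
    "responsibility": {"none", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in the samples start with "ytc_" (comment) or "ytr_" (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgywLyhVwqbkuS6hkm14AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codes = validate_response(raw)
print(len(codes))  # 1
```

Validating up front means a malformed or off-schema record fails loudly at ingestion rather than silently corrupting the coded dataset.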