Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
As of the moment I am writing this comment (September 4, 2022), Tesla Autopilot and Tesla FSD Beta are not autonomous and require the supervision of the human driver. Every time either driver-assistance aid is enabled during a drive session, the driver is instructed to keep their hands on the steering wheel and to be prepared to take over at any time.
What this means is that the driver is still 100% in control of the vehicle's actions. They are responsible for those actions and are using the software to help them drive the vehicle.
Knowing this but NOT understanding it is what is responsible for these otherwise avoidable deaths. Both humans driving their Teslas in these instances are responsible for the motorcycle deaths. What is debatable is whether or not they understood the proper and responsible use of the Tesla driver-assistance aid.
Tesla Autopilot is a combination of Self Steering and Traffic Aware Cruise Control.
Tesla FSD Beta is much more complex and is still in development (hence the name Beta). Tesla is working towards the day when it will be able to request regulatory approval for a fully autonomous solution, widely expected to be known simply as FSD at that time (without the Beta label), and Tesla will be required to demonstrate a level of safety that far exceeds the level of human statistical crash avoidance. It is worth noting that even a subjective statement like the one I just made ("exceeds the level of . . .") is not the same as saying that FSD will avoid each and every accident.
Tesla's present-day driving aids, properly understood and responsibly used by drivers, have already saved lives, avoided crashes, and prevented injuries many times, often with the driver unaware of the vehicle's action.
I think people who misunderstand the various Tesla driving aids and dispense false and misleading information, like this video, no matter how well presented or well intentioned, are dangerous. Please understand the import and impact your opinion has when you present facts that are partially correct but not completely accurate. When your opinion is presented as 100% accurate yet it is not, you are doing a disservice to those who also do not understand the subject matter, and you become part of a larger problem that perpetuates ignorance of Tesla's driving aids.
For those who care to take a look and improve their understanding of Tesla Autopilot, I have included a link to the Tesla Model 3 Owners Manual.
https://www.tesla.com/ownersmanual/model3/en_us/
Platform: youtube
Category: AI Harm Incident
Posted: 2022-09-04T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzWdT4mc3d8Jolis5p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzojagQRI8dvvgXVN14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwOL9EW9wlXQ_gInaJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxJlOgtCEvnRR0Yx954AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgweeGtJlCvA6BiZqg14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxpG046KBU0OJZe9T94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyT5j9z_Wn9y2z0CMR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwPc8BLw_4poaCMxgR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyfeVZylfa6rdKAnEZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxrEsn-PEWpPzBY5l94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
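The coding-result table above is one row of this batch response: the model returns a JSON array with one object per comment, and the row shown is the entry whose `id` matches the inspected comment. A minimal sketch of that lookup in Python, assuming only the field names visible in the response (the `index_by_comment_id` helper is illustrative, not part of the actual tool):

```python
import json

# Raw model output: a JSON array of coded comments. Field names match the
# schema shown above (id, responsibility, reasoning, policy, emotion);
# only two sample rows are reproduced here.
raw_response = """
[
  {"id": "ytc_UgxJlOgtCEvnRR0Yx954AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzWdT4mc3d8Jolis5p4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model's JSON array and index its rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxJlOgtCEvnRR0Yx954AaABAg"]["emotion"])  # indifference
```

Indexing once into a dict makes every subsequent comment-ID lookup O(1), which matters when the same batch response is inspected repeatedly.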