Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Sure, but there's no avoiding that. If the US puts the brakes on AI development,…" (ytr_UgzinWPBk…)
- "Data centers are serving only evil people with evil agenda. Water, land, all pr…" (ytc_Ugwhxlgod…)
- "So some nerd decided they'd like build a robot with nuts and screws, because may…" (ytc_UgzNwnKu1…)
- "They definitely are smarter than humans- I’ve seen a thousand films warning us a…" (ytc_Ugyq1a2QG…)
- "one thing tho ..even if the AI have a data set of public domain images , it does…" (ytc_UgxyOpbo2…)
- "Yup, it's purity culture and the judgement, bullying, shaming and trauma that it…" (ytr_UgxbdwNQq…)
- "You have to be an absolute dope to think they’re going to pay you six figures to…" (ytc_Ugz48dtJJ…)
- "I don't care if Dean Ball got good grades in school. Dean Ball should not be try…" (ytc_UgwYChjyQ…)
Comment
Here you go. You asked for it.. As of early 2026, the most current data indicates significant safety concerns regarding Tesla's "Full Self-Driving" (FSD) system, particularly when it fails to detect reduced-visibility conditions or when drivers are not actively supervising the system.
Key 2025–2026 Safety Data and Investigations
Expanded NHTSA Engineering Analysis (March 2026): The National Highway Traffic Safety Administration (NHTSA) escalated its investigation into roughly 3.2 million Tesla vehicles (2016–2026 models). This "Engineering Analysis" covers FSD in low-visibility conditions (fog, glare, dust).
Documented Incidents: NHTSA has identified at least 13 crashes (as of March 2026) in which FSD was engaged, including one fatal pedestrian accident in November 2023 and other incidents involving fixed objects and first responders.
System Misuse & Failures: Investigations indicate that in many crashes, the system failed to detect common roadway obstacles or did not warn drivers in time to prevent collisions. A 2024 investigation concluded the system lacked standard protections, leading to incidents where drivers misused the system in ways that "should have been foreseen" by Tesla.
Unmonitored Risks: While Tesla continues to promote FSD as requiring "active driver supervision," a report in late 2025 suggested that recent software builds might allow drivers to take their eyes off the road for over 20 seconds, increasing risks if the system is not actively monitored.
Delayed Reporting Investigations: In August 2025, federal regulators began investigating why Tesla apparently has not been reporting crashes promptly to the agency.
Key Safety Concerns
Camera-Only Approach: Some experts and reports argue that relying solely on cameras without lidar or radar is insufficient for fully autonomous driving, leading to failures in detecting obstacles in poor visibility.
Failure to Recognize Traffic Signals: Reports filed with regulators have highlighted instances where FSD-equipped vehicles ran red lights or incorrectly stopped at green lights.
Sudden Disengagements: The system has been known to abruptly hand control back to the driver.
Source: youtube · 2026-04-15T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgwjHhB7Arb0tweKGQl4AaABAg.AVgRDFOjjPUAVo0Auke0xW","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwjHhB7Arb0tweKGQl4AaABAg.AVgRDFOjjPUAVo0l78AiH0","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgyIC1-omp9k8IVJSeh4AaABAg.AVaqbjQOGiSAVb-eAN2TPG","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgwoqrQfSsWT0rHHKfB4AaABAg.AVak9rxylr-AVb-lO0gvy7","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugzt0EYgdL_cnVCZn8F4AaABAg.AVZ8fr_FmbgAVZV6ArSMw3","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugytv2UbirMNcU-q7BN4AaABAg.AVZ2vkrG-MqAVZUbf7rp5_","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugz1wdSecgLYYCEG6DF4AaABAg.AVYa3dJi6q1AVYiyxkjRPQ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz1wdSecgLYYCEG6DF4AaABAg.AVYa3dJi6q1AVZ-HpA88cr","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UgwXxrgTbJyGtayrf9x4AaABAg.AVYISpvLIa4AVZVac5PkZd","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugwy58bG9BoBnO11H-R4AaABAg.AVYFPAKgUakAVZSmSVC8DJ","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
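The raw response above is a JSON array of coding objects, one per comment, each carrying an `id` plus the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated is below. The allowed code values are inferred only from the responses shown on this page; the real codebook may define more, and `parse_codings` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed codes per dimension, inferred from the sample responses above
# (assumption: the actual codebook may include additional values).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must be an object with an id and only known code values.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical example: the second row uses an out-of-vocabulary policy
# code and is dropped during validation.
raw = """[
  {"id": "ytr_example1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "banana", "emotion": "fear"}
]"""
codings = parse_codings(raw)
```

Validating against a closed vocabulary like this is one way to catch malformed or hallucinated codes before they enter the coded dataset.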