Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@bapa39 If you were to pit a person who’s sole source of information was ChatGPT…
ytr_Ugzb3UHvH…
> Leo has been on his shit about climate change for a fucking decade
While g…
rdc_esqbv0u
There is no AI race because there is no finish line, the space race was build ar…
ytc_Ugw2khM2h…
it would be hilarious if chatgpt lost its shit and started ranting at you like, …
ytc_Ugz4vrDyL…
So the jobs which need to be automated like manual physical labour work is not a…
ytc_UgzG7eGEQ…
ChatGPT: categories need some flexibility room due to culture
Also ChatGPT: A …
ytc_Ugymelgk6…
You bring up a great point! Wisdom indeed encompasses much more than just data a…
ytr_UgzJHmqgE…
Cbt self guided courses
The ai.can answer subsequent questions
It could have a…
rdc_jidk9f5
Comment
I'm sorry for the pain and loss that resulted from this accident, but the fault is with the driver, not the car. In 2019 Tesla did not claim that autopilot would work on local roads. It definitely didn't claim it would stop at a stop sign. At the time of the accident the driver was looking for his phone on the floor of the car, not at the road. Having said that, there is no excuse for Tesla hiding accident-related data. However, Tesla has never claimed their cars are currently fully autonomous. While this documentary focuses on Tesla accidents, Tesla's FSD Supervised has prevented many many more accidents than it has caused. I speak from personal experience.
youtube
AI Harm Incident
2025-11-05T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwEsJDUid0UEZfG5q14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyhLhyvJv0pvbG_1ch4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxXrs86U4Od9huckHV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy3SeLEcZq3fYJY5914AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwG7lLPDd8pjU4odBx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzccjQoEiJHLn-pTj54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgyUiIvVgcgyDk7nUp14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwPcKFBfxoHtFktAtd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzdkFwAPdnEEajdAZZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwv7C45-tWaBNvBkR94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
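The comment-ID lookup described above can be sketched in a few lines: the raw LLM response is a JSON array of coded rows, and finding one comment's dimensions is a parse plus a dictionary lookup. This is a minimal sketch, not the tool's actual implementation; the `lookup` helper and the two-row sample payload are illustrative, assuming only the JSON shape shown in the raw response.

```python
import json

# Hypothetical sample payload, shaped like the raw LLM response above
# (an array of rows keyed by comment ID, one per coded comment).
raw_response = """
[
  {"id": "ytc_UgxXrs86U4Od9huckHV4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy3SeLEcZq3fYJY5914AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def lookup(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    codings = {row["id"]: row for row in json.loads(raw)}
    return codings.get(comment_id)

row = lookup(raw_response, "ytc_UgxXrs86U4Od9huckHV4AaABAg")
print(row["responsibility"], row["emotion"])  # user outrage
```

Indexing the array into a dict keyed by `id` keeps repeated lookups O(1), which matters when the same batch response is inspected for many sampled comments.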