Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- > Africa is the future of manufacturing
  What do you base this prediction on?… (rdc_et7jzxo)
- Random 5k race included at 16:10 to showcase...police? Is b-roll for these vide… (ytc_UgxoYjxYQ…)
- If you understand how chat GPT, or any AI, works then you'll understand that thi… (ytc_UgxsCGo8K…)
- But OpenAI will die soon. And then Micropenis and Google will give up on this AI… (ytc_Ugzy792k0…)
- THIS is why AI generation is a terrible idea. I've only seen bad things come fro… (ytc_Ugxuym2nh…)
- AI is already destroying us. It started with the AI algorithms used by social me… (ytc_UgyGUoYUH…)
- I'm not sure if you'll see this Hank, but I think some important questions to as… (ytc_UgzCQ-iUB…)
- Notice that his base are the ones who love AI so much, because it gives them ima… (rdc_lq8p7lo)
Comment
I am not a fan of the way Tesla operates and markets as a company, and agree that their autopilot system is heavily flawed and the naming is so misleading that it should be considered fraudulent. That said, as someone who has used autopilot and "Full Self Driving" a lot, anyone who blindly lets the car drive on autopilot or FSD without monitoring is suicidal. No other way to put it. The car is absolutely incapable of driving itself safely. I've put in at least 50k miles driving a Tesla in autopilot and I would _never_ trust it to drive without closely monitoring and being ready to take control.
In the case of the Texas crash, yes the car should have seen that there was a need to slow down and stop much earlier. But, so should the driver, and ultimately it is the driver's responsibility to drive their car. The driver was intoxicated, meaning it was already illegal for him to be driving.
youtube
AI Harm Incident
2025-01-10T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwIGm-P-GlgZfLfSGJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy-ah4dWg82e2XD3AF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOO2dQ-OxJIJKb7eh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxCEtyW9TOm2turxDZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwxie838pcP0MygAph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw8_5Lo5l0a9Yn95Gt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy7hlyjlE4MHOHdLpF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuzHjeZM5ExcnbWRd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy16S_ezLM1VYIlU194AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlCVcMIH6dO6KwpB54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
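The raw response above is a JSON array with one object per coded comment, keyed by comment ID with the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed into a lookup-by-ID table, assuming the field names shown in the output; `RAW_RESPONSE` below is a trimmed stand-in, not the full response:

```python
import json

# Trimmed stand-in for a raw LLM response like the one above;
# real responses contain one object per coded comment.
RAW_RESPONSE = """[
  {"id": "ytc_Ugw8_5Lo5l0a9Yn95Gt4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwIGm-P-GlgZfLfSGJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]"""

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw: str) -> dict:
    """Parse a raw coding response into a {comment_id: codes} mapping,
    skipping any objects that are missing the expected keys."""
    rows = json.loads(raw)
    return {
        row["id"]: {k: v for k, v in row.items() if k != "id"}
        for row in rows
        if EXPECTED_KEYS <= row.keys()
    }

codes = parse_codes(RAW_RESPONSE)
print(codes["ytc_Ugw8_5Lo5l0a9Yn95Gt4AaABAg"]["responsibility"])  # user
```

Validating keys before building the lookup guards against the model occasionally emitting malformed or incomplete objects, which is a common failure mode when coercing LLM output into structured codes.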