# Raw LLM Responses

Inspect the exact model output for any coded comment.
Any coded comment can be looked up directly by its comment ID; a minimal lookup sketch follows the sample list below. A few randomly drawn samples:

- "@Phoboskomboa I'm pretty sure professional artists ACTUALLY still use a pen (an…" (`ytr_UgwSx4Re3…`)
- "Umm idk Law is a very touchy subject. Would I want a robot representing me. And …" (`ytc_Ugx6bvlIn…`)
- "AI doesn't give someone the right to not pay attention. I have a Tesla myself bu…" (`ytc_UgybQLh7i…`)
- "I agree with Dan Martell and Luke Belmar. Some degrees are just a waste of money…" (`ytc_Ugw1oZ4qL…`)
- "No, humans are not neural network like these programs are. Our brains have nothi…" (`ytr_Ugy-H5EJ7…`)
- "No one really talks about the real reason behind self driving cars and trucks. …" (`ytc_UgxY057sJ…`)
- "Justine Bateman's comment is a prime example of narcissism and self-importance i…" (`ytr_UgwzxbHh0…`)
- "It seems more like unchecked capitalism than ai. I’m just cautious of not leanin…" (`ytc_UgwGYnCrq…`)
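
Under the hood, lookup by ID only requires indexing the coded records by comment ID. A minimal sketch in Python, assuming the batch output has been saved as a JSON array shaped like the Raw LLM Response shown below; the filename `coded_comments.json` is hypothetical:

```python
import json

# Assumed storage: a JSON array of coded records, one per comment, in the
# shape of the Raw LLM Response shown below. The filename is hypothetical.
with open("coded_comments.json") as f:
    records = json.load(f)

# Index the records by comment ID so any lookup is O(1),
# regardless of corpus size.
by_id = {rec["id"]: rec for rec in records}

rec = by_id.get("ytc_UgxIMlBIQRorzm6wTMh4AaABAg")
if rec is not None:
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        print(f"{dim}: {rec[dim]}")
```
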
## Comment

> Sounds to me FSD is to blame then. Yes, FSD didn't turn the wheel and did disengage but if people aren't realizing its disengaging and causing up to 55 similar types of crashes then this is a design flaw of FSD. He also could of had a mechanical faliure of the steering that caused it to turn. We won't ever really know but this crash should of been investigated thoroughly instead of just with the data from Tesla. Which could of also been manipulated. I wouldn't trust a Elon company at this point but I would like to believe that this comes down to a design failure of FSD disengagements not being clear to the user. The company should honestly be forced to not sell this as FSD as it isn't ready to be called an autopilot let alone a full self driving package. Getting away with calling it FSD and then slapping a "beta" tag at the end is honestly criminal when it endangers other road users.
Source: youtube · Timestamp: 2025-07-01T20:1…
## Coding Result

| Field | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
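
The table reflects a four-dimension codebook (responsibility, reasoning, policy, emotion). Below is a minimal validation sketch for one coded record; note that the allowed label sets are only the values visible on this page, and the real codebook may define more:

```python
# A minimal validation sketch. The allowed label sets below are only the
# values that appear on this page; the actual codebook may define more.
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"indifference", "unclear", "outrage", "resignation", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if clean)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems
```

Running `validate` over every record in a batch is a cheap way to catch responses where the model drifted outside the codebook.
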
## Raw LLM Response

[{"id":"ytc_Ugw10Xoy4qNuE9cpc454AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx3X8FQFdSli4-Sjdl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzokpawTUcjAPyf1Bl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlkAGkLHZXT8LPuX94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyTWRoNsxkywtp_30F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEVW-qrOfvbMYPhnN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzWwWVbrR-lvyiP-y54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzNhlAR37OWIvS-mrh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxIMlBIQRorzm6wTMh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyovj8hZzirkx9vfql4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"}]