Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "@armstrong_ogkhud [Hindi: it will have to run on a PC, but that needs a powerful PC, or else on the cloud …]" (`ytr_UgwJf7lY5…`)
- "I saw someone argue that ai is capable of symbolism bc it can copy the symbolism…" (`ytc_UgyB8gKun…`)
- "In situations where staff are using mouse jigglers to fake productivity, I think…" (`ytc_UgwAFtbBp…`)
- "Do we know what art generator they were using because I know people that have mi…" (`ytc_UgzeOFfVF…`)
- "Most people shouldn't worry so much about AI. Once the AI bubble bursts. Corpos …" (`ytc_UgxMdQG1H…`)
- "Ai 🤖will free all of us in a few years ....there are different ways to implement…" (`ytc_UgyBe--i0…`)
- "Just some facts : → Technology is always making jobs obsolete. It happened befor…" (`ytc_Ugwx_eENi…`)
- "Her sight is sweeter than many I see on TV day by day!!!! Shame on us in these f…" (`ytc_UgxlnRw6q…`)
Comment
Tesla removed the radar since they could not mux the data with the vision sensor data. Cost saving is just stupid. Tesla charges over 15K dollars today for an autopilot that is like 8 2 megapixels cameras and an AI chip that is in the same class as an iPhone chip. The hardware costs under 200 dollars in real life. Imagine driving on the highway at 60miles an hour / 100KM hour and there is an obstacle 200 yards ahead. The cameras have to guess what is from about 4 pixels of data. I would say that is impossible. Human eyes are not 2 megapixels. Autopilot won't work with this hardware in real FSD. Loosing the radar was not cost saving, but incompetence that they could not solve the data input.
youtube · AI Harm Incident · 2022-09-05T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxRApHhC7pNHbTXwLd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzXHwffMfIHJ-zeKmR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyVUMUGeiE6E9Kee5R4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxwLZQhhmXpc-R-sA54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzEJLQTrd8jTfm5Akx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw1eFdXBnSi9_a9Y-p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzefrXgA4Ay-Me45394AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzwZsNy-lXfX3oalQB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy3B5SncS9EK2I4Sld4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxnHTajOhIDFNl8ztB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
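The raw response is a JSON array with one object per comment, carrying one value per coding dimension. A minimal sketch of how such a response could be parsed, validated, and looked up by comment ID — the `ALLOWED` sets below are assumptions inferred from the values visible on this page, not the project's full codebook, and `parse_raw_response` is a hypothetical helper:

```python
import json

# Allowed values per coding dimension (assumed from the examples on this
# page; the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"company", "user", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference", "mixed", "resignation"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, rejecting values outside the codebook."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage: index a (shortened, illustrative) response, then look up one ID.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_raw_response(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Indexing by ID this way is what makes the "look up by comment ID" view cheap: one parse per response, then constant-time lookups.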