Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
We appreciate your feedback. Could you please elaborate on what specifically did…
ytr_Ugzrz9Ooc…
Isn't it funny how AI I can build one of the most perfect f****** cars and re-en…
ytc_UgwxGQTb0…
Stuff like this is why I’m totally against ai. Giving up on your human creativit…
ytc_Ugx7lKYPz…
On a more serious note, isn't it strange that the acknowledgement of "is there a…
ytc_Ugz5dAvY9…
For anyone who does not believe it by now, AI is demonic and will control every …
ytc_Ugxv335qt…
The upside is: Artists can use AI to produce more art, and meet the never ending…
ytc_UgyyLZ3Od…
I had Chat GPT make an image of Caesar wearing a crown of golden thorns, a purpl…
ytc_UgxX5mCwR…
Being ax excellent prompt writing is a good skill. In the near future movies, sc…
ytc_Ugz8JFwYV…
Comment
Tesla's approach is moronic. To keep it simple, AI is, at its best, inferior to a really good human driver. Even a really good human driver cannot drive safely in many conditions that really occur where the right call is "I will wait till this passes before driving". Only a great AI system will vastly superior sensors including Lidar can hope to be "relied on" to be as safe as a good human driver.
Lidar, like all tech commodities, comes down in price over time, currently about $60 in chinese EVs. Call it $160. Elon Musk is just a f'ing idiot and asshole to say that human safety isn't worth that much. It's really just that moron's ego refusing to accept that lidar is better and necessary and he was Totally Wrong.
youtube
2026-04-12T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwUNjCbkGc-3padR2R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzmp4MLL2N-QCb4h514AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy4PuD6H486c6uBGWB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy8PCQEl_OvZr8lliF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzQsCQgrGiekTFp3X94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyQjSnN_vOyB6iQz5t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz9tKbbWwzl_-eo4Nt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugx9D4vLv1x5ir9SKER4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyZ9TAw6c-j--eNXAF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyWe9hB10R-k-cLYhZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
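The raw response above is a JSON array of per-comment records, each with an `id` and four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and validated is below; the `ALLOWED` value sets are inferred only from the records shown here and may be incomplete, and `parse_coding_response` is a hypothetical helper, not part of this tool.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "approval", "outrage", "mixed", "resignation", "fear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject records with unexpected values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim!r} value {rec.get(dim)!r}")
    return records

# Example record mirroring the shape of the response above (id is illustrative).
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
records = parse_coding_response(raw)
print(records[0]["policy"])  # ban
```

Validating against a closed value set at parse time catches the common failure mode of the model inventing an off-codebook label, rather than letting it flow silently into the coded dataset.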