Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or inspect one of the random samples below.
- It is wild to me that idiots here are saying, "I tried an LLM, and it gave me a … (`rdc_mrroass`)
- I didn't watch this video entirely but it gives me some uncomfortable feeling I … (`ytc_UgzzVmZUb…`)
- AI can’t get a simple math formula right sometimes and you wanna do surgery with… (`ytc_UgxpgZV8q…`)
- Agree. Plus AI sucks up so much energy resources, not sure ppl will support nucl… (`ytr_UgyOrEQDx…`)
- they are making the AI smarter by giving it for free the more people use the mor… (`ytc_Ugzz_vmpM…`)
- We don't exactly know the inner workings of Google's ai systems, they have been … (`ytr_Ugwp3KI1W…`)
- To be honest, I don't even know what kind of jobs you could do? Even low paid po… (`rdc_gkq7x5j`)
- Why would anyone be so irresponsible to be driven by a robot. You are putting th… (`ytc_UgxtDRJLJ…`)
Comment
I've been driving a Model 3 extensively this year to work for Uber, who is paying for me to go back to college. I like saving a little on gas (I say a little because Supercharging is also pretty expensive in California), and passengers like how smooth and fast it is and that the interior and exterior are nicer than the typical Uber X. However, I was at first and continue to be stunned by the horrendous number of software glitches, including ones that lightly or severely affect Autopilot. It's not surprising to me at all that Mercedes (and next year BMW) have already achieved Level 3 autonomous driving, while Tesla is *still* stuck on Level 2 all these years later after pioneering the technology. Getting rid of the radar was a massive mistake that sacrificed progress with the technology and driver safety for bigger profits and shareholder wealth.
youtube · AI Harm Incident · 2022-09-28T02:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxiRkUzZpKE0MHvuQx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyngIdueVXYZ6t2a3R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugyk03isJXPZP25Be_t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugx9TKlxy9mL8PYzY5Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzQv93FgfgN77M3hr54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw2N2N6EXBiCSds4ZZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwrM6T4iwXKLVTZmEp4AaABAg","responsibility":"user","reasoning":"mixed","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyWqBe8eZCjOuP2nIJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxSYXT7WLRKoIPUXfF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxlcLe3e1P9MJsz5oN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
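A raw response like the one above can be turned into a per-comment lookup with a few lines of Python. This is a minimal sketch, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the response format shown, while the function name and the two-record sample payload are illustrative assumptions.

```python
import json

# Abbreviated stand-in for a raw LLM coding response (same shape as above).
raw_response = """
[
  {"id": "ytc_UgxiRkUzZpKE0MHvuQx4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugyk03isJXPZP25Be_t4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]
"""

def index_codes(response_text: str) -> dict[str, dict]:
    """Parse a raw coding response and index each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
print(codes["ytc_Ugyk03isJXPZP25Be_t4AaABAg"]["policy"])  # → regulate
```

The ID-keyed dict is what makes "look up by comment ID" cheap: one parse of the batch response, then constant-time access to any comment's codes.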