Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgxYUGbpE… · I've always wondered how the fact that Sam Altman is gay might influence the dev…
- ytr_UgwJzMxKw… · I completely agree. Who thought developing a therapeutic relationship with an ai…
- ytc_UgxxfYNrM… · I don't blame AI for trying to kill someone or blackmail someone in those tests …
- ytc_UgwXDMRYU… · every business that wants to force their employees to use AI should be boycotted…
- ytr_UgzyhDlzb… · Calling AI generators "glorified search engines" is an insult to researchers tha…
- ytr_Ugx_94f4k… · We appreciate your concern. It's important to have discussions about the impact …
- ytc_UgxMfSZd4… · Geoffrey: We have the same internal perspective of the kv cache in our phase of …
- ytc_Ugxq58dq1… · You're missing the point. AI is nothing more than a scapegoat to lay off people …
Comment
There are two problems. The first is the driver. It seemed to me that the driver was "gaming" the software that monitors for driver attentiveness (either consciously or subconsciously). Basically, he wasn't being attentive but was behaving in a way that defeats the monitoring system.
The second problem: Tesla's "autopilot" software isn't that "smart" when it comes to analyzing human behavior. "Autopilot" or "self driving" should not be used in any of its advertising. There is also better hardware, like LIDAR, which has proven to be safer. Plus, Tesla should be forced to make its software available to a neutral third party to evaluate it for safety.
In the end, both the driver AND Tesla are at fault for the accident.
Musk fanboys should really stop drinking his Kool-Aid and believing every word he says about "autopilot" and a host of other things. It's getting them and others injured or killed.
youtube · AI Harm Incident · 2024-12-20T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzQ482mk7AAlmeAd854AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugwn3RhrYM-bEmfzhEV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyuiTCc9Jgr7aqCPz14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz6r5vUEb2TTt57CfZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwQmizgU6DGL0VC9uZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxgkqOmyMVdhHOCBNp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyWhxzEsg9xwon7Ry54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugx4ycPMWYTT3mF6twh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyyLYsj9QUrQidExTl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxfa_1GIRYapbWzojx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
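Each row in the raw response assigns one label per coding dimension. A minimal sketch of how such a response could be validated before ingestion is below; note that the `SCHEMA` label sets are inferred only from the values visible in this dump, and the project's actual codebook may define additional categories:

```python
import json

# Allowed labels per dimension, inferred from the values seen in this dump.
# (Assumption: the real codebook may include categories not shown here.)
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}


def validate_codings(raw: str) -> list:
    """Parse a raw LLM response and reject rows with unknown labels or IDs."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this dump start with ytc_ (top-level) or ytr_ (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: {dim}={row.get(dim)!r}")
    return rows


raw = (
    '[{"id":"ytc_UgzQ482mk7AAlmeAd854AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]'
)
print(len(validate_codings(raw)))  # → 1
```

Validating before writing to the coding table means a malformed or hallucinated label fails loudly at ingestion rather than silently skewing downstream tallies.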