Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Yeah, saw the original fight. This was everywhere, and no, the robot here is fak…" (ytr_UgxSXdgLx…)
- "Most of the other humans are trying to enslave the others in one way or another,…" (ytc_UgxF7Iw1W…)
- "Well now everyone should see why these big tech companies have several huge data…" (ytc_UgxhvYMGZ…)
- "Just one tiny problem and is the semi-conductor chips are not advancing that muc…" (ytc_UgzCaHy5H…)
- "I heard a story 3 years ago of an A.I. robot killing a Chinese scientist by a fi…" (ytc_UgzS6-Lrp…)
- "People are going to do a shocked Pikachu face when they realize there wouldn't h…" (ytc_Ugwt3UZef…)
- "You assume it needs to get an online connection. We're working on chips specific…" (rdc_k7kaiui)
- "To be honest here. I don't think AI art is bad. But it's when people pretend…" (ytc_UgzRZXM_R…)
Comment

I tend to agree more with Elon. If the tech can't be seamlessly integrated, none of it matters. I consider Tesla autopilot to be pretty safe. I can only imagine it becoming safer. The actual radar sensor probably wasn't all that expensive. I'm imagining the autopilot engineers were not utilizing it in favor of vision already and it only made sense to axe it. Maybe lidar or radar could somewhat improve safety at the moment, but the self driving car of the future doesn't have either. The self driving car of the future is what Elon is trying to make. And, that is what will both make Elon richer and the safest cars / roads.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2022-09-03T18:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxX-Ayn7yUJznfc0t54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx-PlPlFxZMlrKl7L14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzvDnbzXfa5FBoo7Zl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwgOExhv5yZa_XdtpB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyoednUGu41_yo24md4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwANF7DrFOhKxikA-R4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz8e0l4Nu2l9bYutH94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxG7tG4EnJHJkTtht4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzErz4Be_uJzuUAsAh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzncrNhH5J6ZBBj3N94AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
```
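Finding the coding record for a specific comment in a raw response like the one above amounts to parsing the JSON array and indexing it by the `id` field. A minimal sketch of that lookup follows; the function name and the two-record sample are illustrative (the record structure matches the raw response shown above), not part of the tool itself:

```python
import json

# Raw model output: a JSON array of per-comment coding records,
# shaped like the "Raw LLM Response" block above (two records for brevity).
raw_response = '''[
  {"id": "ytc_Ugx-PlPlFxZMlrKl7L14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwgOExhv5yZa_XdtpB4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
record = codes["ytc_Ugx-PlPlFxZMlrKl7L14AaABAg"]
print(record["responsibility"], record["emotion"])  # company indifference
```

With the records indexed this way, rendering a "Coding Result" table for any sampled comment is a single dictionary lookup rather than a scan of the array.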