Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Its already begun, the ai models are censored and are thought policing users. Th…" (ytr_UgzFyqCtD…)
- "This was fun to watch, but the real issue is the communication protocol. ChatGPT…" (ytc_Ugz93v05C…)
- "During a talk at NYU last October you stated a potentiality view of moral status…" (rdc_dds0ck6)
- "How about we don't teach the AI how to beat a human in hand to hand combat 😣…" (ytc_UgyamYc4h…)
- "i'm super late to this video but not to the issue, and honestly what baffles me …" (ytc_UgwFatZC4…)
- "An American company experimenting with AI in China? A sure way of giving the Ch…" (ytc_Ugyf0Ob5_…)
- "They they want because everyone does what they want. The 'alignment'-problem i…" (ytc_UgyvBGiT5…)
- "Poly . Ai ads be popping up on my shorts. I mean I'm a minor😂😂…" (ytc_Ugy4yRIgD…)
Comment
> All I see here are idiots. Tesla tells drivers that they muust be ready to take over should something go wrong. They even have cameras INSIDE of the vehicle to detect if the driver is paying attention. If the driver continues to not pay attention it'll lock the self driving feature. There's even a law stating a driver must maintain control of their vehicle, so any accidents involving self driving are 100% the driver's fault, not Tesla. People need to learn and understand how to operate any new vehicle they purchase, regardless if it's a self driving vehicle or not.

youtube · AI Harm Incident · 2025-10-21T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyzdkvLPoiaBmtkoHV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwSTkXLPR0Wuhseaj94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzKfgci56c0ArXTdiV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyVNOog-IW9ub0tEdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy2U2hYAG9OzjMCfVh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz444ENnpZ60wAqBtd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwFa4aVrIl8zOcGb9V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyll7pk3Tw6oA3EpUJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyvSKiydW_CGIyHNe14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyy6Vx1cZWe0xIwkfR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```
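The lookup-by-comment-ID step above can be sketched in a few lines: parse the raw LLM response (a JSON array with one coding object per comment, as in the dump above) and index it by `id`. This is a minimal sketch, not the dashboard's actual implementation; the variable names are hypothetical, and the one entry shown is copied from the batch above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# One entry from the batch above, reproduced for illustration.
raw_response = """
[
  {"id": "ytc_Ugy2U2hYAG9OzjMCfVh4AaABAg",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"}
]
"""

# Index the array by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a single comment ID.
coding = codings["ytc_Ugy2U2hYAG9OzjMCfVh4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # user indifference
```

Indexing by `id` rather than scanning the array each time matters once a batch contains hundreds of coded comments; the dictionary also makes it easy to cross-check the rendered "Coding Result" table against the raw model output.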