Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "great job on this AI Apocalypse video. you know, you made me start crying. thank…" (ytc_UgwxDZauB…)
- "You think AI will turn against humanity? LOL, AI is a weapon, just like a gun it…" (ytc_UgyvTX1oH…)
- "White man locked out of home because Amazon AI considered him racist while using…" (ytc_Ugx4QPY9E…)
- "@SugarSlimePG3D The average is 30 bucks. One bad apple doesn’t ruin …" (ytr_UgzmtMssW…)
- "Surely if it gets too much we can just blow up the servers right🤷🏻♂️. The mainf…" (ytc_Ugz1HGCAf…)
- "You can use ai and can produce quality work in a short amount of time.…" (ytc_UgwIY2omx…)
- "Is it really a self driving car if you have to keep your hands on the steering w…" (rdc_e145vca)
- "The big problem is IT ISN'T AI. Computers have not achieved true AI only emulati…" (ytc_Ugwyq0h2l…)
Comment
I fully agree with shared liability between the driver and Tesla. Anyone operating a vehicle needs to have full awareness and attention (where I live, this also applies to people supervising novice drivers pre-driving test), but also Tesla should absolutely not be selling very primitive (relative to human ability) autonomy technology as "full self driving," beta or no beta. In a world where people are already beyond complacent in their responsibility when driving, we don't need manufacturers making false inferences to their customers that their vehicles can manage on their own when they patently can't.
youtube · AI Harm Incident · 2025-08-15T18:5… · ♥ 56
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_Ugxv1KqFmv4nmdmGM5F4AaABAg.ALrCgWHmHROALrHOVnSTJr","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgwK9BAnYJiStXntL0l4AaABAg.ALrCKN4hqx7ALrK2GZsLio","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugy-wc-6U2A1Jl0St4F4AaABAg.ALrCFYlp7hUALreadi6edv","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxZ2O-k-Jju9XGNmZt4AaABAg.ALrCCJsSnifALrF1Tmd7mm","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"approval"},
{"id":"ytr_UgxZ2O-k-Jju9XGNmZt4AaABAg.ALrCCJsSnifALrFPHs7Wh3","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytr_UgwRryc8nBP0waW8VAV4AaABAg.ALrBo0rFly3ALrLfGT7u4s","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxPgSpdpfppnKSF_OB4AaABAg.ALrBjX9f_WsALrCspgJHwT","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgxPgSpdpfppnKSF_OB4AaABAg.ALrBjX9f_WsALrGLXizNfN","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzmoUT3xNeLGltDtPh4AaABAg.ALrBVRvYljKALrDznthT9u","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgzmoUT3xNeLGltDtPh4AaABAg.ALrBVRvYljKALrEntAcpZY","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
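The "look up by comment ID" feature above can be sketched as parsing the raw batch response and building an ID-keyed index. This is a minimal illustration, not the tool's actual implementation; the field names follow the JSON shown, while `index_batch`, `EXPECTED_KEYS`, and the two sample records reused from the response above are assumptions for the sketch.

```python
import json

# Two records copied from the raw response above, used as sample input.
raw = '''[
{"id":"ytr_UgxZ2O-k-Jju9XGNmZt4AaABAg.ALrCCJsSnifALrF1Tmd7mm","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"approval"},
{"id":"ytr_UgwK9BAnYJiStXntL0l4AaABAg.ALrCKN4hqx7ALrK2GZsLio","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# The five coding dimensions every record is expected to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_batch(raw_json: str) -> dict:
    """Parse a raw batch response, validate each record's keys,
    and return a dict keyed by comment ID for direct lookup."""
    by_id = {}
    for rec in json.loads(raw_json):
        if set(rec) != EXPECTED_KEYS:
            raise ValueError(f"unexpected keys in record: {sorted(rec)}")
        by_id[rec["id"]] = rec
    return by_id

coded = index_batch(raw)
print(coded["ytr_UgxZ2O-k-Jju9XGNmZt4AaABAg.ALrCCJsSnifALrF1Tmd7mm"]["emotion"])  # approval
```

Validating the key set before indexing catches malformed or truncated model output early, which matters when the response is generated text rather than a guaranteed-schema API payload.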