Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a specific comment ID or by browsing the random samples below.
- i feel like ai art will become the sort of fast food of the art community, and h… (ytc_UgxsNhgPv…)
- The issue I know of with diagnostic AI is that they are a "black box" program—it… (ytc_Ugx0sKkHQ…)
- guess what .. vibe coding is just eating without learning how to cook. Assisted… (ytc_Ugx_5x0UX…)
- They are lying to you about chatGPT. I always asked myself how zioni$t§ (& hind… (ytr_UgxnMr0aW…)
- Humans having AI isn't the issue. It's Billionaires having AI because those are … (ytr_Ugy1ygeWn…)
- I'm going to have a Boston dynamics robot compete in the Boston Marathon for me.… (ytc_UgybkCH2L…)
- We've been a primary production and resourced based economy for a long time now,… (rdc_da419ep)
- that wasnt AI 🤦♂️ stop being so gullible. Watch it again, at no point did they … (ytr_UgzgTs51W…)
Comment
I study robotics and AI at university. So how this autopilot works is that the AI model essentially memorises photos using pretty straightforward linear algebra, and most of these photos come from stock images. Images of cats, trees, people etc. Saying an AI has 'learnt' generally means the model is usually 80% accurate with general everyday cases. So no niche circumstance like an overturned truck will never be picked up by an AI because how rare it is in the images of trucks the AI was shown. If we force the model to pick up on niche cases, like an overturned truck, it will then completely mess up the maths it uses to figure out simple things, including a normal truck, making accidents a 100% certainty
youtube · AI Harm Incident · 2025-04-30T13:0… · ♥ 9
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
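
The four coded dimensions above come from a fixed categorical schema. As a minimal sketch (assuming Python, and inferring the allowed values only from the samples visible on this page rather than from a published codebook), one coded comment might be modeled like this:

```python
from dataclasses import dataclass

# Allowed values inferred from the samples shown on this page;
# the real codebook may contain additional categories.
RESPONSIBILITY = {"ai_itself", "user", "developer", "company", "government", "none"}
REASONING = {"consequentialist", "deontological", "virtue"}
POLICY = {"none", "regulate", "liability", "industry_self"}
EMOTION = {"indifference", "outrage", "fear", "approval"}


@dataclass
class CodedComment:
    id: str               # e.g. "ytc_..." (YouTube comment) or "rdc_..." (Reddit comment)
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        """Raise if any dimension falls outside the values seen so far."""
        checks = [
            ("responsibility", self.responsibility, RESPONSIBILITY),
            ("reasoning", self.reasoning, REASONING),
            ("policy", self.policy, POLICY),
            ("emotion", self.emotion, EMOTION),
        ]
        for name, value, allowed in checks:
            if value not in allowed:
                raise ValueError(f"unexpected {name}: {value!r}")
```

Keeping the allowed values in explicit sets makes it easy to flag any response in which the model drifts outside the expected categories.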
Raw LLM Response
[{"id":"ytc_UgxR9I6PNNLzQleDQGN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwCVe-uLqHzpaQwRyd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgweWosgEdBDbM1J9t54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyVzcrXsyuBHAsJZIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwztLjEZzLc6Zu7oxp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzNVC1ydO0c-soNTr14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3JtDkh1o42myOp4d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwskBsLDHm3rIbREOF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwv_5nV_2Z909Ro70h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxyE2r3cdmW6pb2iWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]