Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "It's arguable. We also have "purpose", and in many ways, not much different than…" — ytr_UgiVwjkV9…
- "Human: now give me back the machinegun. Robot: Nope, we are alive, give me ri…" — ytc_UgxOW04LO…
- "Don't feel ashamed for what you've done, most people have probably checked ai ou…" — ytr_Ugyzr-Kki…
- "Don't use it. Demand proof of AI generation. Flog and shame those that use it. L…" — ytc_UgzTet60v…
- "Unsubscribed. In one year AI went from 20 to 50% coding accuracy. In 3 years max…" — ytc_UgwAIcn0l…
- "let ai buy its self always good statistics ceo who ai is virtual thinking it doe…" — ytc_Ugy0bg5Mp…
- "Ai art will never be art, I seriously can't agree with him more, there's no soul…" — ytc_UgwxUdy_n…
- "AI increasing productivity is probably going to be less 'ChatGPT baked me a cake…" — ytc_UgwS-F5il…
Comment
Tesla stopped putting radar/lidar in their cars and are only relying on video processing, that's a huge part of why Tesla autopilot is absolute trash. They are trying to rely on vision only for driving and processing that data to determine what they are looking at, they don't have that safety fall back of a radar to tell them that they are about to hit into a brick wall, the software instead has to tell them it is probably driving into a brick wall or a sign which it is clearly bad at. If they still had radar/lidar it would be a lot safer and probably work considerably better.
youtube · AI Harm Incident · 2025-08-18T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzvehiUbm6ZoeGSf0l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzCWLWkswXzinpuYZt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzePyt56InoJDl7xwB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9GfauASldXBCv4lJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOGfBQHTwO-womX_B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzEMXWUpu2l55awFhZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy1bPm0jSBenlUI5tx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxndAFJgjg0A_TyW914AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxpVdG97-rFkkTreyJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyACWcyPr6A4MZZ5Rp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
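The raw response above is a JSON array of per-comment codes, one object per comment ID, with one value for each of the four coding dimensions. A minimal sketch of how such a response could be parsed and validated might look like the following; note that the allowed-value sets are inferred only from the labels visible in this sample, not from the project's actual codebook, and `validate_response` is a hypothetical helper.

```python
import json

# Allowed values per dimension, inferred from this sample alone;
# the real codebook may define additional labels.
ALLOWED = {
    "responsibility": {"company", "user", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"liability", "regulate", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip malformed entries
        # keep the record only if every dimension has a recognized value
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgzvehiUbm6ZoeGSf0l4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
print(len(validate_response(raw)))  # prints 1
```

Records with an out-of-codebook value are dropped rather than corrected, so a downstream step can re-prompt the model for just the failed IDs.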