Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Art has always been about the name. Its always been about who created it. I can …" (ytc_UgwkdB7MQ…)
- "For my own reference, what gives it away? Not that I'm trying to hide it, I'm ju…" (ytr_Ugy3is-FJ…)
- "@Avian_slime??? Not sure what you mean about the unique name thing. However, L…" (ytr_UgybeLuef…)
- "This video sheds light on a critical issue in AI hiring. It reminds me of how im…" (ytc_UgxGW3k-Q…)
- ""Don't they have to prove" - Ai bros trying to defend stealing copyrighted mater…" (ytr_UgyCr_Jm3…)
- "You give him too much credit. If it hadn't been him it would have been someone e…" (ytr_UgzoQ5Dri…)
- "To my opinion future generations will suffer. Is not about building or developin…" (ytr_UgzI4Ctu_…)
- "The Amish will not be affected by AI. During the Covid crisis, they never stood …" (ytc_UgwcNV2XP…)
Comment
I know I've come to this video late, but I wonder if another potential way of breaking the tail light assumption would be to have one of those helmet-mounted rear lights like the Brake Free. Would a third light be enough to convince the AI that it isn't a far off car? Or if you're a short enough rider or in a low enough riding position, would it just continue to think the same, assuming that the third light is just the LED brake light that exists at the top of most cars rear windows? It might at least make it think that the far off car is braking, but if it thinks that it's far enough off, that might not actually change the AI's decision making any.
youtube · AI Harm Incident · 2025-08-20T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxwwikthp6FhSlvY3h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzv23JEgPCM8GG40zF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz7uwgSTAYN3gGihet4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwhvBRCjU6kRnPWjpt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwRdvXW334LWfme2kl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwNwm-sIm5BdnnZgqt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugy1JEbwJbC7k4Ct8pV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwmUbIEEAhmOmdLFbF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyrDmyEThIMTXFgrxd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzfChESkctwF8zzrLZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
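The raw response above is a JSON array of per-comment objects, so a "look up by comment ID" view can be backed by a parse along these lines. This is a minimal sketch, not the tool's actual code: the function name and the strict filtering to the four coded dimensions are illustrative assumptions.

```python
import json

# The four coding dimensions shown in the Coding Result table above.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    objects) into a lookup table keyed by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Keep only the expected dimensions; drop any extra keys the
        # model may have emitted alongside them.
        coded[rec["id"]] = {k: v for k, v in rec.items() if k in DIMENSIONS}
    return coded

# Two entries copied from the raw response above.
raw = '''[
  {"id":"ytc_Ugxwwikthp6FhSlvY3h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwRdvXW334LWfme2kl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

coded = parse_coding_response(raw)
print(coded["ytc_UgwRdvXW334LWfme2kl4AaABAg"]["emotion"])  # outrage
```

Keying the table by the full `id` string makes the lookup exact; matching on the truncated IDs shown in the sample list would require a prefix search instead.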