Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
We are going to come to a point where there will be a human with such incredible…
ytc_UgwHUbyo1…
just use AI, forget social approval. its not just 'typing' its you+computer =mak…
ytr_UgztEvEbG…
it's lunatics like this, that spread the socially acceptable norm on new evil cr…
ytc_UgxGTCpO0…
I might be late but ims make it and it will say fuck nfts and ai…
ytc_UgzrUVBe8…
No way...I would never of guessed. Although nowadays you could say this is not a…
ytc_UgxlwF6CV…
That's the most common and intuitive objection, and it's a reasonable one. But t…
ytr_Ugw0R9Mf0…
True story : ChaGPT couldn't even figure out how to split a <div> in html after …
ytc_Ugxt_0kgK…
@saifcode007 appreciate your feedback. LLM’s can’t provide a confidence value or…
ytr_UgwgK26Fk…
Comment
If this George guy was putting his foot onto the gas pedal to speed the car up, doesn’t that override automatic braking? While in cruise control, my car (Kia) will automatically start braking if there is something in front of the car detected ahead. However, if my foot is on the gas the car will not engage the brakes. If I’m searching for my phone and not looking at the road while accelerating the car, I will get into a crash. I don’t know how Tesla handles this case, but frankly I’m wondering if their cars are similar in this way of letting human control override automated actions
youtube
AI Harm Incident
2025-08-18T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzvehiUbm6ZoeGSf0l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzCWLWkswXzinpuYZt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzePyt56InoJDl7xwB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9GfauASldXBCv4lJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOGfBQHTwO-womX_B4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzEMXWUpu2l55awFhZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy1bPm0jSBenlUI5tx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxndAFJgjg0A_TyW914AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxpVdG97-rFkkTreyJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyACWcyPr6A4MZZ5Rp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
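The raw response above is a JSON array of per-comment codings, one object per comment ID, with four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed and indexed by comment ID, assuming the value sets are exactly those observed in this sample (the real codebook may allow additional categories):

```python
import json

# Allowed values as observed in this sample response; assumed, not
# confirmed against the actual codebook.
ALLOWED = {
    "responsibility": {"user", "company", "government"},
    "reasoning": {"deontological", "consequentialist", "virtue", "contractualist"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "fear", "approval", "mixed"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    skipping any entry with a value outside the allowed sets."""
    coded = {}
    for entry in json.loads(raw_response):
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[entry["id"]] = entry
    return coded

raw = ('[{"id":"ytc_UgzCWLWkswXzinpuYZt4AaABAg",'
       '"responsibility":"user","reasoning":"deontological",'
       '"policy":"none","emotion":"indifference"}]')
codings = index_codings(raw)
print(codings["ytc_UgzCWLWkswXzinpuYZt4AaABAg"]["policy"])  # none
```

Dropping out-of-vocabulary entries rather than raising keeps one malformed coding from discarding the whole batch; the "look up by comment ID" view above then amounts to a dictionary lookup on the resulting index.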