Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
I have watched enough movies and TV shows about self-driving homicidal vehicles …
ytc_UgxUAvSTb…
I find it funny that some AI bros talk about how "artists are lazy, AI is the fu…
ytc_UgxpIhjTR…
10 times out of 10 these people just dont know how Ai works or artists…
ytc_Ugwvkt9p-…
The sad ass part is that some people use AI for good instead of evil but they ar…
ytc_Ugz7Vu2sR…
Doesn't matter if it said yes or no, there will be people arguing if that was co…
ytr_UgwuZTliD…
It is clearly fraud to sell AI pictures advertised as real art. The buyer needs…
ytc_UgxlfB1VF…
Does that mean AI generated aspects of video games and movies should also not be…
ytc_UgytPbidt…
At least they haven't yet thought that "typing prompts" makes them equal as writ…
ytc_UgxfuT1gA…
Comment
"Autopilot" is not the same as "Full Self-Driving". They are totally different software. People often don't distinguish the two. A 2019 Tesla Model X - which is the subject of this incident - is obsolete and it's capabilities are far interior to a Tesla of 2025. The car is a computer and becomes obsolete in a couple years. The camera also seems blurred by dirty glass and gross negligence by the driver. This is driver error, not equipment error. The driver should be liable as it is the driver who's in control of and has custody of the car. The car did not on its own, start driving absent the behest of the driver.
youtube
AI Harm Incident
2025-08-05T03:3…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxndwlX-msSGFFXV9N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwpgKed5H-sGVoGpxJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzawCv9AZ4i5_fSyb14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYUMyAYfbnSIeTdqJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzmPvF4stYa7OsCz054AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwhcvIY7aLONXPvLIJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwlgP59r_Q14rAVqcR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0cWM6Do8Do3_PW0J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwWmr4QRheAtxVRic94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw4t6_xl72BjqctnjB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
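The raw response above is a JSON array with one object per comment ID, coding four dimensions per comment. A minimal sketch of how such a response might be parsed and validated in Python; the `ALLOWED` value sets are inferred only from the rows shown in this batch, and the `parse_raw_response` helper is hypothetical, not part of the actual pipeline:

```python
import json

# Hypothetical codebook, inferred from the rows above; the real
# coding scheme may include additional values.
ALLOWED = {
    "responsibility": {"user", "company", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "fear"},
}

def parse_raw_response(raw: str) -> list:
    """Parse a raw LLM response, keeping only rows whose codes are valid."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Drop any row missing a dimension or using an unknown value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_example","responsibility":"user","reasoning":"mixed",' \
      '"policy":"none","emotion":"indifference"}]'
print(len(parse_raw_response(raw)))  # 1
```

A filter like this guards against the model inventing off-codebook labels; rejected rows could instead be queued for re-coding rather than silently dropped.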