Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Achieving AI isn’t much more difficult. The tech is already there but 99% of the… (rdc_nc1w6sh)
- I understand your concern! The rapid development of AI can definitely feel overw… (ytr_UgzU0gLI_…)
- Calling urself a “ai artist” is like ordering a meal and calling yourself the ch… (ytc_UgyS7Ndan…)
- Ai is the system and image of the Beast system. The Anti-Christ is here, and Jes… (ytc_UgwJrFokk…)
- Imagine telling people in Gaza they are in a simulation - I only wish they were.… (ytc_UgzavK5lJ…)
- Ya you sho better be paying attention. Keep in mind of what happened in "I ROBOT… (ytc_UgyR_nGHA…)
- Universal basic income is communism. If AI is going to get smarter than us all… (ytc_Ugy4CxAbp…)
- Student: Why does philosophy matter Teacher: because if we don't teach AI phil… (ytc_UgxavtvJw…)
Comment
I use Full-Self-Driving in my Model 3 a lot. I find it very reliable and a better driver than I am. I think we'll all be a lot safer when everyone is using self-driving cars, especially if they can communicate with each other. Although it is not perfect, there have been a time or two when I have had to take control. It's maybe 98% there, but that isn't good enough. The term "Autopilot" is confusing, it is not the same as full self driving. It's more akin to adaptive cruise control. All that said, this guy is crazy. He's consistently getting strikeouts, not paying any attention to the road, holding the "gas" pedal down while on autopilot. Without being on the jury and hearing all of the evidence, and the closest thing to law school I've attended is watching LegalEagle; I can say with certainty (and absolutely no sarcasm), while I don't think Tesla is completely blameless, I'd lay 95% of the blame on the driver.
youtube · AI Harm Incident · 2025-08-17T13:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyUvURAXPpt_LnbY6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx9mnxS1OupQtXq9bF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxtiWOqFpInUS9L3PB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxV9xQvtQpFpyioxOl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxt374a4jhPLUocwwp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzOQxNYoBpW0ClLqgF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwocRcvg5U9DkT4FK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgynGXyWljYnCbeX9EN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgybR10bTgx_lzaRWhV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugy7L8oGz1H-x8rtS9R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
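The look-up-by-comment-ID view above can be reproduced from a raw response like this one by parsing the JSON array and indexing it by `id`. The sketch below is illustrative, not the tool's actual code: `index_codings` is a hypothetical helper, and `raw_response` holds a two-row excerpt of the array shown above.

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw_response = '''
[
 {"id": "ytc_UgyUvURAXPpt_LnbY6B4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
 {"id": "ytc_Ugx9mnxS1OupQtXq9bF4AaABAg", "responsibility": "user",
  "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
'''

def index_codings(raw: str) -> dict:
    """Parse the model output and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_Ugx9mnxS1OupQtXq9bF4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # prints "user approval"
```

A real pipeline would also want to validate each row against the codebook's allowed values (e.g. rejecting an `emotion` outside the coded set) before indexing, since model output is not guaranteed to be well-formed.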