Raw LLM Responses
Inspect the exact model output for any coded comment.
You can look up a response by comment ID, or pick one of the random samples below to inspect it.
- So 1) devs are safe, cause ai agents are too dumb 2) regardless, most senior dev… (ytc_Ugzz43cUp…)
- A.i ia absolutely 1000x worse then any human. Can't even go off characteristics … (ytc_UgwO8Ep92…)
- Dr. Tyson, I admire how you make science accessible, but I think AI is different… (ytc_UgzpkSsVK…)
- Publicly available information shouldn’t get someone sued. Even the verbatim sta… (ytc_Ugx_s3REV…)
- Google search : - "when did the 1st man land on the moon please" blah blah "tha… (ytc_Ugwb6-Uzg…)
- I just wanna be an AI. I will just ask them to make me one of them. Having super… (ytc_UgxI1kz4d…)
- “Blue Blood “Tyranny” Woah there buddy we know you just learned those words but … (ytc_UgzJZZ-6b…)
- Imagine going to your tattoo artist with an idea and they just whip out the ol' … (ytc_UgzZCGBrE…)
Comment
> Hot take: Tesla should be *primarily* liable either way here. If you market forward collision avoidance, you call it "Autopilot", and still your car slams at high speed into another directly in front of it, then YOU failed. There are many other situations where it is not reasonable to assign blame to the automated system, but here, FEW CARS SOLD TODAY would have failed to employ automatic emergency braking. It's a standard feature on many cars, and on most cars it's implemented using RADAR NOT CAMERAS which removes the possibility that the car gets blinded by the sun or other ambient lighting conditions.
>
> There are some incredibly challenging problems to solve in the self-driving car space, but this is not one of them.
Platform: youtube · Incident: AI Harm Incident · Posted: 2025-08-06T04:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
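
A coding result like the one above maps naturally onto a small typed record. The sketch below is illustrative only: the class and field names are mine, and the allowed-label sets are inferred from the values visible on this page, not copied from the actual codebook. The comment ID is taken from the matching entry in the raw response below.

```python
from dataclasses import dataclass
from datetime import datetime

# Label sets inferred from the examples on this page (not exhaustive).
RESPONSIBILITY = {"company", "user", "none"}
REASONING = {"deontological", "consequentialist", "virtue", "unclear"}
POLICY = {"liability", "none", "unclear"}
EMOTION = {"outrage", "fear", "indifference", "approval", "resignation", "mixed"}


@dataclass
class CodingResult:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: datetime

    def validate(self) -> None:
        """Raise ValueError if any dimension falls outside the known labels."""
        for name, value, allowed in [
            ("responsibility", self.responsibility, RESPONSIBILITY),
            ("reasoning", self.reasoning, REASONING),
            ("policy", self.policy, POLICY),
            ("emotion", self.emotion, EMOTION),
        ]:
            if value not in allowed:
                raise ValueError(f"unknown {name} label: {value!r}")


# The coding result shown above, as a record:
result = CodingResult(
    comment_id="ytc_UgylPYAFdJ9ukkm62z94AaABAg",
    responsibility="company",
    reasoning="deontological",
    policy="liability",
    emotion="outrage",
    coded_at=datetime.fromisoformat("2026-04-27T06:24:59.937377"),
)
result.validate()
```

Validating against a closed label set makes it easy to catch schema drift, such as the model inventing a category the codebook does not define.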
Raw LLM Response
[{"id":"ytc_Ugyo3mXRav_3CSECWZ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},{"id":"ytc_Ugy3BBAc_KNJ439IPc94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgwodsvCx4BJ7VtbmqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},{"id":"ytc_UgwDitjuHVrecEzjMlV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgyGGvz2XKCy4n0U3Ph4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgylPYAFdJ9ukkm62z94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytc_UgyOSSAgPG2IROmEfTt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},{"id":"ytc_Ugw19fLrldTHsqhuld94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},{"id":"ytc_UgwoMr4xqdpiszTuEax4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"ytc_UgzSMNqBGZwXeD4YGgt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]