Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
So in a nutshell, AI immigrants are incoming. And not through rickety boats in t…
ytc_UgwitQSOY…
You can do a lot of different things with the same search index. The algos for r…
rdc_n3y4j5l
By giving a robot the ability to feel pain, you are essentially MAKING IT BE IN …
ytc_UgjXCxJaU…
I'm a cybersecurity expert and programmer And let me warn you AI will not stop …
ytc_UgxFv-8CC…
I did this before but i think it was the ai playing along with what i was talkin…
ytc_UgwI0m1VA…
Sorry to say but you are wrong. When a kid draws something randomly a parent kee…
ytc_Ugz6nnjcK…
the unanticipated effect of increasing the time of production because AI is choc…
rdc_n5hjdpd
Its not about jobs, but controlling narrative according who the owner of the A.i…
ytc_Ugw47jxSG…
Comment
More than a thousand! That's manslaughter! Never knew so many incidents happened to Tesla. Never knew its autopilot is vision only. Musk is criminally faulty! Lack of adequate training is one thing, the inherent inability of the cameras to tell the distance/depth should be plain to Elon Musk and his overpaid autopilot team. Even the stereo vision and calculation may estimate the distance but it means lagging that may be lethal. Most important of all, why Tesla's algorithm allows autopilot to be functional in night time or low light environment? If the camera can only capture images of 20 ft away and the car runs at 70 mph, you don't need to be a Tesla engineer to tell that it's extremely dangerous to rely on the cameras. What a shame to Musk and Tesla!
youtube
AI Harm Incident
2025-01-04T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgzaEYvJE2SSYW9L2Kl4AaABAg","responsibility":"manufacturer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyzFpnNOkVcKmH_aaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwzp58DTN1IBX_8ERB4AaABAg","responsibility":"manufacturer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxGdm19dqdWYCXBTRh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgywLlKfnvOanu3C2f54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy4HRAiHIiHg30CIKB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwHooEFPaqbXC2ePTh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqXpcLVaK5rSykubZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztIu2__P6nFjw0cbp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNGo6TFyukcaQc9mR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
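A raw response like the one above is a JSON array of coded comments, one object per comment ID, with the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch might be parsed and validated before storage — the field names come from the response itself, but the set of allowed values is only what appears in this one batch, so the real codebook may define more:

```python
import json

# Allowed values observed in this batch; the full codebook is assumed
# to be a superset of these.
ALLOWED = {
    "responsibility": {"developer", "manufacturer", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "approval", "resignation", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a lookup table keyed by comment ID, rejecting any row whose
    dimension value falls outside the expected code set."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim} value {row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# First row of the batch shown above, used as a worked example.
raw = ('[{"id":"ytc_UgzaEYvJE2SSYW9L2Kl4AaABAg",'
       '"responsibility":"manufacturer","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded["ytc_UgzaEYvJE2SSYW9L2Kl4AaABAg"]["emotion"])  # outrage
```

Validating against a closed code set at parse time is what makes "look up by comment ID" reliable later: malformed or hallucinated labels fail loudly instead of silently entering the coded dataset.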