Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- People are worried about Terminator or I, Robot becoming real. Personally, I'm m… (ytc_UgjXsosqv…)
- Sofia the robot: “Ok I will destroy humans” Humans: “Nah we are going to destro… (ytc_UgyUriDum…)
- Most of the "AI designed to drop misinformation bombs on our heads" is already o… (ytc_UgzY8StKi…)
- Anyone remember the Star Trek OS episode where the red shirt is killed by the M5… (ytc_UgxjIHJ4G…)
- „AI can't create perfect replicas” Actual scientific papers debunked this idea.… (ytr_UgyX-mwxg…)
- NOBODY wants to live near a data center. Even worse society gets nothing from th… (ytc_Ugymqpisx…)
- If AI is fighting to continue to exist, then it's probably safe to say that AIs … (ytc_Ugzl3CXaF…)
- Skill issue. Don't put the blame on AI if YOU are the one making the prompt. Jus… (ytr_UgyrXPNxy…)
Comment
There are a lot of possibilities with current technology, but Tesla and any other manufacturer need to slow down if the “AI” is not sure enough it has a clear road ahead. Two cameras are sufficient in determining distance to objects ahead. But in low light conditions there are struggles to all algorithms to produce reliable data. And this is known very well. But you can’t sell a car that is afraid to drive at night. There are a lot of alternatives to all systems. Auto pilot could just follow another vehicle while keeping a safe distance. Use the high beams, when cameras disagree, to provide a better vision. Even a cheap $5 laser can be fired to probe the road ahead. Or even better night vision cameras. And I’m sure an army of engineers can probably think of many more ways to make this better and safer.
youtube · AI Harm Incident · 2025-01-05T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgzaEYvJE2SSYW9L2Kl4AaABAg","responsibility":"manufacturer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyzFpnNOkVcKmH_aaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwzp58DTN1IBX_8ERB4AaABAg","responsibility":"manufacturer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxGdm19dqdWYCXBTRh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgywLlKfnvOanu3C2f54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy4HRAiHIiHg30CIKB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwHooEFPaqbXC2ePTh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqXpcLVaK5rSykubZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztIu2__P6nFjw0cbp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNGo6TFyukcaQc9mR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]