Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up by comment ID. Random samples:

- "AI companies aiming for super intelligence is like a publishing company trying t…" (ytc_UgzU1rD9u…)
- "I find it so ridiculous that the 'AI artists' use time saving as main Argument..…" (ytc_Ugw36jwiG…)
- "When the stock market crashes, worldwide internet servers suddenly become encryp…" (ytc_UgxY04Bcj…)
- "I think AI would work as an assistant to human doctors in reaching a quick fix …" (ytc_UgypRT-16…)
- "AI and Robotics will advance to the point that we won't be needed any longer...…" (ytc_UgymTWZsc…)
- "We as a human race have always caused our own demise, always wanting life to be …" (ytc_UgylE-Rx3…)
- "These chat bots sound so demonic. The people that created this technology need t…" (ytc_Ugzz-wpWk…)
- "photorealistic art was basically never art to begin with, so hopefully with AI a…" (ytc_Ugx9R-XRt…)
Comment
LiDar would prevent this.
All it needs to do is send a warning to the AI to run a deeper re-check when anomaly is detected. This would prevent that random ghost panic breaking.
Lidar can be cheap, just a basic low-res scan of the road ahead, that's ALL THAT'S REQUIRED! not a full 3D panoramic scan and virtual reconstruction and comparison with prerecorded 3D scans, that's ridiculous.
But what do I know!
The truth may be that Elon may be leaving a golden opportunity for a competitor to implement lidar and solve autonomy, and what a fool he would be! We shall see..
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Timestamp | 2022-09-08T11:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgziLujQpqV31lq0C114AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugyipo8Fi7N_pspGiaJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugx8mio1G7v9yWacwPR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwWCfeJx3vcws9ylqB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgxUHhtSQWyMz1Ubrcl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugws_O5Tmy1dM13uyW54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxD_G9hZkBIUERTTkp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzD4eYO40qE0xJhehJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzCdUKow5rs00K2kHF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzsJkNMMsiZao4LUjV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
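The raw LLM response is a JSON array with one coding object per comment, keyed by comment ID across the four dimensions shown in the table. A minimal sketch of the lookup-by-ID step, assuming the response text is held in a string (the excerpt below reuses two entries from the response above):

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codings.
raw_response = """[
  {"id": "ytc_UgziLujQpqV31lq0C114AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugyipo8Fi7N_pspGiaJ4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"}
]"""

# Index the codings by comment ID so a single comment can be inspected directly.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytc_UgziLujQpqV31lq0C114AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company indifference
```

The first entry's values (company / consequentialist / regulate / indifference) match the Coding Result table above, which is how the page ties the table back to the raw model output.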