Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgyGlqN7U…`: "What ever art you make is the result of your brain mixing memory, lets say we h…"
- `rdc_jcfh17h`: "Chinese companies actually are going to have the biggest ethics teams. The only …"
- `ytr_Ugznc-ece…`: "„There is no turning back now on the topic of AI, especially in art. The genie i…"
- `ytc_UgxntzVTq…`: "CHATGPT ARE TO MUCH BETTER THAN GEMEINI. GEMINI USING BY POORS AND CHATGPT USING…"
- `ytc_UgzdllF71…`: "We get what we wish for and now realised it may not be the smartest thing to hav…"
- `ytc_UgxT4eiYE…`: "I hate these scammers trying to sell you on AI taking over is going to lead to U…"
- `ytc_Ugy5-axy1…`: "Holy Crap !!!! The AI will Constantly Scale in Intelligence and Productively put…"
- `rdc_ohn0f79`: "None of us will be in tech 4 years from now. It is a rapidly evolving landscape …"
Comment (reproduced verbatim, since the coding below applies to this exact text):

> okay this has already been solved, not 100% but damn near close. The self driving cars have accident predictions. Also the car would never put its self in a dangerous situiation such as following to closely to any vehical. These cars are very aware. It would not choose to hit any other car, no mattet the situation they are designed to avoid millions of different obstacles. This video seems to have been made out of thought instead of facts.
Platform: youtube · Topic: AI Harm Incident · Posted: 2017-06-23T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugjp-pcf8PXx8HgCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgiIjTXIJ5B6wHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgicjfJMB8sNk3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugghpi3fGQwm63gCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UghvDWbrWZGpnngCoAEC", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugjd8L8jNpzFtHgCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghC4VdT2HTcDHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugg4Bv06oKkXP3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UghZnL5q_KLZvHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UggPbviiDUEtwHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
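The raw response is a JSON array with one record per comment, each carrying the comment ID and the four coding dimensions shown in the table above. A minimal sketch of the "look up by comment ID" step, assuming only the field names visible in the response (the `index_codings` helper is illustrative, not the tool's actual implementation; the two embedded records are copied verbatim from the batch above):

```python
import json

# Two records copied from the raw LLM response above; a real batch has one per comment.
RAW_RESPONSE = """[
  {"id": "ytc_UgiIjTXIJ5B6wHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugghpi3fGQwm63gCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# The four coding dimensions every record must carry (per the table above).
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the coding records by comment ID.

    Raises ValueError on a record missing its ID or any dimension, so
    malformed model output fails loudly instead of being silently dropped.
    """
    codings = {}
    for record in json.loads(raw):
        if "id" not in record:
            raise ValueError(f"record without an id: {record!r}")
        missing = [d for d in DIMENSIONS if d not in record]
        if missing:
            raise ValueError(f"{record['id']} is missing {missing}")
        codings[record["id"]] = record
    return codings

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Ugghpi3fGQwm63gCoAEC"]["emotion"])  # -> outrage
```

Indexing by ID rather than list position matters because the model is not guaranteed to return records in the same order the comments were sent.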