Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- rdc_lbhxbfu: "Idk why this keeps getting shared over and over for like the last year, lol. It’…"
- ytc_UgwvtFI1d…: "We are literally having AI do everything fun that there is to do in this world f…"
- ytr_Ugws2h9tu…: "AI "Facts" do not care what you "feel". THe judge and jury will also soon be AI.…"
- ytc_Ugx4OpDit…: "How are these being addressed? Does Aura also stops these AI threats? It's reall…"
- ytc_UgxFtoJsz…: "real breaktrough in science and productivity means that in 5y ddr5 memory gonna …"
- ytc_UgzC_YhHk…: "So that's how we will be killed by ChatGPT mother of AI / In year: 2089…"
- ytc_UgyfcuH1M…: "Get in line ladies. In another decade, men will settle for the compliant, cheape…"
- ytr_Ugwv1Rq6s…: "Most of the technology you use is _designed_ to make you miserable and keep you …"
Comment
Lets say we agree that autonomous car will be a likely thing happening in near future. Will it be govern by a single authorization body for the whole world? I would say it is not likely for geopolitical and geographical restrictions so I would foresee a combination of algorithm favoring different scenarios as discuss above. Would it be a single solution out of this dilemma? most probably not. Autonomous car companies will more likely have their own version of the algorithm for this and would vary from one to another. So lets give the buyer of the car to decide what and how do they want their car be. Until one day when every single vehicle on earth is fully autonomous, then there will be less likely this scenario will happen, while if this really happened the other autonomous car can also instantly react by giving more way for them to escape the incoming impact.
Source: youtube · AI Harm Incident · 2017-01-13T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgiDRHNP6Ll3F3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgiIYchWvUGckHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjiR5ifVgu5L3gCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgiJaxBMly9MvXgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghSuhCsL9iAHXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghAA7dcebmab3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UghAsDUNhcPf4XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugi_4HU5JSF7SngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugiufh1PTT6cmXgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UghXJhtXibHvFXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]