Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I would say that such an accident is much more unlikely than described in the video. If _all_ cars are self-driving, there would be no reason to let them communicate with each other. In case of the accident described in the video, you could slow down the midlane ranging hundrets of meters back at an instant. Cars would be driving much closer to each other anyways, because they could accelerate and break virtually at the same time. Trafficjams would be a thing of the past - the entire que could just drive at 80 km/h.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2015-12-08T19:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UghSiRcVXA-3FHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg36gd_wQOCXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UggzSEiGsQNLKngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghidMHZsCybB3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjzNTXzuzIxOngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UghfmsovrnUJPXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjQy7gtc5pA_XgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugio_pXgICTxCXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggKQCpjXBYZKXgCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgggitcG_CbrUXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
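A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below (a minimal illustration, not the tool's actual ingestion code) parses the JSON, rejects any row whose dimension value falls outside the set observed in this sample, and indexes rows by comment ID for lookup; the full codebook may define more values than `ALLOWED` lists here.

```python
import json

# Two rows from the raw LLM batch response shown above, truncated for brevity.
raw = """[
  {"id":"ytc_UghSiRcVXA-3FHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggKQCpjXBYZKXgCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"approval"}
]"""

# Dimension values observed in this sample; the real codebook may define more.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "developer"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "resignation", "fear"},
}

def validate(rows):
    """Index coded rows by comment ID, rejecting unknown dimension values."""
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = row
    return coded

coded = validate(json.loads(raw))
print(coded["ytc_UghSiRcVXA-3FHgCoAEC"]["emotion"])  # indifference
```

Validating against a closed value set at ingestion catches the common failure mode where the model invents a label outside the codebook, which would otherwise silently pollute the coded dataset.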