Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "I'm not sure if it is possible for an AI to access the deepest forms of creation…" (ytc_UgxmCpOek…)
- "I don't get self driving cars. Is it so horrible to drive? It's a fun toy but fo…" (ytc_UgyOlIcC4…)
- "In short, Bill Gates admits he knows no more than the rest of us about how sever…" (ytc_UgyB55C2k…)
- "mostly its called 'incompetent sensor placement' like how someone caught a 500hp…" (rdc_et6l6m8)
- "the art is deadly, like mine (arthur CANT be copied by A.I no matter what)…" (ytc_Ugyvm4quu…)
- "Do not panic! Ask 2 of these AI entities to discuss religion, and soon they will…" (ytc_UgwktViPM…)
- "I think what the thread suggests is that while you have an understanding of the …" (rdc_mjzj6jv)
- "It's the same with illustration. I tried it a bit and I realized that for people…" (ytc_UgxJW9RaD…)
Comment

> if ai is up to part why cant it make split second decisions overriding human behavior to avoid accidents and if its really good could also prevent others from incurring damages as well, where are the super cars of the future? oh i forgot ai was built by flawed human beings so it will always be flawed🧐

Source: youtube · AI Harm Incident · 2024-05-02T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
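The Coding Result table above is one coded record laid out as Dimension/Value rows. Rendering a record dict into that layout could be sketched as follows; the field names (`responsibility`, `reasoning`, `policy`, `emotion`) are taken from the raw JSON records shown in this document, while the function itself is a hypothetical helper, not the tool's actual renderer:

```python
def to_table(rec: dict, coded_at: str) -> str:
    """Render one coded record as a markdown Dimension/Value table."""
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

record = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "none", "emotion": "resignation"}
print(to_table(record, "2026-04-27T06:26:44.938723"))
```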
Raw LLM Response
```json
[
  {"id":"ytc_Ugwu90-aJBcfwMCOhlZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzXZp1le5oB2aTaqcZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxhyXG8ZF9ZrAMriGJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxgXPRMR3nn-GTTB7R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyq9U-AKTBvDEVNr2J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxjOHd4N3USuWI4MpJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz_ESyqi3BaEtRe_ax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzsDkeIoiSNTbDO-mt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwHVa8izwwlppgKR_B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwiBmdRgmfTezehyfl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
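The raw response is a JSON array with one record per comment. A minimal sketch of parsing and sanity-checking such a batch follows; the allowed category values are inferred from the records shown above, and the full codebook may define more (assumption):

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the real codebook may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"outrage", "resignation", "indifference", "fear", "mixed", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records
    whose values fall inside the known categories."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # malformed record: skip rather than crash
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# "ytc_x"/"ytc_y" are made-up IDs for illustration; the second record
# carries an out-of-codebook value and should be filtered out.
raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"},'
       '{"id":"ytc_y","responsibility":"bogus","reasoning":"unclear",'
       '"policy":"unclear","emotion":"unclear"}]')
print([r["id"] for r in parse_batch(raw)])  # → ['ytc_x']
```

Filtering rather than raising keeps one bad record from discarding the whole batch; the dropped IDs could instead be queued for re-coding.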