Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- "The Elephant in the Room, that no one will say publicly in fear of being elitist…" (ytc_UgwEL5z9o…)
- "Global collapse. System collapse. If we go down, let's take them with us (edit: …" (ytc_UgwV6tQPJ…)
- "I continue to see the possibilities of AI downplayed by “experts” but then I als…" (ytc_UgyUb6KuJ…)
- "I sometimes use AI to create something and then I redraw it. This I would consid…" (ytc_UgwF2Mt1Q…)
- "2:35 that’s ridiculous Eldon ring is distinct from dark souls or any other form …" (ytc_UgxrvZ_9v…)
- "This is fascinating. I have a book started that is 10 chapters long so far (may…" (rdc_jdjqmqo)
- "This is every art snob's favorite bullshit argument. You romanticize the idea o…" (ytr_Ugzqu1OTh…)
- "As a machine learning enginner i will give this answer 3/10 as "machine learning…" (ytc_Ugyy2jS8o…)
Comment
In the chemical engineering world, we have an organization called the Chemical Safety Board (CSB). One of their tasks is to investigate the cause of major chemical incidents, notable cases being Bhopal and Three Mile. The findings are shared online for anyone to view.
It seems like something similar would be extremely beneficial for autonomous vehicle safety.
youtube · AI Harm Incident · 2024-12-27T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxb3j5LNph5-Axj8wJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz2nbJizJfG-FXadXV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxssZXuYG9XmyRbftl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxUh0QQ43e_mpgGs4F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyklKdwJy5yI8OziQh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwrBep8iVLogY7VL_t4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyCRa1nNcINqSLNogF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxgzKFLZ1UL1tB7cF94AaABAg","responsibility":"user","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxo3_VluY300m7WmEl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxOTHJqZAxYBAxsFzp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
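The raw response is a JSON array with one coded record per comment, which is how a single comment's row in the Coding Result table can be recovered from the batch output. A minimal sketch of parsing and validating such a batch, then looking up a record by comment ID — the set of allowed values per dimension is an assumption inferred from the records shown above, and `parse_batch` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Allowed values per coding dimension (assumed from the records above;
# the real codebook may define more categories).
SCHEMA = {
    "responsibility": {"none", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "resignation", "approval", "indifference", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coded_record},
    rejecting any record whose dimension value is outside the schema."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Usage: look up one coded comment by ID (hypothetical ID for illustration).
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
coded = parse_batch(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Validating against a fixed value set at parse time catches the common failure mode where the model invents a category label outside the codebook.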