# Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- `rdc_ci1ises`: ">Response: The selection algorithm can be designed so as to ensure the mainte…"
- `ytc_UgwM0Jtd1…`: "I think the most ironic part of this persons argument is that they want to use a…"
- `ytc_UgxR4eMt0…`: "Without human doing the work; human will have no income to purchase companies pr…"
- `ytc_UgygpjtHe…`: "Your AI tech isn't that great then as they've blown up the whole of Gaza. They h…"
- `ytc_Ugw9kvyeB…`: "People need to stop calling them artist, the only creative part they do is think…"
- `ytc_UgwVabwYh…`: "Once the C-suite figures out that AI can replace them, that will slow the fundin…"
- `ytc_UgzwutNu7…`: "AI does not experience the world like we do. We feel things because of biochemic…"
- `ytc_UgymCddPj…`: "A person I was in group therapy with not just used ChatGPT daily but let it deci…"
## Comment
> Y'all dumb u know how many deer I've dodged that jumped out in pitch black... for y'all to say I couldn't see her til last second ur looking through a camera..... the problem with these self driving cars is they can fail. if u put ur life in the hands of a robot/ai/anything mechanical ur an idiot. Would u point a gun with the safety on at ur head and squeeze the trigger. No because the safety can fail.. Just Stop being lazy and drive the damn car or get off the road.. the shit that bugs me is that he is completely not paying attention and he's in a 2000 lb chunk of metal going fast enough to kill.. she was dumb for walking in the road but if u were driving like u have some responsibilty and don't stare off into la la land while your driving this might have gone another way.
Source: youtube | Category: AI Harm Incident | Posted: 2019-03-01T16:5…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response
```json
[
  {"id":"ytc_Ugx32fVgS5KzwyL64-N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx_RWNhAwU6n_eqWsx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwVQR4GtvLJs2-twNR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgySzCvT9LUSDSl42E54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwCOTVPxQyOp-bxpgt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwNJxvCeNdOqJKVpX14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxCUx0iDWprQOYImJh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyk0fzQ9_a-ZFW4HIF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz3QJ8zfxhGqf00TPl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyzV4MXwTAma-38zGx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
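The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of parsing such a batch and indexing it by comment ID, assuming the allowed values are those seen in the sample output above (the study's actual codebook may define additional categories):

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# sample response above, not taken from an official codebook.
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and return {comment_id: codes} for valid records."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in SCHEMA}
    return coded
```

This makes the "look up by comment ID" step a plain dictionary access, and rejects any record where the model drifted outside the expected label set.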