Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This isn't Kush. This is something they call the "zombie drug" also known as xyl…" (ytc_UgyqUENRO…)
- "Ikr, like what gives companies the right to take an artists work and use it to i…" (ytr_Ugwht8RD9…)
- "This interview sounds straight out of a sci fi movie intro based on an AI apocal…" (ytc_UgwifLa5g…)
- "Stop the insanity! Sanction the lords of the Silly Con valley of the dolls COERC…" (ytc_UgzZTBtN5…)
- "Put ankle weights on wheel. I have no issues with ppl sleeping if you can take a…" (ytc_UgyfcX52v…)
- "I love how A.i CAN be used, but I hate how it is currently being used, A.I shoul…" (ytc_UgzHHZXeb…)
- "ImAgine how F'ed we will be when Reparations for AI slavery comes around… humans…" (ytc_UgzcsaPku…)
- "the face off shows this has a long way to go. It is a good rubber face but the r…" (ytc_UgxPPuc_4…)
Comment

> Idk why we let self driving cars be available to the public. They needed at least another decade of technology development and testing. I would say that since the other oncoming car had to slow down to let the self driving cars through was a bad judgement on the cars part. IMO a self driving car should be more efficient than a human, make better safer choices and not slow down the flow of traffic at all.

youtube · AI Harm Incident · 2024-10-26T10:0… · ♥ 51
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzpF90llWjDivI2J0J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMp6Z2XvDIb2U-emF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugya5GXlvp467FUkxiJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyUbvRQZRiRBB09rKp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwFTWsVGzNU18gJeCB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxxEbNfjqB8u4dKJnF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx96IC7qUXgmYNV0vF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw6dpYhK6KCr_ShUyF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw58tYJ-DQZkii-WQl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwTPnka_AVbX7ssr314AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
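The raw response above is a flat JSON array, one object per coded comment. As a minimal sketch of how such a batch response could be indexed for the lookup-by-comment-ID feature — assuming the array shape shown above, with function and variable names that are illustrative rather than taken from the actual tool:

```python
import json

# Excerpt of a batch coding response in the shape logged above (two rows shown).
raw = """
[
 {"id":"ytc_UgzMp6Z2XvDIb2U-emF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgwTPnka_AVbX7ssr314AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse a batch response and index the codings by comment ID.

    Missing dimensions fall back to "unclear", mirroring the
    fallback value the coder itself emits for ambiguous comments.
    """
    rows = json.loads(raw_json)
    return {
        row["id"]: {d: row.get(d, "unclear") for d in DIMENSIONS}
        for row in rows
    }

codings = index_codings(raw)
print(codings["ytc_UgzMp6Z2XvDIb2U-emF4AaABAg"]["policy"])  # regulate
```

Indexing by ID once, rather than scanning the array per lookup, keeps each "Look up by comment ID" query O(1) regardless of batch size.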