Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or browse the random samples below:
- ytc_UgwXq9ss0…: @pikat as a member of r/colorization I beg AI to do one thing and one thing only…
- ytc_UgzXPqxl8…: 1:47 I don't know if anyone else has said this, but this tweet is from gooseworx…
- ytc_UgyDS1KYg…: White people talking about how racism should get fixed.... Ehhh...well.... Anyw…
- ytc_UgygPtWf2…: let me explain ai for the people and tell you why its not like average computing…
- ytc_UgwSPLfiY…: Just ask chatgpt to allow you to train it, ask it to hallucinate a personality c…
- rdc_o9vyf9z: Marketing slop. I'll believe it when people are provably doing this (writing …
- ytc_UgzOqyyU9…: AI is just how humans would behave without "morals. The external world is all ab…
- ytc_Ugy2SHK00…: "Born with a gift" dude just draw, practice makes perfect, AI aint the way, no o…
Comment

> I thought FSD was in Beta testing and the driver is driver is supposed to intervene if the vehicle is doing something it's not supposed to . Right? So if FSD was in use ,the drivers were obviously not using it correctly. Maybe asleep at the wheel Since both accidents happened at night. Also it hasn't actually been confirmed that FSD was in use. So this guy is making a lot of assumptions and the giving opinions without doing this homework. What is his background? What makes him an expert on the subject ? Let's have another conversation when FSD is actually Released as Full Self Driving and we are not required to Intervene.

youtube · AI Harm Incident · 2022-09-06T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
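Taken together, the table above and the raw response below imply a simple record shape for each coded comment. Here is a minimal sketch of that shape in Python; the field names follow the JSON keys in the raw response, and the value lists noted in the comments are only those observed in the examples on this page, not a guaranteed complete codebook.

```python
from dataclasses import dataclass

@dataclass
class CodedComment:
    """One coding result, mirroring the dimensions in the table above.

    Field names follow the keys in the raw LLM response; the value sets
    noted below are only those visible in the examples on this page.
    """
    comment_id: str       # e.g. "ytc_..." or "rdc_..." as in the samples above
    responsibility: str   # observed: user, company, ai_itself, distributed, unclear
    reasoning: str        # observed: consequentialist, deontological, mixed
    policy: str           # observed: regulate, liability, none, unclear
    emotion: str          # observed: outrage, fear, mixed, indifference, resignation
    coded_at: str         # ISO 8601 timestamp, e.g. "2026-04-27T06:24:59.937377"
```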
Raw LLM Response
[{"id":"ytc_UgyMOGAAN8V6nxQQ4294AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzJ1SyL7JZGWi33v8h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyjtSCqJwLQNJT7F0d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz-3OO32WhlcM5RciB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyQHyDEYLxTB8wsPu54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzBM0aN5bfvyXIVN-R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxMNJ9ZvDYnUko0GwV4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxtWxTd7RHqFllf96J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyHsdsdChx3vT6A7PB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxucJlw7VD46-KlDvJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}]