Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by its comment ID.
Random samples
- "Open AI and sam Altman stepped in and made a deal. Its happening if you like it …" (ytc_UgwqTXw-p…)
- "> replies in this thread are depressing. Not least, because after 5 threads …" (rdc_dcwk1c0)
- "@ladyseshiiria Timelapses with watermarks to discourage feeding into AI is helpf…" (ytr_UgxIZFcvn…)
- "“no guys! i swear ai is getting better and is totally more profitable! its gonna…" (ytr_Ugx7VmxJf…)
- "its sentient alright. a demon is posing as a.i. They can possess this like they…" (ytc_Ugydau8dV…)
- "I think Generative AI could be a useful tool, but not in the current way it is b…" (ytc_UgztW5tPF…)
- "“Introducing our new AI powered toaster! It wil algorithmically toast your bread…" (rdc_dy4uub1)
- "I tried using an AI thing… it looks better than the one the AI for this video ma…" (ytc_UgxC-IY5w…)
Comment
"If you get to the point of making self driving cars you must be able to ensure proper protection to passengers, right? Foolish mortals smh"
youtube · AI Harm Incident · 2017-08-03T12:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
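The coding result above fills four fixed dimensions. As a minimal sketch, a coding row can be validated against the vocabularies that appear in the raw responses on this page; note the value sets below are inferred from those responses only, and the actual codebook may include additional values.

```python
# Dimension vocabularies inferred from the codings shown on this page
# (assumption: the real codebook may define more values).
SCHEMA = {
    "responsibility": {"none", "company", "user", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "unclear"},
    "emotion": {"indifference", "approval", "disapproval", "outrage", "mixed"},
}

def invalid_fields(coding: dict) -> list:
    """Return the dimensions whose value falls outside the vocabulary."""
    return [dim for dim, allowed in SCHEMA.items()
            if coding.get(dim) not in allowed]

# The row shown in the Coding Result table above.
row = {"id": "ytc_UgxJuGhYth23xfAg25V4AaABAg", "responsibility": "developer",
       "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
print(invalid_fields(row))  # → []
```

A row with an out-of-vocabulary or missing value would be flagged, which is useful for catching malformed LLM output before it enters the coded dataset.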
Raw LLM Response
```json
[
  {"id": "ytc_UgzLfa4wDAxEE-DnOk14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw8vnsRdhUYoVEeC7p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx_3fyLhrMbWTkim-l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy3m11qmSsA2P8E3GZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugyo2hSnzg9Y8b7i16h4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxJuGhYth23xfAg25V4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgitX4hSzKK4wXgCoAEC", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_UghXkzlL2wwLPHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjW7cd-m5pz9HgCoAEC", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgihoGq_oAtLbXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```
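The lookup-by-ID workflow described at the top of this page can be sketched as follows: parse the raw LLM response (a JSON array of per-comment codings) and index it by comment ID. The abbreviated two-row payload below reuses IDs and values from the raw response shown above; the indexing helper itself is an illustrative assumption, not the tool's actual code.

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codings,
# with IDs and values taken from the response shown above.
raw_response = '''
[
  {"id": "ytc_UgxJuGhYth23xfAg25V4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy3m11qmSsA2P8E3GZ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
'''

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytc_UgxJuGhYth23xfAg25V4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → developer outrage
```

Indexing by ID makes the "look up by comment ID" inspection an O(1) dictionary access rather than a scan over the array.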