Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- ytc_UgyHSBHHf… : "Please, think out side of the box. There is still time for people to upskill the…"
- ytc_Ugx7HELE3… : "I do like the human made miku art however in the long run it doesn’t really solv…"
- ytc_UgzfjHkJs… : "It's clear that the hyper intelligent ai deserve human respect. If humans should…"
- ytc_UgwExT8ZK… : "istg i'm bout to start a conspiracy theory imagine the aliens the US found and…"
- ytr_UgziTrZuK… : "@adoptedson4204 Thanks for the comment! I appreciate your support in making this…"
- ytc_Ugx13EHdt… : "Self-hosted, local LLMs folks.. therapy without uploading any of your info anywh…"
- ytc_Ugw2z0lNT… : "What I hate the most about generative ai is it give everything else close-ish ab…"
- ytc_UgwDOr6J2… : "should have kept working on that education then you would be creating the AI not…"
Comment
@AntiMasonic93 WHAT? The self driving car is involved in WAY less accidents than human drivers are. They are an order of magnitude better at reducing injuries. You want public safety to GO DOWN? Why do you not care about the public's safety? Are you just ignorant of how much safer self driving cars are?
youtube · AI Harm Incident · 2026-01-30T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgwmA2sRsKCZ-AgXQHZ4AaABAg.ASbSHJXiPfRASjukXDnu6b","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgyGy_KpTtO95gJ3d-l4AaABAg.ASbPTvvu0vuAScGBx56TUl","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyGy_KpTtO95gJ3d-l4AaABAg.ASbPTvvu0vuAScGPmB_BdL","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxNoOcQ7NN4KfgLUVl4AaABAg.ASbNveR4G1qASbOaJD41yj","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugx-npkpw92W0a3AEK14AaABAg.ASbNnmj_4dmAScxCDkWs1U","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytr_UgyNiRsEYKF8TJg8pct4AaABAg.ARQOvI2KRjcARfe5lA0bma","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugybs5u1yHoMP31Nl3d4AaABAg.ARPUy9G1WI-ARPehcsuIV6","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugybs5u1yHoMP31Nl3d4AaABAg.ARPUy9G1WI-ARPqEYW2YTj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwmcgP3oFdzwKu5FBV4AaABAg.ARPIq7ew1guARR_prvSTLt","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgygM6YFXjm5vR6gmVB4AaABAg.ARP4bOJ7VmTARPLNhDUZXK","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"}
]
```
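The raw response is a JSON array of per-comment codes keyed by comment ID, one object per comment with the four coding dimensions. A minimal sketch of how such a batch might be parsed and validated before lookup by comment ID — note that the `ALLOWED` category sets below are only inferred from the samples on this page (the real codebook may define more values), and `parse_coding_response` is a hypothetical helper, not part of the tool:

```python
import json

# Assumption: allowed values inferred from the samples shown on this page;
# the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"company", "user", "none", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}, rejecting unknown values."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = codes
    return coded

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_coding_response(raw)["ytc_example"]["emotion"])  # fear
```

Indexing the parsed result by ID is what makes the "look up by comment ID" view above cheap: each coded comment can be fetched in constant time once the batch is parsed.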