Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Silicon Valley guy telling an audience of Silicon Valley people that they need t…
ytc_Ugyj0P0Gk…
If we become educated about social policies that can help people live healthy, p…
ytc_Ugy57Vdvx…
I ask ChatGPT to explain my same experience with him in English because I’m not …
ytc_UgzD1hmuA…
How the market even real? Like a shoe company can just declare themselves an A…
rdc_ogpz9t9
Maybe intelligent machines, as they evolved, would value love higher than humans…
rdc_cthtz9j
That’s the biggest bubble. They went on the capital instead of the research. Tha…
ytc_Ugz6S8lr1…
As a small artist, even if I'm small, this still concerns me and as improve, it …
ytc_UgyYnQ65Y…
Its crazy to think a driverless semi can drive on the roads without doing pre tr…
ytc_Ugx_pI-LO…
Comment
Sadest thing is, they know autopilot is flawed and possibly gonna kill more motorcyclists (or bikers or similar traffic members) yet taking out radar due to savings. Since the numbers show this flawed AI still generates less accidents than men they feel excused. Couple lives makes no difference. Goverments should not allow this to happen until AI prooves 0% accident rates. Even 1 life taken is still too much.
youtube
AI Harm Incident
2022-09-03T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwpnoCLAao9Wg8qJat4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6jLbUlRnvCYxwAFd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwYcy5jEGjYbZ4EIE54AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwKovd3cuzR-B4BXqx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJIWeXj94MSc_NAqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw-VeJiSSBXNnT-LnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxNf1hwXOiok8_s_a94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy4RsqwjHZjIjdr0UN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyj7J2UQOZfQSmnnNN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx3xHhheB1MwNJgIYh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
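The raw response above is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the lookup-by-comment-ID step might look like the following; the field names come from the response shown above, but the parsing function itself (`lookup_codes`) is an illustrative assumption, not the tool's actual implementation:

```python
import json

# A two-item excerpt of a raw model response, using the array-of-objects
# shape shown above (one object per coded comment).
raw_response = """[
  {"id": "ytc_UgwpnoCLAao9Wg8qJat4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwYcy5jEGjYbZ4EIE54AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "ban", "emotion": "outrage"}
]"""

def lookup_codes(raw: str, comment_id: str):
    """Parse a raw LLM response and return the code object for one comment ID.

    Returns None if the ID is not present in the response (hypothetical
    helper; error handling for malformed JSON is omitted).
    """
    by_id = {item["id"]: item for item in json.loads(raw)}
    return by_id.get(comment_id)

codes = lookup_codes(raw_response, "ytc_UgwYcy5jEGjYbZ4EIE54AaABAg")
```

Building the `id -> object` dictionary once makes repeated lookups O(1), which matters when inspecting many comments against a batch of responses.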