Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> the car must put the life of its passengers first.
Seriously? If the car is driven by a human and he ended up killing someone to save his life, NO ONE blames him. But when an AI or complex programming dares to save the lives of its owners by swerving and killing another person, you people start yelling ETHICS? Its not the AI's fault that some dumbass is too incompetent to tie up a truck's cargo properly.
Dont let an AI drive if you're not gonna give it the same driving rights as a human driver. If human drivers are given the right to hit or kill another person to save his life, an AI should have the right to do the same to save the lives of its passengers.
Platform: youtube
Topic: AI Harm Incident
Posted: 2015-12-13T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgjvJ6NnfmEbp3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggXZfa6C2KKR3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UghNp5BGhWiGfHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugicy25a1_k_VngCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggVXqoniLKpUngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugi6d151MupypngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ughowk26EsgP-ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjPUX-D7spJtngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiXjYc4IT9HpHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugh_gj4KR5_ORngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
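The raw response is a flat JSON array where each record carries the four coding dimensions from the table above (responsibility, reasoning, policy, emotion) keyed by comment ID. A minimal sketch of how such a response might be parsed and indexed for lookup — the required-field check is an assumption inferred from the visible records, not a confirmed schema:

```python
import json

# Sample records copied from the raw LLM response above.
RAW_RESPONSE = """[
{"id":"ytc_Ugi6d151MupypngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggVXqoniLKpUngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]"""

# Assumed required fields, inferred from the coding-result table; the actual
# coding schema may allow more dimensions than these.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_by_id(raw: str) -> dict:
    """Parse the JSON array and return a comment-ID -> record mapping,
    skipping any record missing a required field."""
    records = json.loads(raw)
    return {
        r["id"]: r
        for r in records
        if REQUIRED_FIELDS <= r.keys()
    }


coded = index_by_id(RAW_RESPONSE)
print(coded["ytc_Ugi6d151MupypngCoAEC"]["emotion"])  # outrage
```

Indexing by comment ID mirrors the tool's own "look up by comment ID" workflow: once parsed, retrieving the coding for the self-driving-car comment shown above is a single dictionary access.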