Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.
- "Peter Thiel is running the country that is why this is happening. He thinks huma…" (ytc_UgwJsWV0g…)
- "It's sad to see someone like this loose connection with reality and just riff on…" (ytc_UgzvNtFgJ…)
- "How is it legal for the ai to be using your face/likeness for their business??…" (ytc_UgzOePfIj…)
- "Great work as always! Informative, and potentially terrifying. Anyone heard of R…" (ytc_Ugx0NI9P_…)
- "If we really need to spend massive environmental cost on AGI, I hope it could be…" (ytc_UgyKBvN5g…)
- "hmmm i dont think vibe coding will ever be viable, personally. i think somethin…" (ytc_UgzFt2_Xn…)
- "Yeah that’s true you can only receive EI for so long over here, but it’s highly …" (rdc_fn5i0oe)
- "You cannot reach AGI by stuffing a trillion tokens of text into a GPU cluster. …" (ytc_UgzEVUDdO…)
Comment
I stopped at the 33 second mark. that isn't a real danger a self driving car will ever face.
believe it or not, we have terrible reaction times to dangers in the road. the car would brake long before any human would, unless you were aware it was going to fall, in which case you would probably have slowed down ahead of time.
braking is the only solution. that and a self driving car shouldn't allow itself to be boxed in to begin with.
there's no fucking moral dilemma here, if less people die annually to car accidents because people have self driving cars, then it would be immoral to drive yourself instead of allowing the computer to do it.
Source: youtube | Category: AI Harm Incident | Date: 2015-12-10T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
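
The table above records one comment along four categorical dimensions. As a minimal sketch, the Python below models such a record and validates it against the label sets that actually appear on this page; the full codebook vocabularies are an assumption and may well be larger.

```python
from dataclasses import dataclass

# Label sets observed in this record and in the raw response below;
# the real codebook may define additional values (assumption).
RESPONSIBILITY = {"none", "ai_itself", "developer", "distributed"}
REASONING = {"consequentialist", "mixed", "unclear"}
POLICY = {"none", "unclear"}
EMOTION = {"indifference", "approval", "resignation"}

@dataclass(frozen=True)
class CodedComment:
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        """Raise ValueError if any dimension carries an unknown label."""
        for field, vocab in (
            ("responsibility", RESPONSIBILITY),
            ("reasoning", REASONING),
            ("policy", POLICY),
            ("emotion", EMOTION),
        ):
            value = getattr(self, field)
            if value not in vocab:
                raise ValueError(f"unexpected {field} label: {value!r}")
```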
Raw LLM Response
[
{"id":"ytc_UgiJaQs6F28eWHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggJ82QW9q6Yh3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghnlhSnEQZ0IngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggjXP4s7034gngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgjMw5uEv4uP13gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggU7UUEmbYyYHgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjNuOWAcDkP3HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgisvA4COAatfngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjTfq8djgy0rHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghuJ8ET5_X-j3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
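
The raw response is a single JSON array with one object per coded comment, so the "look up by comment ID" feature at the top of this page reduces to parsing the array and indexing it by `id`. A minimal sketch, assuming the response text is already in hand as a string; `lookup` and `raw_response` are illustrative names, not part of the tool itself.

```python
import json

def lookup(raw_response: str, comment_id: str) -> dict | None:
    """Return the coding record for one comment from a batch response.

    Assumes the model emitted a well-formed JSON array of objects,
    each with an "id" key, as in the response shown above.
    """
    records = {row["id"]: row for row in json.loads(raw_response)}
    return records.get(comment_id)

# Example: pull the first record from the batch above.
# lookup(raw_response, "ytc_UgiJaQs6F28eWHgCoAEC")
```

In practice the parse would be guarded, since a model can emit trailing prose or truncated JSON; wrapping `json.loads` in a try/except and logging the failure is the usual defense.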