Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "US: Join our AI coding team! \*2 years later\* US: We are firing all the coder…" (rdc_k8s06nk)
- "Hot Take: both extreme sides of this argument are wrong. Charlie is right: Any…" (ytc_Ugx9TdPCc…)
- "Art is a way creative people express an idea they have. Prompting ai isn’t art b…" (ytc_Ugy31Hiui…)
- "art is nothing more than ramachandrans yellow striped beak. the artist is one w…" (ytc_UgyeRKwcf…)
- "A million AI models writing a million books at a time will take a lot less than …" (rdc_lz5rwu4)
- "interesting how talking to an AI is kinda like talking to someone without real e…" (ytc_Ugx8ZkEjN…)
- "So horrible wrong that tech companies are increasing AI investments up to 200 bi…" (ytc_Ugyg_tZQu…)
- "Even if it did start to happen, couldn’t we just get rid of the internet. It wou…" (ytc_UgzW71ADw…)
Comment
Back when Google began developing self-driving cars, they said that the development is the easy part but that once the technology rolls out and the first people die in crashes that were not magically prevented by ai replacing the human component in only some vehicles, that would be when the actual battle begins and it gets really expensive to defend the technology against those who believe that a hundred deaths caused by self driving cars are much worse than ten thousand deaths caused by only human failure. Now, suddenly it starts to make sense why they made Elon so extremely rich. I guess that way he can survive the time until society accepts that even if not 100% failproof, self driving cars are actually a lot safer on average. Fascinating!
Source: youtube · Topic: AI Harm Incident · Posted: 2025-10-19T16:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwV5UQXPw2H4R9Uzqt4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwKR2UGzjgsCR85Oal4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxnQ3HCy1z-6qClJq14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwXgHJvTtHvLLYiIb54AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz-hsjLkvAau0I8DlZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugysg6rwsyzaIoJhnu14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxE0nyxr64eRE3ZhQl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwloWy2D0uUw94JxyB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyQhUzoOQhvb6E4Uv14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx1jTYI9x8D9xXm5Ml4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
```
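The raw response is a JSON array in which each element carries an `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID is shown below; `index_by_id` is a hypothetical helper written for illustration, not part of the actual tool, and the two rows in `raw_response` are abbreviated from the array above.

```python
import json

# Abbreviated example of a raw coding response: a JSON array of coded comments,
# each with an "id" and four coded dimensions.
raw_response = """
[
  {"id": "ytc_UgyQhUzoOQhvb6E4Uv14AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx1jTYI9x8D9xXm5Ml4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw coding response and map each comment ID to its coded dimensions."""
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

lookup = index_by_id(raw_response)
print(lookup["ytc_UgyQhUzoOQhvb6E4Uv14AaABAg"]["policy"])  # liability
```

With the full response loaded, the same lookup would reproduce the Coding Result table for any comment ID on the page.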