Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Fuck generative AI tbh. On top of tragedies like this, it's horrible for the environment and it's destroying valuable critical thinking skills. (People using it in class *will* have horrible consequences in the future, if we have a world full of doctors, lawyers, etc. who don't actually know the material they're supposed to use while lives are in their hands.) I hope Sewell's family gets justice and peace. I can't even imagine
Platform: youtube
Incident: AI Harm Incident
Posted: 2025-07-24T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgypDWy2FWhGCa-0_8R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzz3tgkeEPyVNH5Kfh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxuu5u2xn_10dFz4WZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxrJFfGbjnZE6zWy0d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyhMfTaPDaTjWwLB_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzeucPDxS_NPIq5snt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzcg1d6HONDrS56P6p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgygC9R9EOK-jVHxCCV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzTiEtAxdMrEFM_ooR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwkeYdd8jumPnc7gq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
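A response in this shape can be checked before the coded values are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are only those observed in the sample above (the full codebook may define more), and the `parse_coding_response` helper name is an illustration, not part of the actual pipeline.

```python
import json

# Dimension values observed in this sample response; a real codebook may allow more.
OBSERVED_VALUES = {
    "responsibility": {"ai_itself", "company", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and sanity-check each record."""
    records = json.loads(raw)
    for rec in records:
        # IDs in the sample start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Usage: validate a single-record response.
raw = ('[{"id":"ytc_UgypDWy2FWhGCa-0_8R4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
records = parse_coding_response(raw)
```

Failing loudly on an unrecognized value is deliberate: an LLM coder can drift from the codebook, and silently storing an off-schema label would corrupt downstream tallies.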