Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Calling yourself an ai """"""""artist"""""""""" is like seeing someone walk and …" (ytc_UgysoChRp…)
- "This video is AI generated. It ignores that the financial outlook for AI is ble…" (ytc_UgwO__VVQ…)
- "There are always wicked or ignorant people who don't care about the negative con…" (ytc_UgwJS2psW…)
- "If ai was used for good, to help with healthcare and poverty, increasing housing…" (ytc_Ugy8mTX3X…)
- "Don’t forget about those of us who are disabled. We deserve to be part of the be…" (ytc_UgxkLj_q3…)
- "Remember: AI is programmed by people. They know what they're "teaching" it. They…" (ytc_Ugz6cCovN…)
- "LaMDA is NOT an AI. It's not doing ANY reasoning. It's LITERALLY performing pat…" (ytc_UgzDRwQHW…)
- "Are you an artist? Just an fyi, Blender does have Ai in it. Blender is also a gr…" (ytr_Ugyr9sNOk…)
Comment
Can I spot a fatal flaw in Tesla's autopilot? Yes, "Tesla's autopilot"! I think this self-driving car bullshit is simply insane. It's like indiscriminately launching a 2000 pound chunk of metal down the freeway & hoping that it doesn't hit/kill anyone before it reaches its destination. And that's just ONE self-driving car. San Francisco has gone nuts with Waymo, and look at how many problems there are with those things. I don't understand how the government allows this shite. Are they trying to kill us all off, to control population numbers or something? Anyway, thanks @FortNine for the informative video =)
Source: youtube · AI Harm Incident · 2025-05-20T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzJNEfM7JIy-4BtlW14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxRJCvVT1K5vjJxgDF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugz8bEyHppyqCBtTBAZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxkSu6gmjJwioD6L6V4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwBIjBmfm8Fi1kpxxt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgztTBd7rKadrNPR1iZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxY9QxyWSaPcBhfCtZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyKbfNwwkRWeQb1Pg14AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwfkwutGmdbAljavg54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwDXYK0ayz9e7s_i0p4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
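A raw batch response like the one above can be parsed into per-comment records and checked against the coding dimensions shown in the result table. This is a minimal sketch, not the tool's actual pipeline: the `ALLOWED` value sets are assumptions inferred only from the values visible on this page, and `parse_batch` is a hypothetical helper name.

```python
import json

# Value sets observed in this page's raw response; assumed, not a
# documented codebook -- extend these sets if the schema defines more values.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response (JSON list) into {comment_id: codes},
    rejecting any record whose dimension value is outside ALLOWED."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a one-record batch (hypothetical comment ID):
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded["ytc_x"]["policy"])  # -> regulate
```

Validating at parse time keeps a single malformed model response from silently contaminating the coded dataset; a stricter variant could also reject records whose `id` is not in the set of comments actually submitted in the batch.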