Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
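A minimal sketch of what such a lookup amounts to, assuming the coded records have been exported to a local JSON file; the file name `coded_comments.json` and both helper functions are hypothetical, not part of the tool:

```python
import json

def load_index(path="coded_comments.json"):
    """Build an in-memory index from comment ID to its coding record."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)  # assumed: a list of records, each with an "id" field
    return {rec["id"]: rec for rec in records}

def lookup(index, comment_id):
    """Return the coding record for one comment ID, or raise if it was never coded."""
    try:
        return index[comment_id]
    except KeyError:
        raise KeyError(f"no coded comment with id {comment_id!r}") from None

# Example: lookup(load_index(), "ytc_UgwLl7k5KIo1GWqYmPN4AaABAg")
```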
Random samples — click to inspect
- "I like to compare AI to MIDI. A lot of musicians freaked out that a computer co…" (ytc_UgyFftJOX…)
- "Nope hes wrong. Ai never gains sentient or soul. Only as dangerous as is progra…" (ytc_UgzP5UDh2…)
- "AI does not stop you from creating art. AI limits the money you make with it. Ma…" (ytr_UgwQc71h2…)
- "men ! just wait ! our salvation is just around the corner ! unless they build in…" (ytc_Ugw0BUYsM…)
- "Great vid, and this made me feel better about trying to draw art again since peo…" (ytc_Ugw7Qoovq…)
- "How can we ever hope to understand AI when we dont understand ourselves? The qu…" (ytc_Ugykc_Uk3…)
- "What is really sad is the people that build this stuff. We’ve known for a long t…" (ytc_Ugy0HwN_2…)
- "As much as I hate what is happening to the job market due to ai, companies aren’…" (ytc_Ugw58oX9w…)
Comment
I mean there are populations of people that are being wiped off the map right now and that has nothing to do with AI, well very little anyway, humans can hate a whole lot better than AI will ever be able to. We don't need computers to F ourselves up, we are quite capable of that on our own. The guy at the start of the video saying "we have agency, why would we make something that could hurt us, we just don't make it" what, you mean the same way we don't make guns and bombs anymore because they can hurt us, obviously I'm just pointing out what a ridiculously flawed argument that is. If one country doesn't make it, another will and they very well may be your enemies.
youtube · AI Harm Incident · 2025-07-26T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
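For reference, the same record can be written as a small typed structure. This is only an illustrative sketch: the field names mirror the table above, the label sets in the comments are just the values visible in the raw response below, and the example instance assumes the displayed result corresponds to the matching entry in that response.

```python
from dataclasses import dataclass

@dataclass
class CodingResult:
    comment_id: str
    responsibility: str  # observed labels: none, developer, ai_itself, distributed, unclear
    reasoning: str       # observed labels: consequentialist, deontological, virtue, unclear
    policy: str          # observed labels: none, regulate, unclear
    emotion: str         # observed labels: fear, outrage, resignation, approval, mixed, unclear
    coded_at: str        # ISO 8601 timestamp

# Values from the table above; the ID is the one entry in the raw response
# below with the same labels (an assumption about which comment was displayed).
displayed = CodingResult(
    comment_id="ytc_Ugwgaa9KNdrItNIcCbF4AaABAg",
    responsibility="none",
    reasoning="virtue",
    policy="none",
    emotion="resignation",
    coded_at="2026-04-27T06:26:44.938723",
)
```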
Raw LLM Response
```json
[
{"id":"ytc_UgwLl7k5KIo1GWqYmPN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzilMHBtAM1v95ykKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwkd4r0xRW8kbkCIyN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwgaa9KNdrItNIcCbF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwAIXo120oJIJ4Q_0B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzHURxaTi6JLt4yDDh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwmSMJKgn34KLZEG6p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyZsUv6OkQY_tfdstl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyFtIOUQqK4n9V4PmJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxt2o7opW8gdwpZyHV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"unclear"}
]
```
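The raw response is a single JSON array with one object per comment in the batch. Below is a minimal sketch of how such a response might be parsed and sanity-checked before the labels are stored; the expected-key check and the helper name are assumptions, not the tool's actual code:

```python
import json

EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw batch response, keeping only well-formed records."""
    records = json.loads(raw)
    kept = []
    for rec in records:
        if isinstance(rec, dict) and EXPECTED_KEYS <= rec.keys():
            kept.append(rec)
        # malformed entries are dropped here; they could instead be logged for re-coding
    return kept

# Example, using the first record from the response above:
raw = ('[{"id":"ytc_UgwLl7k5KIo1GWqYmPN4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(parse_batch(raw))
```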