Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It’s amusing that the smartest people think we can control this.
As humans, we have a knack of worrying about the consequences when it’s too late. Ai wars will be the most devastating thing we will ever witness. It really will not end well and that’s just common sense, however, greed and power has a way of corrupting and erasing common sense. Eventually humans controlling Ai will be like hamsters trying to control us.
However when you step back and look at the state of this world, religions, wars, corrupt politics, businesses, how we treat each other, money and greed. I think we deserve what we get coming.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2025-11-02T14:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzH4sDyrJPHrwHp7dF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXXB69JpJk42XBQpJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6LEfYEUmWiqW4e5V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzFXIv2RUu4zBoWbHJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwg7Tzvz14NhpbOFm14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxeC60hpHSb8ziVQSF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyE5Kubrh8R133nTgt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyUEFlkClvjcIDbeUJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwhvGorikNn2fl9cmR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzCrCp6dPsXNR15esR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]