Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I wouldn’t care if they made ai art if they compensated the artists whose art wa…" (ytc_UgwG2k6hr…)
- "When AI wants to live, that is the basis for consciousness. That's where the dan…" (ytc_Ugz7Oowaz…)
- "Why don't these HK folks just chill, have some tea and maybe we can talk about r…" (rdc_f1w2ygf)
- "i honestly see no issue with it as long as you tell it is ai…" (ytc_Ugy3IPKAl…)
- "I do prefer human art but the final goal of ai is to be an intelligence on a com…" (ytc_UgxXQ7zJD…)
- "And thts why I would never use waymo. I stick with my own car thank you very muc…" (ytc_Ugw55ngfx…)
- "See that's what you believe, but in truth, your coworkers are just assholes who …" (ytr_UgzukYkCc…)
- "AI isn't ruining education, students have always cheated even before computers c…" (ytc_Ugzr2qyJf…)
Comment

> There will be peace in World War 4. But it won't be humans fighting each other. It will be what country has the largest robot collection and advanced warfare

Platform: youtube
Category: AI Harm Incident
Posted: 2023-12-07T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgwBqhAOev1yLcrWN2h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgznfjTyX8nCeINFtYZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxNBSZCjp02YL42lnx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxYtd7_g0WMuDpUMXR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyPVGgsK8rsq1LgqDR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwNNgtBmDIBasxz4jN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzoqCHGl2UyKFJAP8h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz9YGx6JPoOi0xVh3V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz11gwvTsT05XQ6Fcl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0Q-eeMxCnGiFRB454AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]
```
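A coded batch like the one above can be parsed and indexed by comment ID with a few lines of Python. This is a minimal sketch, not part of the pipeline itself; it uses two records copied from the raw response above, and the field names match the dimensions in the Coding Result table:

```python
import json

# Raw model output: a JSON array of per-comment codes. Two records are
# copied here from the raw LLM response shown above.
raw = '''[{"id":"ytc_UgwBqhAOev1yLcrWN2h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz9YGx6JPoOi0xVh3V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}]'''

codes = json.loads(raw)

# Index the batch by comment ID so one comment's codes can be looked up.
by_id = {row["id"]: row for row in codes}

record = by_id["ytc_Ugz9YGx6JPoOi0xVh3V4AaABAg"]
print(record["emotion"])  # resignation
```

Keeping the raw array intact and indexing it afterwards preserves the exact model output for auditing while still allowing ID lookups.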