Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_Ugx8xJz7b…: I went on r/art and posted ai art to see what would happen. I didn’t get banned …
- ytr_UgxKjiXlP…: Hallucination is an analogy, not a literal term. But there's no evidence that co…
- ytc_UgwRNt0ue…: at this point, tracers put in way more effort than AI artists :/ i hate that i r…
- ytc_UgyScaBW1…: You won't get this with midjourney. You'll need better tools, such as comfyui wh…
- ytc_UgyawQcnF…: Honestly AI wouldn’t bother me as bad if it’d ask for permission to source from …
- rdc_jke7wyr: He’s not really wrong. The timeline may be longer than you think. As of right no…
- ytc_Ugw_Jo-dt…: I've not liked AI getting better. I really did not wanted to think about it this…
- ytr_UgzSP2qSY…: @flyre_flinnigan the concept is if alternated "poisoned" picture will get in to …
Comment
@Spellbound_Rose AI can be incredibly helpful in a lot of ways, to gather information quickly, to help organize things quickly, help detect or calculate things quicklly.
Overall AI isn't a bad idea, it's an incredibly useful tool in supporting roles however there is far too little regulation on what you can and can't do with it in the west and as such things like this happen.
Not wanting AI at all because it has downsides makes little sense.
Source: youtube · AI Harm Incident · 2025-06-19T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_UgxUkKlUlfhWQEWqw2J4AaABAg.AJZkdOfZc7IAJZlfEjuyrg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgxUkKlUlfhWQEWqw2J4AaABAg.AJZkdOfZc7IAJZnRnyVc9a","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgxUkKlUlfhWQEWqw2J4AaABAg.AJZkdOfZc7IAJZrS2LMgki","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyjQKxdtADfdePK9xp4AaABAg.AJZkSetwxnkAJZu8BhUtUD","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgyzlCtiEbK4EpwWz6J4AaABAg.AAg7zaKKuSLAAg9FvJ7cZP","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"sadness"},
  {"id":"ytr_UgxnWzlKrMLuZkC6OFh4AaABAg.AAfgOxbjf17AAfx0Ohd7j8","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxnWzlKrMLuZkC6OFh4AaABAg.AAfgOxbjf17AAg1E4bRbED","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugw3YuheU_Symd9bE214AaABAg.9V0YZ2sgyoG9V2EgGk2JlS","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytr_Ugw3YuheU_Symd9bE214AaABAg.9V0YZ2sgyoG9V3ctoqsAuQ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugw2XXuE8DVPeK1n6OF4AaABAg.9V0KrLilDYM9V0YF1wG-Uj","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
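Each record in the raw response follows the same four-dimension schema shown in the Coding Result table (responsibility, reasoning, policy, emotion), keyed by comment ID. A minimal sketch of how such a response could be parsed and indexed for the look-up-by-comment-ID view; `index_codings` and `VALID` are illustrative names, not part of the actual tool, and the allowed code values are inferred only from the responses shown above:

```python
import json

# Raw model output as returned by the coding call (shortened to two records
# copied from the response above).
raw_response = """
[
  {"id": "ytr_UgxUkKlUlfhWQEWqw2J4AaABAg.AJZkdOfZc7IAJZlfEjuyrg",
   "responsibility": "developer", "reasoning": "unclear",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgxUkKlUlfhWQEWqw2J4AaABAg.AJZkdOfZc7IAJZnRnyVc9a",
   "responsibility": "user", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]
"""

# Allowed values per dimension, inferred from the codes seen in the raw
# response above (the real codebook may define more).
VALID = {
    "responsibility": {"developer", "user", "government", "company",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "sadness", "mixed", "indifference"},
}

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and build an id -> coding lookup,
    dropping any record with an out-of-codebook value."""
    lookup = {}
    for rec in json.loads(raw):
        cid = rec.pop("id")
        if all(rec.get(dim) in vals for dim, vals in VALID.items()):
            lookup[cid] = rec
    return lookup

codings = index_codings(raw_response)
print(codings["ytr_UgxUkKlUlfhWQEWqw2J4AaABAg.AJZkdOfZc7IAJZlfEjuyrg"]["responsibility"])  # developer
```

The validation step matters because coding output comes back from an LLM: a record with a hallucinated code value is silently dropped rather than stored, so the lookup only ever serves schema-conformant codings.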