Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
believe me or believe me not, but I’m not kidding when I say this.. my ex made an AI bot of me. Not just any normal AI bot, one of the FREAKY one and the worth thing is, he proudly showed me it and expected me to be happy about it?? I wish I had reported that because he ended up stalking me for multiple months after that. I will forever stand by the fact that AI shouldn’t be in public access, it gives people too much power. There has been quite a few cases where people have used AI for the totally wrong reasons and it’s unsafe.
| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Harm Incident |
| Timestamp | 2025-07-24T12:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgypDWy2FWhGCa-0_8R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzz3tgkeEPyVNH5Kfh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxuu5u2xn_10dFz4WZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxrJFfGbjnZE6zWy0d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyhMfTaPDaTjWwLB_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzeucPDxS_NPIq5snt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzcg1d6HONDrS56P6p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgygC9R9EOK-jVHxCCV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzTiEtAxdMrEFM_ooR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwkeYdd8jumPnc7gq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
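The batch above is a flat JSON array with one object per coded comment, each carrying the same four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a raw response might be parsed, validated, and indexed by comment ID — the function name, variable names, and the truncation to two entries are illustrative, not part of the tool:

```python
import json

# Raw LLM batch response as shown above (truncated to two entries for brevity).
raw = '''[
{"id":"ytc_UgypDWy2FWhGCa-0_8R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxrJFfGbjnZE6zWy0d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

# Every coding object must carry the comment ID plus the four dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_response: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    rows = json.loads(raw_response)
    codings = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"coding is missing keys: {missing}")
        # Keep the four dimensions, keyed by the comment ID.
        codings[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return codings

codings = index_codings(raw)
print(codings["ytc_UgxrJFfGbjnZE6zWy0d4AaABAg"]["policy"])  # → ban
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup cheap: one parse of the batch, then constant-time retrieval per comment.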