Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing the random samples below.
- `ytc_UgxKRI579…`: "I heard enough. That robot just said she's running the show. Not the creator. Th…"
- `ytr_UgxleurLP…`: "Thank you for your comment, @gabrieldunne7323! If an AI made the video and it's …"
- `ytc_Ugx5xFKsP…`: "7:10 Whoa there my guy, did you just say you can accurately distinguish between …"
- `ytc_UgyL7G7IB…`: "I'm sure the AI that is available to the mass public rn isn't that intelligent. …"
- `ytc_Ugy3gMLPr…`: "I fucking DESPISE ai art. Like, yeah, we have skill, maybe we were “born with a …"
- `ytc_UgxRzaNte…`: "Using algorithm to see future crimes that a person might commit? Isn’t that Mino…"
- `ytc_UgyQ9xaqv…`: "I empathize with you, but have a misunderstanding about how these AI's work. Th…"
- `ytc_UgxwRYHQi…`: "The irony is that stories told as warnings are basis for lines of thought in the…"
Comment

> If ai has the capability and intelligence to do everything for us, who's to say it will want to? Why wouldn't it serve its own interests? If you look at humans for example, we serve creatures like dogs and cats with less intelligence than us only because we love them, but things like ants or flies we wouldn't care for. If we do reach a point with AI where none of us have to work anymore because everything is automated and we all live on universal basic income, it'll probably be very temporary. I don't know the future but I feel like we are not going to be on the top of the food chain if technology keeps progressing in the direction it's going

youtube · AI Harm Incident · 2025-10-14T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyXALY1dnlBCA-oZ-t4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxQgkxDyCkZX-tCleR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx79ywSX0Z_tAlsKpN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzSu7kAUMYix3TvOMx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxjPOVPBg9re8rohlh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw2fI1zA1u4aWteZf54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxsjtdsyiZ4Q1QvJiB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugyv9jyuR8cXywYnn4V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxMGrkvz7gjBOaqALV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx0NDcsAeJAvsWNGfl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
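The raw response above is a JSON array of records keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a payload might be parsed and sanity-checked before use. Note the allowed values below are only those visible in this sample (the actual codebook may define more categories), and `validate_codes` is a hypothetical helper, not part of the tool shown:

```python
import json

# Allowed values per dimension, as observed in the sample output above.
# Assumption: the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"government", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    Drops any record that lacks an "id" or that uses a value outside
    the known schema for any dimension.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Validated records can then be indexed by ID (`{rec["id"]: rec for rec in valid}`) to support the comment-ID lookup shown at the top of the page.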