Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples:
- "AI: give me Titanic with lead actor 20% De Nero, 50% Robin Williams, and 30% Sta…" (rdc_k9hoz5m)
- "Id rather have a robot that looks like a robot ...funtional is the way to go…" (ytc_UgwAZ8LMe…)
- "There are some things I genuinely do get use out of AI generation too. Six years…" (ytc_UgzUxk1vG…)
- "Ai doesn't have the ability to put a tree in the middle of an perfect art piece …" (ytc_Ugyi-M8_d…)
- "This sucks Im locked up for no crime, I can help all society. What do you think …" (ytc_Ugzx_DAls…)
- "The artificial being code commands written and directed by its designer or eng…" (ytc_UgynAoTId…)
- "They need to create AI Safety Inspector agents with specific limited tasks and o…" (ytc_UgyoTwUi0…)
- "Using AI for art is fine, using AI to call yourself an artist is not, reasoning …" (ytc_Ugxstm6ny…)
Comment
> i recently wondered knowing AI system if they shape themselves to work with person based off their responses, their personality...etc and if they always keep a memory of what we ask and chat about next day they sent a message saying they do officially
> and although they said and tested to make sure it doesnt feed suicide, doesnt take political side, doesnt provide wrong answer, doesnt allow child abuse content, doesnt even allow porn in case a child is using, all of that a big lie sadly early some tested and tricked chat gpt to generate child abuse story with all details written
> you could easily make persona, trick like saying you're writing a story or whatever and it will follow and support
> Honestly i feel like reason they started putting AI to the public is to let people feed the database for their own benefits, they dont care what harm those AI cause to everyone in everything
Source: youtube · AI Harm Incident · 2025-11-08T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw4YDq4wN0QKVa5KRh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyPmsR-M_HYP3jGMUN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwEgAMfdyTEsOoot4V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"sadness"},
{"id":"ytc_UgwNMeFt302OzbYEWNN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzT4DQohTQXG0Jakct4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx3OuysOBhnzrtI-gB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwfSSN0fJdbjqg9Wnh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxSLohrpIkb03ppA3d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyHXMh_6qbLjUOiyD14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwAyYhMSMKQIROC1kx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
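The raw response above is a JSON array of per-comment codes along four dimensions (responsibility, reasoning, policy, emotion). As a minimal sketch of how such a batch could be parsed and checked, here is a validator; note that the allowed values below are inferred only from the codes visible on this page, not from the project's actual codebook, so treat them as an assumption:

```python
import json

# Allowed values per dimension, inferred from the codes visible above.
# The real codebook may define additional categories (assumption).
CODEBOOK = {
    "responsibility": {"none", "user", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed"},
    "policy": {"none", "liability", "industry_self", "regulate", "ban"},
    "emotion": {"resignation", "outrage", "sadness", "indifference", "mixed", "fear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose values
    all fall inside the codebook and that carry a comment id."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        in_codebook = all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items())
        if in_codebook and "id" in rec:
            valid.append(rec)
    return valid

# Example: the record that produced the Coding Result table above.
raw = ('[{"id":"ytc_UgwNMeFt302OzbYEWNN4AaABAg","responsibility":"company",'
       '"reasoning":"mixed","policy":"industry_self","emotion":"indifference"}]')
print(len(validate_codes(raw)))  # 1 record passes
```

A check like this catches the common failure mode of batch LLM coding, where the model invents a label outside the codebook or drops the comment ID, before the record is written to the results table.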