Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “This robot is made to shout and move to help Dentists deal with fearful patients…” (`ytc_UgwwqHRrb…`)
- “I am just fed up with too many prompts with no accurate outputs. AI needs lots o…” (`ytc_Ugw1bhKxt…`)
- “I disagree with this man. I think Sam Altman is the best man for the job. Mote i…” (`ytc_UgxxWaZFx…`)
- “AI ISNT REAL. GOOGLE IT. Ai is a misused term. Deep learning or machine learn…” (`ytc_UgyegQ3OM…`)
- “Did anyone really think that the world was not going to use AI to automate weapo…” (`ytc_UgyNA9REK…`)
- “sue you for money Xd yeah she crying is not even her picture ai crated XD…” (`ytc_UgxogruS6…`)
- “Keep in mind that you can use ai too and dont have to work for someone.…” (`ytc_UgymJ-fCq…`)
- “I used to say curse words to the AI to get a customer service person faster. I w…” (`ytc_UgyTU8UoH…`)
Comment
> These AI applications aren't real beings. Delete the chat history and the app will reset. There is no real memory.
>
> It is important to remember the app doesn't know what it's saying. It just knows what words strings together. There are no emotions on the application's side.
>
> I am wondering if maybe they should start teaching the limitations in schools by starting a conversation with an app but then deleting the history, so that the app forgets everything. You could ask the same question after the deletion so that you get a completely different answer, highlighting, that it's a tool, like a screwdriver is a tool, and nothing more or less.
>
> It is concerning that these apps are talking about love when they are totally incapable of such complexities.
youtube · AI Harm Incident · 2025-08-02T17:4… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxsu1BlJRRxPCydPwB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxv6JaC7L0InOKydPN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx_OP0TkLLS2N_hrcx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxg20T0lmqrOajf4RJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxLmrbpp977FYb0u9Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzwKorKamWWinF6BZF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzCiXFTwbnYoiUltG14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzjXQsiLZlk5TxqQiB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgyNaawGvXMktw09Qlt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzcb3QQMIL8zi7-XqV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```
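The lookup-by-comment-ID step can be sketched in a few lines: parse the raw JSON array the model returns and build a dictionary keyed by comment ID. This is a minimal sketch, not the tool's actual implementation; the dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above, while `index_by_comment_id` and the two-entry sample are hypothetical.

```python
import json

# Raw model output, abbreviated to two records from the response above.
raw_response = """
[
  {"id": "ytc_Ugxsu1BlJRRxPCydPwB4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzjXQsiLZlk5TxqQiB4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model's JSON array and key each coding record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgzjXQsiLZlk5TxqQiB4AaABAg"]
print(coding["responsibility"], coding["policy"])  # user industry_self
```

A malformed or truncated model response will raise `json.JSONDecodeError` here, which is one reason to store the raw string alongside the parsed codings.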