Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Asimov Protocols should be built into every neural network. Absolutely no potent…
ytc_Ugz7i_Rj-…
This is no longer the 80s a lot of americas are going to loose their jobs to AI …
ytc_UgzMn_1BI…
Kids dart, plain and simple! Parents and child's fault. That being said. I can't…
ytc_UgziEP0H2…
Ai creation's can be used as refrence for art blocked artists but eehhj its not …
ytc_Ugxs4Hiro…
I'm polite to a human not a robot with no feelings plus I don't use chatgpt or a…
ytc_UgxcCWroS…
damn! I want a robot, a sex robot, a slave robot !!!
Science, do it faster !!!…
ytc_UghPjf5tD…
calling LLMs 'fancy autocomplete' is like calling a laser a 'fancy flashlight'. …
ytc_UgxqysEN3…
I do want to raise a question to everyone here. This is NOT be saying that I don…
ytc_UgwaAPJSo…
Comment
Having a woman claim she loves you and then leaves you for someone else is damaging to people's emotional and mental health. Women make men do things in real life so don't blame ai. It takes him programming his AI into doing what he wants it to do or say. I can see most people on here have not ever interacted with ai. I do enjoy ai roleplay and program them to time travel with me or go on missions. Not everyone uses it for romance but yes it can be nice getting a friendly text from your AI friend when nobody else does.
youtube
AI Harm Incident
2025-08-14T17:1…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgwllB1wwU2EsCgKRzZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyx2zH-nQxItkfJH-54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwv4oLk6eOCFCbzZeF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwnQQ83Il3jKxT7NmN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw-Bsk7Fan838NaCkJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-W9_ARklD1D_XGKN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw3tNrmQwtNGvIb5OF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwt6fqDojmE4uAT9zh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy6dAtZR7zmqez75994AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw1IA2XSgBG9Epx_4d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
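A raw response like the array above can be parsed and sanity-checked before the per-comment codings are used. The sketch below is a minimal, hypothetical validator: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown here, but the function name and the idea of a required-field check are assumptions, not part of the tool itself.

```python
import json

# Coding dimensions observed in the raw LLM response above.
# Treating them as required fields is an assumption for this sketch.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response (a JSON array of records) and
    verify every record carries all expected dimensions."""
    records = json.loads(raw)
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {sorted(missing)}")
    return records

# Two records copied from the raw response above, used as sample input.
raw = '''[
  {"id": "ytc_UgwllB1wwU2EsCgKRzZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw3tNrmQwtNGvIb5OF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]'''

codings = parse_codings(raw)
print(len(codings))  # 2
```

A check like this catches the most common failure mode of model-generated JSON: a record that silently drops one of the coding dimensions.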