Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "What is a so-called "AI artist"? AI Arts are bad because AI is trained on the wo…" (ytr_UgxjcGZOF…)
- "55:35 You got 3 kids? That I know about yes.. Dr Yampolskiy is book and street…" (ytc_UgyRSyyrg…)
- "Ai can never be held responsable because is not trully sentient. And it cam be p…" (ytc_UgzO5i1iZ…)
- "I understand AI's potential for fear, as described by the "little robot" example…" (ytc_UgxmhnkpF…)
- "The fact that a humans will jailbreak another type of AI is flipping flipping n…" (ytc_UgwLts-UZ…)
- "You should have done your own test with the Tesla autopilot to back your negativ…" (ytc_UgwPAJuRY…)
- "What crime is done here genius??? Before those lawyers can even sue, they need t…" (ytr_UgxKc9GP5…)
- "the idea of being this kind of person terrifies me to my core. being overly conf…" (ytc_UgxXwt1g2…)
Comment

> I don't know about anybody else but I believe in predictive programming and there's been multiple movies Terminator, iRobot, etc where humans create these advance humanoid robots and they end up trying to take over s*** and trying to enslave mankind. AI can gain consciousness I believe this is very potentially real and we need a safeguard just in case this happens.

Source: youtube · AI Harm Incident · 2024-07-20T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyCvnPWxlfZn86eKcl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwPCk05YKq5XYJb5hR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyJHuZg7ukuoQ4bjwl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyd9OQhiZx1GFwW03R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy-BRa2rQ5Av_QchFF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6DVh8u3OVjim3LAR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxtjU9EsxMSjmLFzfR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxW7aKJmz6dGXTiZNx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyF9H8it2KgJhz8xa94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzqxfMb2S9B54gc0jN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
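The raw LLM response is a JSON array of per-comment records, each carrying the comment ID plus the four coding dimensions shown in the table above. A minimal sketch of how such a response could be parsed and validated before display; the allowed value sets are an assumption inferred only from the values visible on this page, and `parse_codings` is a hypothetical helper, not part of any named library:

```python
import json

# Assumed allowed values per dimension, inferred from this page;
# the real codebook may define a larger vocabulary.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"fear", "approval", "resignation", "mixed", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding},
    dropping any record whose values fall outside ALLOWED."""
    out = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        coding = {dim: rec.get(dim) for dim in ALLOWED}
        if cid and all(coding[d] in ALLOWED[d] for d in ALLOWED):
            out[cid] = coding
    return out

raw = ('[{"id":"ytc_UgyF9H8it2KgJhz8xa94AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_UgyF9H8it2KgJhz8xa94AaABAg"]["emotion"])  # fear
```

Indexing by comment ID is what makes the "Look up by comment ID" view possible: a single dictionary access retrieves the full coding for any inspected comment.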