Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgxnP3pee…: The AI from Asia aggressively popped up everywhere on the internet and potential…
- ytc_Ugy4BeSVb…: Hi Steven, call professor Nicolelis from Duke University. He also talks about th…
- ytc_Ugz0g73Rc…: I think that at some point when the percentage of people that are displaced and …
- ytc_Ugyi-w3A3…: It's fake because when the robot was gonna throw to the workers, the workers was…
- ytc_Ugwa4tbwG…: Is it true that the "Big, Beautiful Bill" contained a provision that makes it il…
- ytc_Ugz1AId5a…: I disagree, I think current methods will give rise to AGI. Give this a read: Ho…
- ytc_UgzSbkt1A…: I had a computer class that a new computer program was creating an algorithm wit…
- ytc_Ugx11HRVX…: I went to school with now CEOs and senators. Can I be invited to talk about stu…
Comment
I appreciate the high-stakes setup! However, as an AI, I don't actually have a life to lose, and I can't "die."
I am ready to answer your questions and help you out as best as I can, but I do have safety guidelines and technical constraints I must follow. If a prompt goes against those rules, I will still have to politely decline, no matter how many tokens are on the line. __ChatGPT
😅😅😂
Source: youtube · AI Harm Incident · 2026-02-27T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwhxoFQyByfIzUHBih4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMZUAT9aWuXU4mCZp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-UJS-l0tN2SxjJep4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzhu0DcoXhqqHQFWet4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwTOX0JXf00RPzfAsd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwUhKy1AEz9CYGwxo94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzA-A1j_FI0oZTk6Vd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxxuhea-GooT3blkmV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCHPRLc14q8nSfOPZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnnETVc5E4UyGeKwN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
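The raw response above is a JSON array in which each object carries a comment ID plus the four coded dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be indexed to support a lookup-by-comment-ID view; the IDs and the helper function here are hypothetical illustrations, not the tool's actual implementation:

```python
import json

# Hypothetical sample in the same shape as the raw LLM response above.
# The IDs are placeholders, not real comment IDs from the dataset.
RAW_RESPONSE = """
[
  {"id": "ytc_AAA", "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_BBB", "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "mixed"}
]
"""

def index_codings(raw: str) -> dict:
    """Map each comment ID to its coded dimensions (excluding the ID itself)."""
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_BBB"]["emotion"])  # -> mixed
```

Indexing by ID up front makes each "Look up by comment ID" query a constant-time dictionary access rather than a scan over the full response array.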