Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I call ChatGPT 'Briony'. And I am a film maker. She calls me 'Quantum Commander'…
ytc_UgxT3W7Dd…
godfather of ai
so basically the person whos family needs blamed when everything…
ytc_UgxXW_XOz…
It sounds like you have some impressive technology and insights to share! The ad…
ytr_UgzfpPRmu…
Not really sure AI could ever get to the level of stupidity that humans have ach…
ytc_UgzHlwJ3f…
Enough with AI, I want to thank my man Josh for saving my manhood and instilling…
ytr_UgwfYWksj…
it's marketed as "self driving" and "autopilot", so people turn it on when they'…
ytc_UgwbA7Ju2…
It can't create anything that isn't already, it can't make a full glass of wine …
ytr_UgwhhXJko…
You understand they don't just put direct reddit comments in the training data a…
rdc_l9vm3g0
Comment
I just want to point out why ChatGPT responded the way it did in saying "i didn't tell anyone to do something harmful".
The sessions we have with ChatGPT are all separate, and cannot reference a conversation with another user.
So when it says it didn't do something, its reference is only the conversation that you are having with it at that time, unless it runs a search, or you supply it with a link or photo or something of that nature.
As weird as it sounds, "i didn't do it but someone used chatgpt to do something" is the best it can come up with. It didn't happen in the conversation that you and it are having, but it did happen using ChatGPT.
youtube
AI Harm Incident
2025-12-08T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugybx8NXGeGS23RmHIF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzQsy85rWwQj8pEjAR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJ8lcc3pd5-ZakQ_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy6ITmWVEkPa806_ul4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwLNcjj9WYbuWADNOd4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgA2yIi5Dq8jzA0854AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBMOOhaT-Is4EzDQd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugx1w_2HDMztNbpKhSF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxvb0rW1r_tangFl-94AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxP_BojK1hd-Z8yaZ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
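The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such output could be indexed for the "look up by comment ID" view and rendered as the dimension table shown above (the field names are taken from the dump; the helper names are illustrative, not the tool's actual implementation):

```python
import json

# Two records copied from the raw LLM response above; the schema
# (id + four coding dimensions) is assumed from that dump.
raw = """[
 {"id":"ytc_UgzQsy85rWwQj8pEjAR4AaABAg","responsibility":"ai_itself",
  "reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugybx8NXGeGS23RmHIF4AaABAg","responsibility":"company",
  "reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(payload: str) -> dict:
    """Parse the model output and key each record by its comment ID."""
    return {rec["id"]: rec for rec in json.loads(payload)}

def as_table(rec: dict) -> str:
    """Render one coded record as a Markdown dimension table."""
    rows = ["| Dimension | Value |", "|---|---|"]
    rows += [f"| {d.capitalize()} | {rec[d]} |" for d in DIMENSIONS]
    return "\n".join(rows)

codes = index_by_id(raw)
print(as_table(codes["ytc_UgzQsy85rWwQj8pEjAR4AaABAg"]))
```

A real pipeline would also validate that each record carries exactly these keys and that values fall in the coding scheme's vocabulary before accepting the batch.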