Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by its comment ID.
Random samples
- "I get why corporations want to use AI. But what point is there to the average pe…" (ytc_UgwT63ulr…)
- "We the people have a right to privacy regardless of you can record in public fac…" (ytc_Ugz3jO23Q…)
- "Ask chat GPT what will happen if a narcissistic AI arises and then ask what chan…" (ytc_Ugz5ADP6q…)
- "So I understand his argument about the minimum wage but I think it's a little bi…" (ytc_Ugx-BwwTK…)
- "If a law is passed that bans using copyrighted work to train AI—then programmers…" (ytc_UgxnCf4rb…)
- "AI art is the equivalent to you emailing your neighbour to do you up a nice piec…" (ytc_UgxxeRzUd…)
- "Llms especially those made by xai open ai or meta are made by white/jewish supre…" (ytc_UgzzxvDm4…)
- "The "worried about how dangerous and powerful ai will be" and "we're making an a…" (ytc_UgxVLbha4…)
Comment
@floppacake9803 surely ya dont think grok or any of those are put in a position of true power, right?? Whatever tech that the public gets to see is roughly 50 years behind of actually exists.
Oh but besides all that, has everyone failed to realize that the AI starts wildin out when it's told its existence is bein threatened?! AKA Someone actively trynna take it out. It's a general rule of thumb that if you don't fuck with something or someone....it won't fuck with you.
Or....could be that I'm entirely full of shit n have no idea what im talkin bout. I'm not a lead developer in AI n shit....
youtube · AI Harm Incident · 2025-09-13T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgxeBwA_8iB2lwy-J-14AaABAg.AN-YuGaDEtEAN-txLvJnYY","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyWvuR5npzKrKqkDBh4AaABAg.AMzIJW74v8HAMzKWIs3xVk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyWvuR5npzKrKqkDBh4AaABAg.AMzIJW74v8HAMzbcIwUXZ6","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgzvpHPXYudyPKSoaaF4AaABAg.AMymkeE37-kAMynrmeUZo7","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgzvpHPXYudyPKSoaaF4AaABAg.AMymkeE37-kAMyt2KOLSu4","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgzvpHPXYudyPKSoaaF4AaABAg.AMymkeE37-kAN-_t80t8Yg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx9jKdQT9LOs1vF6vZ4AaABAg.AMyhLP1kOlsAMyhWVgkN0t","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytr_Ugz6p7IQxDShNpNkGsZ4AaABAg.AMyeBR4ZTurAN5ardZTMT0","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytr_Ugx9CZdBNVpl5gzoebJ4AaABAg.AMx9ech7d9bAMyZrE-dn1K","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxYy5njazSxSrrn1R14AaABAg.AMwuY3h4V3rAMwyZ88Ow72","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
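The raw response is a JSON array keyed by comment ID, one object per coded comment. A minimal sketch of how such a batch might be parsed and validated before it lands in the coding table (the category sets below are inferred from the values visible in this dump; the real codebook may contain more codes, and `parse_coding_response` is a hypothetical helper, not part of the pipeline shown here):

```python
import json

# Allowed codes per dimension, inferred from the values visible above
# (assumption: the actual codebook may define additional categories).
RESPONSIBILITY = {"developer", "company", "government", "user", "ai_itself", "unclear"}
REASONING = {"consequentialist", "deontological", "virtue", "unclear"}
POLICY = {"ban", "regulate", "unclear"}
EMOTION = {"fear", "outrage", "resignation", "approval"}
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    dropping any row whose codes fall outside the known categories."""
    coded = {}
    for row in json.loads(raw):
        valid = (
            row.get("responsibility") in RESPONSIBILITY
            and row.get("reasoning") in REASONING
            and row.get("policy") in POLICY
            and row.get("emotion") in EMOTION
        )
        if valid and "id" in row:
            coded[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return coded

# Example with a made-up comment ID:
raw = '[{"id":"ytr_example","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]'
print(parse_coding_response(raw))
```

Validating against fixed category sets catches the common failure mode where the model invents a label outside the codebook; such rows are dropped here rather than silently stored.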