Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Hey there! It's fascinating how realistic AI technology has become, right? If yo…" (ytr_UgzoX77q3…)
- "I agree, like the internet we may see a bubble pop, and then "AI 2.0" will be bo…" (rdc_oi488lk)
- "@mierusvision Have you taken an honest look around? Go ahead and find me a non …" (ytr_UgxGfRJ6u…)
- "I'm waiting for an ai to call me a monkey cuz that would be fuckin hilarious…" (ytc_Ugy6DKSMr…)
- "Who was the self-centered greedball who thought AI was a great idea and made it …" (ytc_UgytMIivT…)
- "Having just googled MSNBC,s coverage of Gemini woke AI this is us what comes up,…" (ytc_Ugz_tGPE8…)
- "Why do i see doja cat in the robot who is standing in the middle 😭…" (ytc_Ugz5-PUbr…)
- "At least have the robot wear gloves. The robot is punching with sledge hammers …" (ytc_UgyNAXQVa…)
Comment
This smart guy says we will just simply stop making them 😂😂 he sounds like my 4 year old. No way they could ever build their own army and write their own programming. No way a kid could program it to do school shootings. Those are just two heinous examples from a short bus rider.
Edit: basically the AI will build more AI themselves obviously with her own programming without man’s knowledge
youtube · AI Harm Incident · 2025-07-24T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxux-vlC0QAoJ1V6Il4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyQkS5EqElIyjZlolN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxvGGil7CMz76xBu0N4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx8xY-U46pZxqkrdm14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzdR5tTtQ-_btn5KRp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzrDPFmB3tq0oe2Awx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxk72A-Gd8BxQLr1GB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzO_quQoh7f0Aey3WN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyu3ZXQgDYu7TjRlgt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQObOk8sBy0sy1_HF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
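The raw response is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such a batch could be parsed and validated before storing, assuming a codebook built from the category values visible above (the real codebook may define more categories, and `parse_codes` is a hypothetical helper, not part of this pipeline):

```python
import json

# Allowed values per dimension (assumed from the examples above;
# the actual codebook may include additional categories).
CODEBOOK = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "approval", "resignation", "indifference", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    rejecting any record with an unknown dimension or value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.pop("id")
        for dim, value in rec.items():
            if dim not in CODEBOOK or value not in CODEBOOK[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = rec
    return coded

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
```

Validating against a fixed codebook catches the common failure mode where the model drifts outside its instructed labels, so bad records fail loudly instead of silently entering the coded dataset.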