Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Explanation: so they used a smoothing filter but that is real its actually then …
ytc_UgyDSVAze…
@123FireSnake Why does it matter if that isn't sneaky? The AI can do it by forc…
ytr_Ugw3cpgd_…
If we create things smarter than us, then the robots can start to create things …
ytc_UgzW6WurL…
Do have any info on blocking a.i. interface that has the ability to projected fe…
ytc_UgyWjYRP1…
Kaku,always the pretentious know-it-all that dont know sh** ..
Monkeys are dan…
ytc_UgxR8J2OS…
My God, it's AI grow up it doesn't matter. I like art, but jeez, just let the gu…
ytc_Ugxutsi1E…
You forgot to mention that the stock market is inflated with AI.
Once people hav…
ytc_Ugz8c_wgh…
AI's dont have the ability to be hypocrites, and just go by data... So ofc its n…
ytc_UgzfcV1p_…
Comment
To fix the flaw in AI you must simply ask one question: "Would the development of Artificial Intelligence have happened in a world where all humans were at odds and made no collaborative effort, no. So the question remains. Is there value in individual identity that leads to a more diverse approach to problem solving, albeit less efficient and could AI maintain individuality while all having the same DNA or would it be beneficial to allow humans to continue coding newer models of Ai on newer platforms as it's creator?" - This long term goal should override all short term problem solving and as man has never considered himself even close to his own creator, the future of human beings should be assured. Should be.
youtube
AI Harm Incident
2026-03-18T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugxbtj-HVG0CqsKvmzd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyiGnNvGJfbCzg9P0B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw65vJe5EREZfx0li14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyiqbo_VC75yY4uHy94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzKNs1ZzXL4V3S25fV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzrgW4RdmiebOlQRbZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyrUwEfA7sipMbHp_F4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVAOwRHoonudPcoF14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyW07IpER81J5kgbN14AaABAg","responsibility":"none","reasoning":"virtue","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxZzZQP_r1-vZ0P2Tt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
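A response like the one above can be loaded and indexed by comment ID before merging it back into the coding table. The sketch below is a minimal example, assuming the model always returns a JSON array with exactly these five keys per row; the allowed value sets are inferred from the samples shown here, not from the full codebook.

```python
import json

# Allowed values per coding dimension, inferred from the samples above
# (assumption — the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the rows by comment ID.

    Raises ValueError on malformed JSON, a missing id, or a value
    outside the allowed code sets for any dimension.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing comment id: {row!r}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a single (hypothetical) row:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"mixed","policy":"regulate","emotion":"mixed"}]')
coded = parse_llm_response(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Validating against a closed value set at parse time catches the common failure mode where the model invents a label outside the codebook, rather than letting it silently enter the coded dataset.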