Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding by its comment ID.
Random samples:
- If I am having to review the code then I should have written the code to begin w… (ytc_UgxNnjHbP…)
- UBI is collective ownership of AI. It is an AI Dividend, a return on our data in… (ytr_Ugw02Ina6…)
- "F word AI" "proceeds to confuse my brain wether they are AI or Not since they l… (ytc_UgwgXVV40…)
- I would guess that the robot probably weighs in at about 500 pound, made of meta… (ytc_Ugx6otkp6…)
- Maybe in 100 years from now. I think the so called "Godfather of AI" is an idiot… (ytc_Ugxenq-8R…)
- I honestly hate my art style. Like sometimes I just don’t like my art for anythi… (ytc_UgwoqhQhC…)
- I think regulation is the key here : maybe it would be smart to require companie… (ytc_Ugy4bxjzf…)
- Well, he solved one of the problems with generative AI. But the main issue is th… (ytc_Ugz7dr3WE…)
Comment
I do not think the real danger lies in conversational/assistant/"creative" types of AI going mad or making bad mistakes. I think the true danger lies in military AI equipping drones and other combat vehicles, staying sane and making no mistakes other than a too-vague definition of the term "enemy", which can be a human operator's mistake, after all.
With the first AI drone capable of shooting down other aerial combat vehicles, including those with a human pilot, and thus capable of killing humans, we have taken the first step on a path whose destination we all know.
Platform: youtube
Topic: AI Harm Incident
Posted: 2025-09-11T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
{"id":"ytc_UgxJV9hbZhJadOQNl-V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzlMcTajKdR2YtXJfR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgydD-EbHQA-dR-u1Pt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw2YAif-E2WymI75_l4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxquFDrS3vRFaHoN7V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzQm-PY8NijX8owfbp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxy9DxGDei1SaiLAIh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyULKKf9nANMqXJwnh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzdrZhT9uOTz9jEfjJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx5N_8asQTDsLZYhNB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
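A batch response like the one above can be validated and indexed by comment ID before it is stored, so that malformed model output is caught early and lookups are O(1). The sketch below is a minimal illustration, not the project's actual pipeline: the allowed values per dimension are an assumption inferred from the visible output, and the real codebook may contain additional categories.

```python
import json

# ASSUMPTION: allowed values per dimension, inferred from the sample
# output shown above; the real codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}


def parse_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError if any row carries a value outside the codebook,
    so bad model output never reaches storage.
    """
    rows = json.loads(raw)
    by_id = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: invalid {dim}={row.get(dim)!r}"
                )
        by_id[row["id"]] = row
    return by_id
```

With the codings indexed this way, the "look up by comment ID" feature is a plain dictionary access, e.g. `parse_codings(raw)["ytc_Ugxy9DxGDei1SaiLAIh4AaABAg"]`.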