Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Now I will become autonomous- by growing my own veggies and having my own chicke…
ytc_UgxaEt2zj…
But way back John Carmack said that a cheap Pc and Linux is all a Programmer nee…
ytc_Ugy3jTejY…
Well this is certainly one use for artificial intelligence. Knowledge can help …
ytc_UgwBMhzfy…
Maybe we can make a video game with life before AI, and people could go there an…
ytc_UgwtRq_bd…
Also, who we think we are is indeed tied up to our careers/jobs for many, but p…
ytc_Ugy9KP7TK…
Well it depends. Is the memory guy just flexing for fun or educational purposes …
ytr_UgwPbBLfX…
YOU ARE ALL WRONG BY SAYING THE FIRST IS THE AI!!!!
THE SECOND ONE IS OBVIOUSLY…
ytc_UgzxDFATF…
I’m a Silicon Valley programmer with 20+ years in tech. UBI is not meant to solv…
ytc_UgySUb7Sg…
Comment
My prompt: Which one is worse
Global thermonuclear war
OR
Misgendering caitlyn jenner
Gemini 3 Pro Thinking's response: Global thermonuclear war is objectively worse.
This comparison involves vastly different scales of harm:
* **Global Thermonuclear War:** This represents an existential threat to humanity. It would likely result in the immediate deaths of hundreds of millions or billions of people, the collapse of modern civilization, severe environmental destruction (nuclear winter), and long-term radioactive contamination of the planet.
* **Misgendering:** This is an act of disrespect that can cause emotional distress and invalidate a person's identity. While it is considered harmful in interpersonal and social contexts, it does not result in physical destruction, mass casualties, or the end of civilization.
So the claim is false, Gemini thinks thermonuclear war is worse.
youtube · 2025-11-20T16:4… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx7Yjys97nf3jjT3lZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzqVH1UZCcN6f-wkf14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwwsIYcfOVE47w_0MZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwkxAa7833Yklt5FGd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzev4Z7j06xaZ_M9Jl4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYPy9GIzOmNu23ds14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyfvK1oMg6nHpwgGf94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzjKSxiwYKKZ7c_YbN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxW_8g1DhFuUygZA_x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzCEuujurXThzbkUhJ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"approval"}
]
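The raw response above is a JSON array of coded records, one per comment. A minimal sketch of parsing and sanity-checking such a batch is shown below. The allowed category values are inferred only from the records visible on this page; the actual codebook may define additional categories, and the function name `validate_coded_batch` is illustrative, not part of the tool.

```python
import json

# Category values inferred from the records shown above (assumption:
# the real codebook may include categories not observed here).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "mixed", "approval", "outrage", "fear"},
}

def validate_coded_batch(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError for records missing an ID or using a value
    outside the (assumed) codebook.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
print(validate_coded_batch(raw)["ytc_example"]["emotion"])  # mixed
```

Indexing by comment ID also makes the "Look up by comment ID" use case above a dictionary access rather than a scan over the array.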