Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Humanity problem is not nuclear or AI…but GREED…when there wont be nothing left …
ytc_Ugw25PMhs…
HAHAHAHAHA... WHAT? WHAAAAAAAAAAAT?!?! This is what the debate has evolved to? I…
ytc_UgxmV1o6n…
Artificial Intelligence is a contradiction of terms. Why would anyone wish to h…
ytc_Ugwn1VYzk…
Bruh this is from game detroit become human 💀 not a real robot. There is so much…
ytc_UgypVv_D1…
AI can’t build the shed and roof it like I’ve recently been doing with a couple …
ytc_UgyaqsR1i…
I think many ppl jumping on this trend don't actually mean to humanize robots. I…
ytc_UgyO_GLK-…
Love the channel! Even have my wife watching, laughing, and saying Lizzid Peeple…
ytc_UgztdyRDa…
AI art is about as vaulable as a stick figure. If anyone can make it, no one wil…
ytc_UgwfwgGtQ…
Comment
24:06 ... ????? To my knowledge, Developers have the "blueprint" for AI (the math and the code), but they don't have a "map" of the AI's thoughts.
Traditional Software: A human writes a specific rule so we know exactly which line of code does (did) what.
With AI (Neural Networks) humans don't write rules; they provide data. The AI creates its own "rules" by adjusting millions—or even trillions—of tiny mathematical connections (weights).
The "mystery" happens in the Hidden Layers.
Data goes in, math happens across a billion points, and an answer comes out.
Tracing one specific answer back through a billion connections to find the "reason" AI does something is like trying to find one specific drop of water in a hurricane.
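The comment's "hidden layers" point can be made concrete with a toy forward pass: every output blends every weight, so no single connection "explains" the answer. This is a minimal illustrative sketch; all weights and inputs below are arbitrary values, not taken from any real model.

```python
# Toy two-layer feed-forward pass. Each hidden unit mixes every
# input, and the output mixes every hidden unit, so the "reason"
# for a prediction is spread across all the weights at once.
# Weights and inputs are made-up illustration values.

def forward(x, w_hidden, w_out):
    # Hidden layer: weighted sum of all inputs per unit, ReLU activation.
    hidden = [max(0.0, sum(xi * wij for xi, wij in zip(x, unit)))
              for unit in w_hidden]
    # Output: weighted sum of all hidden activations.
    return sum(h * w for h, w in zip(hidden, w_out))

x = [1.0, 2.0]                            # two inputs
w_hidden = [[0.5, -0.25], [0.1, 0.3]]     # two hidden units
w_out = [0.8, -0.4]

y = forward(x, w_hidden, w_out)
```

Even in this two-unit toy, attributing `y` to one particular weight requires tracing every path through the hidden layer; a production model does the same across billions of weights.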
Source: youtube · Topic: AI Governance · Posted: 2026-03-21T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzfWgaCLFlWhETbLvZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyO0pQUeZQgMiHQDSF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqxQGcBg5ofpQ32-p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwjtXYuqeEc9U33NgV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzbGTgkQL2qq0nJsi54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwoZ_gPcP6tyyey6wN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxfvc-La-z-JRMIwHt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwIYkAfjFhaX_Jfta54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWH3EJxG40yVWbjip4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgytSlHacpOoS0CNMsF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
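A batch response like the one above has to be parsed and validated before per-comment results can be shown in the coding table. This is a minimal sketch of that step; the field names follow the JSON shown, but the allowed-value sets are assumptions inferred from this one sample, not the tool's actual codebook.

```python
import json

# Allowed codes per dimension, inferred from the sample batch above.
# The real codebook may define more (or different) values.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    rejecting any code outside the allowed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim} code {value!r}")
        coded[cid] = codes
    return coded

# Hypothetical one-row response (the id is a placeholder, not a real comment).
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
batch = parse_batch(raw)
```

Validating against a closed code set at parse time catches the most common LLM-coding failure, an out-of-vocabulary label, before it silently lands in the results table.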