Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I don't know what they're complaining about. You're just posting your own art, t…
ytc_UgwYjhldz…
1:11:30 This situation makes humans the competition for resources that the AI sy…
ytc_UgwnHk75f…
People are really stupid enough to try using AI for military purposes? Anyone e…
ytc_UgzbJVDWL…
Not all of us, just a significant portion of us.
We have collecti…
rdc_eh6d0cq
There is no way I will ever use a robot to come inside my house. No way in hell.…
ytc_Ugw_QVe6w…
55:00 yeah because a government or private company having all the rights to ai w…
ytc_Ugxxtc6cL…
The biggest risk factor here is private companies owning all these ai clusters. …
ytc_Ugxfs4Kej…
These large language models pass the Turing Test that over half of all humans co…
ytr_UgyW0E_f8…
Comment
AI hallucinates because it's trained on us via Reddit, Facebook, etc., and nobody ever admits "I don't know", so AI assumes that there always must be an answer, so when it doesn't know, it makes it up to fulfill the patterns it was trained on.
youtube
AI Governance
2026-03-21T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugyp0I2usYT0GC7x5xV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwyA7v3P_2lJEPWw1F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwDEgCu0GlOmHrfs2p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwBV7ze01zSJzgWsb54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyxysSxSJVJGKhYbWp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugws0vRXy_AXojH1WTF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyKi9_vtgIPKfpzgZx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxlw2FFBiUu0ygxYcB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxp_unGNnky7Th3GlJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugylbebws2UGC-q0K014AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
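The coding-result view above is derived from a raw response like this one: the model returns a JSON array with one record per comment ID, and each record carries the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and indexed by comment ID follows; the function name `index_codes` and the validation step are illustrative assumptions, not part of the tool itself.

```python
import json

# Hypothetical raw LLM batch response: a JSON array of per-comment codes,
# shaped like the "Raw LLM Response" block above (IDs taken from it).
raw_response = """
[
  {"id": "ytc_Ugyp0I2usYT0GC7x5xV4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwBV7ze01zSJzgWsb54AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse a batch response and index codes by comment ID,
    skipping any record that is missing a required dimension."""
    codes = {}
    for record in json.loads(raw):
        if all(dim in record for dim in DIMENSIONS):
            codes[record["id"]] = {dim: record[dim] for dim in DIMENSIONS}
    return codes

codes = index_codes(raw_response)
print(codes["ytc_Ugyp0I2usYT0GC7x5xV4AaABAg"]["emotion"])  # indifference
```

Indexing by ID is what makes the "Look up by comment ID" view possible: a single parse of the batch response supports constant-time lookup of any coded comment.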