Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytr_Ugz6tJQ8b…: "Oh really? Then show me. Show me that a fcking machine without any real emotions…"
- ytc_UgzqMYCE2…: "I think AI Art is great, I think it's fantastic that thousands if not millions o…"
- ytc_UgwB1LWpo…: "I decided to try the chat A.I app and did a Mandela catalogue survival one. No m…"
- ytc_UgzArkAon…: "People talking to chat bots as if they are real people is the surest sign that o…"
- ytc_UgyWs47zM…: "I still hate how people say ai is quickly becoming a super intelligents, but the…"
- ytc_Ugwy6nMZP…: "I’m all for #fuckai but why are people get it g upset abt this? It’s not like th…"
- ytc_UgwiPBdiy…: "So he’s the problem this is the face of why my search engine is overloaded. I ha…"
- ytc_UgwH_qoVB…: "Are you concerned about the rise of artificial intelligence? No but I'm very con…"
Comment
13:23 There's a famous example in computer vision where they trained a model to identify tanks from two different armies. It was a "friend or foe" discrimination project. The model got really good at it, but then it couldn't recognize more recent videos because what it was actually learning to look for was the difference in film grain from the two different video sources. It's entirely possible for an LLM to invent a kind of unintentional steganography by the same process, where the value it's getting out of its "thoughts" is wildly different from what a human would interpret.
Source: youtube · Video: AI Moral Status · 2025-10-30T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwlf2ytab2xv8i3UI14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzHssYICRCJiDWOUw54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyydrhSRJE9cP4zynx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzfDdBUi-n-XDLG-dJ4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzuhaUqQ1HoO_RSw4F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx5h2D0ojYDwU-jVdl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxleDp9-cBbgdtRphx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxKs8NHbh-p4JQlPol4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw_NX6TOSnAkRz14BJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpbUfw6_xbYVbkTY94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
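The raw response is a JSON array of per-comment codings keyed by comment ID. A minimal sketch of how such a response could be parsed and validated in Python; the allowed label sets below are inferred from the sample output and coding table on this page, and the full codebook may contain labels not seen here:

```python
import json

# Label sets per coding dimension, inferred from the sample response above
# (assumption: the real codebook may include additional labels).
ALLOWED = {
    "responsibility": {"developer", "government", "ai_itself", "user",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "outrage", "approval"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID,
    raising on any value outside the expected label set."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# One real row from the response above (the comment whose coding
# result table is shown on this page).
raw = ('[{"id":"ytc_UgxKs8NHbh-p4JQlPol4AaABAg",'
       '"responsibility":"developer","reasoning":"unclear",'
       '"policy":"unclear","emotion":"mixed"}]')
print(parse_codings(raw)["ytc_UgxKs8NHbh-p4JQlPol4AaABAg"]["emotion"])  # prints: mixed
```

Validating against a closed label set like this catches the common failure mode where the model invents an off-schema value, rather than silently storing it.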