Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "what do you think of artists saying not to give ai art attention, engagement, or…" (ytc_UgylJuJ56…)
- "I think emotions and intelligence are two different things. I think emotions can…" (ytc_Ugxk_3AUF…)
- "The world is run by power and money. Those with either power or money will not s…" (ytc_UgzoBDy2b…)
- "I think I get the fundamental change AI and robotics makes to the capitalist sys…" (ytc_UgxFxjQD9…)
- "@CorporateShill66 and they keep pushing the date out. Ai is just a big grift by …" (ytr_UgyKGzrz9…)
- "OpenBrian, can AI actually fix global warming? I don’t care how clever it is if …" (ytc_UgzFDQlgB…)
- "Kids at 5 years old are giving tablets instead of books or toys. The plan is alr…" (ytc_Ugw5I-7HU…)
- "This was super interesting. Thanks for sharing! However, one thing. There was pl…" (ytc_Ugw-1k3vZ…)
Comment
I’m sorry but I’m so confused. He’s warning us about ai… but in the same breath he’s essentially responsible for all ai/ ai-related developments, like a whole shit ton of stuff that only exists now because AI exists. Right?
I mean seriously… I’m asking this question because I’m having a hard time believing that I see the full picture correctly, here. If I am, then doesn’t this look and sound a whole hell of a lot like the prologue to any one of the countless sci-fi horror flicks we all grew up watching? The ones wherein we are immediately introduced to some genius scientist who offers up an ominous warning about the very thing he’ll inevitably and inexplicably invent, himself. A warning that, once it does exist, it’ll inevitably lead to the end of the world as we know it.
I know life imitates art, but come on. How is it even possible to walk right into this proposed shit storm after flicks like “The Matrix”, “I, Robot”, etc? 🤨
youtube · AI Governance · 2025-06-17T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzwODDE0SOHqAcapOl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"confusion"},
  {"id":"ytc_UgxmQ6U3kFq3wog7sb94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzwFie7uW70uYQXc4J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzVXQ7bNBCy2F9daqF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxUgdjkm_2kzNyuYuB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugy7U8DsoKpjcuzUEeZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzaCyrGouZWWM-HK914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzFB66FwQ8bOpXDYeF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzK8PuaoIyBSUEY8O14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx0jG-HG6ylk3u5aGN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
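A raw response like the one above can be turned into the per-comment coding table with a small parsing step. The sketch below is illustrative, not the tool's actual code: it assumes each response is a plain JSON array of objects with the four dimensions plus an `id`, and the names `parse_codings` and `REQUIRED_KEYS` are made up for this example.

```python
import json

# Two records excerpted verbatim from the raw LLM response shown above.
raw = '''[
 {"id":"ytc_UgzwODDE0SOHqAcapOl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"confusion"},
 {"id":"ytc_UgzFB66FwQ8bOpXDYeF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]'''

# Keys every coding record must carry (assumed schema, inferred from the dump).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codings(text: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed coding records."""
    records = json.loads(text)
    return [r for r in records
            if isinstance(r, dict) and REQUIRED_KEYS <= r.keys()]


codings = parse_codings(raw)
# Index by comment ID so a single comment can be looked up, as the page does.
by_id = {r["id"]: r for r in codings}
print(by_id["ytc_UgzwODDE0SOHqAcapOl4AaABAg"]["responsibility"])  # developer
```

Filtering on `REQUIRED_KEYS` rather than trusting the model means a malformed or partially truncated record is dropped instead of crashing the lookup.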