Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Dont make them make art make the ai pay taxes exactly what @touchtone. Is saying…" (`ytc_Ugx0wvxky…`)
- "This video (to me anyway) looks AI generated. I'm noticing a lot of videos that…" (`ytc_UgwjftvnV…`)
- "If you are modeling AI after human intelligence you are going to get all the thi…" (`ytc_UgzD1cLO7…`)
- ">Zoe Hitzig said her concerns stemmed from the possible psychosocial impacts …" (`rdc_o58wrio`)
- "> *It TAKES pieces of art that already exists and puts it together* Literally fa…" (`ytr_UgxirvIvJ…`)
- "Yer but getting AGI to make your coffee is one thing, but what if you want it to…" (`ytc_UgxvSb0iS…`)
- "What we need to do is tax AI higher than a human salary costs so they hire peopl…" (`ytr_UgzlwFUuv…`)
- "Lily sounds mentally ill, and it sounds like some of her social anxiety regardin…" (`ytc_UgzZfd-Jb…`)
Comment
Using AI incurs a knowledge debt to the user like using Chegg for your homework. Yes, it can speed up an individual assignment but in doing so, the user learns less limiting their ability to do the next assignment. At some point this knowledge debt is so great that the benefits of the tool are detrimental to the actual goal at hand.
For actionable uses of AI, they are prone to failures of the system modeling itself. Think video game glitches. The interplay of various factors lead to a mathematical asymptote that would lead the model to perform functions are positive in their fictional modeled world (token generation) which would lead to negative real world outcomes
Platform: youtube · Video: AI Responsibility · Timestamp: 2025-09-30T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzQk-muSb2r5fd_5zx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwlMorIq5orIC5_Rll4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyrNAW8hqMzMS9DRJt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzpc597Xf8mUgb4gMR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwlfGLue_u8PwO5u9l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzdW_X4piqsDgEy0Sx4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzpTTvpQAzTK_otGlJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyo8mZns9RZS7APme14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgybxVrbQlagUyLgjSV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyqismwU-rgY2NlkpB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```