Raw LLM Responses
Inspect the exact model output for any coded comment.
Any coded comment can be looked up directly by its comment ID.
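Lookup is just a scan for a matching `id` field in the stored coding output. Below is a minimal sketch, assuming the coded records are kept as one JSON object per line in a hypothetical `coded_comments.jsonl` file; the file name and storage layout are assumptions for illustration, not the tool's actual backend.

```python
import json

def lookup_raw_response(comment_id: str, path: str = "coded_comments.jsonl") -> dict | None:
    """Return the coded record for one comment ID, or None if it is absent.

    Assumes each line of the (hypothetical) file is a single JSON object
    shaped like the raw LLM response entries shown at the bottom of this page.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: fetch the record behind the Coding Result table below.
print(lookup_raw_response("ytc_Ugx9u1McFkE2RE54CnN4AaABAg"))
```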
Random samples (truncated previews, with comment IDs):

- "PSA(?): For some reason I keep seeing AI-antis in the comments... like why? You'…" (ytc_Ugw_ZIaKK…)
- "imagine raising a new life with all the crap of the internet and hope it end up …" (ytc_UgysRPOj9…)
- "@MikeHunt-zy3cn Tracing shouldn't be real art if it was tracing a full on art p…" (ytr_Ugz7msu6k…)
- "It’s cool but what if the AI is wrong on some critical matters? Can we take what…" (ytc_UgxjzSr2P…)
- "We haven’t been hiring for years, but not because of AI. We just started our lar…" (ytc_Ugxbj8OaH…)
- "It's kind of arrogant to claim that some human-like computer with no living body…" (ytc_UgxDZ-0Pw…)
- "Nice, so does this mean woketards can focus the hate that was once aimed at vide…" (ytc_UgxDvkkJw…)
- "Ungodly amounts of fake news. CNN just trying to scare uninformed people who don…" (ytc_UgxWHkptb…)
Comment
46:10 but, please hank, language models _are_ just fancy auto complete. This is exactly how they are built. This anthropomorphism is giving people dangerously wrong ideas about language models and I implore you to not do that so carelessly, as a science communicator. Anthropics "research" on language models is, essentially, bullshit. You can take no text an llm produces as really representing some internal "thought" process. Please. As a machine learning researcher and science communicator, this is intensely frustrating.
youtube · AI Moral Status · 2025-11-02T05:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwIzUhGmWf2FMjV0cd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRLbkND3Vr3UMZDh14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzbN6FoKfu1ifw7Mwp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxTdMQru5F-usPjcFh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy9ab7UrCz5EzfJ1914AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxunJljjY8zrwtEEbh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwV9TOvfdKivyn89GB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxS_ubIg4x8Zd08SSB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyM6Yu05uq-M12WpnF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx9u1McFkE2RE54CnN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
```