Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@Matt-dn5jc That's a fair point. Maybe I'm just talking out of my ass when I sa…" (ytr_UgwgnJPvm…)
- "It's more like the moderator problem?? Why you need to change the style because …" (ytc_UgwBnwshI…)
- "Our best hope for UBI is companies that are not Ai will lose money from consumer…" (ytc_UgxVp5-NW…)
- "Nice person, but living in his bubble. He can't talk out of his bubble, which ma…" (ytc_Ugw-0y4vI…)
- "Art ain't fun anymore because of all this, so to hell with it. I hope to reach t…" (ytc_Ugwtsw2ey…)
- "Well i was watching it on a robot....i guess....its not concious....so yeah no r…" (ytc_UgyVQSGJ4…)
- "Yall acting like one of the best early examples of advanced AI … halo’s Cortana …" (ytc_UgxvxSv8z…)
- "I have an idea what if we all post really bad art so a.i art is bad and then peo…" (ytc_UgwLLf_fm…)
Comment
Elon starts by talking about singularity which is so far from the current reality that it is hard to think of any serious reason to consider it except as a philosophical mind game. And yet he barely touches the actual threats that large language model based software such as chatGPT - which by definition is based on the foundation that excludes self-awareness and, in fact, awareness of anything beyond the model - poses. It's almost as if Elon would want everyone to focus on something that belongs to a science fiction novel and ignore the actually existing reality and let the corporations do whatever they want with no accountability.
Source: youtube · Category: AI Governance · Posted: 2023-06-29T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugytf32EvTIo_-GuCZN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwER-6RMBTniDZ9kRB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw-yUjy0f5xZKr4-rx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxIZu3iE7DNNTuyTul4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy9LB9rNZz_uZgfcjx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyoK31LiO35c3tyOPN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyrPNdPGIepHUqdS_F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw3xrRmRS1i3w63FlV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy0uzkg0J2e1Bz607F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw19XWpswvcNLXWobp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
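Each record in the raw response carries the comment ID plus one value per coding dimension. A minimal sketch of how such a response could be parsed and sanity-checked before use is shown below; the allowed values are inferred from the examples on this page (the full codebook may define more categories), and `validate_codings` is a hypothetical helper name, not part of any shown tool.

```python
import json

# Allowed values per coding dimension -- inferred from the records above;
# the actual codebook may include additional categories (an assumption).
ALLOWED = {
    "responsibility": {"company", "developer", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop anything that is not a dict with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Keep the record only if every dimension has an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]'
print(len(validate_codings(raw)))  # 1
```

Filtering rather than raising keeps one malformed record from discarding a whole batch, which matters when responses are coded ten comments at a time as above.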