Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UggiLxFpt… : "What if a robot identifies himself (herself?) to be of the opposite sex from tha…"
- ytc_UgwvZcxlq… : "At some point the differentiation between humans and machines will be roughly th…"
- ytc_UgzF_qdRe… : "It's coming faster than most realize, the smart phone has eliminated millions of…"
- ytc_Ugwc3HRdc… : "Him :-What is happy look like? / The AI - 😤 / Him - what is sad look like? / The AI…"
- ytc_Ugz7BAEaH… : "AI art is more of an art tool than anything, but it's a tool that automates the …"
- ytc_UgyNyJZDp… : "Congratulations on the collab! Cant wait to have you host AI DRIVR once we get F…"
- ytc_UgzEsYZ3h… : "I am disabled. Most of the time, I can't create art. Sometimes, the need to make…"
- ytc_UgwS61dyJ… : "what are you getting out of being a dick to AI? The AI has programmed guardrails…"
Comment

> I mean, agreed....but from a man making a fleet of AI connected EV's, brain microchips & rocketships to Mars? 🤷 Color me confused about where he actually stands. I understand that he's speaking on "hyper intelligent" AI & I'll be the first to admit idk what the difference between that & what he dabbles in is. But this interview is in the 🤔 part of my brain presently. I have loads of respect for Elon Musk, but if I've learned anything in the last 4ish years, it's to question EVERYTHING. I find his spot on perspective about AI to @ least potentially not entirely align with his many presently active projects.....

Platform: youtube · Topic: AI Governance · Posted: 2023-04-18T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzXMu57JeeayEzcSkB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyxhUeht46iRzz_WD94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzslWNQqsycGXHQSyJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyqdKoChngAkOPx5Gl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugy09EU5moxy0fpGYr94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_Ugw6_mKk-2ZHhV6qUN94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxfMiDCU2PBxlGXmsd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwuarHlg0mimytWy5Z4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwN1uNeLh6sJwetszJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzvSq6fcfiBAwXWL_t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
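The "Look up by comment ID" step above amounts to parsing the batched JSON response and filtering by `id`. A minimal sketch in Python, assuming the model returns a JSON array of per-comment records as shown; the function name `lookup_coding` and the two-record sample are illustrative, not part of the actual pipeline:

```python
import json

# Hypothetical raw LLM response for one coding batch (truncated to two
# records for brevity; the real batch above contains ten).
raw_response = """
[
  {"id": "ytc_UgyxhUeht46iRzz_WD94AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzXMu57JeeayEzcSkB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a batch response and return the coding for one comment ID,
    or None if the model did not emit a record for that comment."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgyxhUeht46iRzz_WD94AaABAg")
print(coding["responsibility"], coding["policy"])  # developer unclear
```

Returning `None` for a missing ID (rather than raising) makes it easy to count comments the model silently skipped in a batch.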