Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The best discussion on these issues I have encountered to date. We need Alex Bores in Congress and continuing to show and lead us toward the best use of AI as a tool, not just an end in itself for the benefit of human society. Thanks Ezra for producing a clean, uncluttered, enlightening discussion. free of the usual hype and marketing which fogs the AI topic today. I hope in the end, after all these humans have been replaced, there will be new kinds of fulfilling work to occupy human minds and time. What will our role be in future society?
Source: youtube · Topic: AI Responsibility · Posted: 2026-04-22T02:5… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw3kZF7XTBhPMiN-IZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxMhRvpwOgHmZxRWmh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyuUj81voCoOwyo1kx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugx0GYNlUUcSrSqTYCd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxGsATK1RZyznOhT4d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzB_UC3BjaDx74Bhap4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxORgxyIRalS7qyQsJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxi6XyrjRaMBCE5PiZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyRA8-31QyLa_fd8Sx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyT7fe5VfHN7LSH8GV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
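Since the raw response is a JSON array with one object per coded comment, looking up the coding for a given comment ID amounts to parsing the array and indexing it by `id`. The sketch below shows one way this could be done, using a single row from the response above; the variable names are illustrative, not part of any actual tool.

```python
import json

# A minimal sketch, assuming the raw LLM response is a JSON array of
# per-comment coding objects keyed by "id" (as in the example above).
# Only one row is reproduced here for brevity.
raw_response = """[
  {"id": "ytc_Ugxi6XyrjRaMBCE5PiZ4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"}
]"""

# Index the codings by comment ID so any coded comment can be
# inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugxi6XyrjRaMBCE5PiZ4AaABAg"]
print(coding["reasoning"])  # virtue
print(coding["policy"])     # industry_self
```

This mirrors the "Coding Result" table shown above: each dimension (responsibility, reasoning, policy, emotion) is a field on the JSON object for the matching comment ID.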