Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgzDLobLk…`: "Im going to say something that's probably not gonna please you. But AI is a basi…"
- `ytc_UgxVPeArM…`: "Meh, no truck drivers and there won't be in the future according to the formal e…"
- `rdc_mul5ygt`: "54% of Americans read at below a 6th grade level (source: Gallup analysis 2022).…"
- `ytc_UgxLzQ6OY…`: "CEOs of AI companies warning about AI replacing millions of jobs aren't actually…"
- `rdc_dwuuajb`: "You are missing the point. The point being University is not supposed to do mili…"
- `ytc_Ugx4Yx8Ac…`: "There is only one path Humanity can take to avoid this dystopian future: #Di…"
- `rdc_degfjjj`: "What Democrat has recently said anything even vaguely familiar to \"All that stuf…"
- `ytc_Ugych_K1B…`: "Hank, I'm glad to see you are finally coming around on this issue even if you sp…"
Comment

> In a world where using AI and automation can help to eliminate the need of people doing things they hate, the technology is used to undercut one of the most enjoyable and fundamentally human things ever.
> The idea behind such an action is so evil that it's perfect for what humans usually do with their "bright" ideas, which is bringing us one step closer to a post apocalyptic nightmare.
> This is why we can't have nice things.

Source: youtube · Topic: AI Responsibility · Posted: 2023-10-09T00:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgynTjdjoNhC9sZ9bK14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxz-JueUoSf7W7IiQ14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxhiB42UDQRC7ol8Wx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgxDo-zNpj-Raed7NQd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDq1-P0EVKli78TqN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyr6DebvZ4PjgntIPh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyxwGGQl3xt5dn4vkR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgypN1ToxA5k-5dsY_V4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugze5leXESo35JMXaxN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzRDaf6G6HVQf-Xz0p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
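A minimal sketch of how a raw response like the one above can be parsed and sanity-checked before it is stored. The allowed values in `SCHEMA` are inferred only from the records shown on this page (the real codebook may define more categories), and the function name `validate_coded_batch` is hypothetical.

```python
import json

# Allowed values per coding dimension. Assumption: inferred from the
# sample records above; the actual codebook may include more labels.
SCHEMA = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "ban"},
    "emotion": {"approval", "disapproval", "mixed", "indifference",
                "fear", "outrage", "resignation"},
}

def validate_coded_batch(raw: str) -> list:
    """Parse a raw LLM response and check every record against SCHEMA.

    Raises ValueError on malformed JSON, a missing 'id', or a value
    outside the allowed categories for a dimension.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for i, rec in enumerate(records):
        if "id" not in rec:
            raise ValueError(f"record {i}: missing 'id'")
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(
                    f"record {i} ({rec['id']}): "
                    f"{dim}={value!r} not in {sorted(allowed)}")
    return records

# Usage with a single (shortened, hypothetical) record:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"virtue","policy":"ban","emotion":"outrage"}]')
coded = validate_coded_batch(raw)
print(len(coded))  # 1
```

Validating at ingest time catches the common failure mode of batch coding, where the model invents a label outside the codebook, before it silently pollutes the coded dataset.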