Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "We only really need UBI in a capitalist system, which AI is in conflict with. Wh…" (ytc_UgxG3ziCi…)
- "@jaysonp9426 so then actors will go from artists practicing a craft to people li…" (ytr_UgwQq7MC5…)
- "Considering the majority (almost the entirety) of inflation in the last few year…" (ytc_UgwWnlZSS…)
- "I have been pretty resistant to AI, and I think all the crap shoved at me is bot…" (ytc_Ugzh2fxLG…)
- "Ai is not art ai takes things from other people’s art and mashes it together to …" (ytc_UgydAgonc…)
- "Old AI images weirdly had their little charm. It was so obviously AI that it did…" (ytr_UgxpFXRD0…)
- "Embrace the AI, there’s nothing we can do to stop or change it. AI will do its o…" (ytc_Ugz6_xgAT…)
- "The problem is, that they always wonder why they don't lead in any digital marke…" (ytc_UgzUQsLlZ…)
Comment
I think human knowledge may stagnate with large language model AI rather than evolve, or evolve faster.
1) AI answers will be 'good enough' that people won't look further. The AI will reproduce the "de-facto" general consensus common view answer to questions, new thoughts / ideas will struggle to surface.
2) The incentives to even find/produce new ideas will wane as AI will 'steal the thunder'. And accreditation and reward for those ideas won't go to the originators any more.
But finally I think whatever AI does it is really wonderful and great (just in case AI becomes sentient and takes over I don't like to be too negative about it)
youtube · AI Responsibility · 2026-04-11T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
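The table above is a straightforward rendering of one coded record plus its coding timestamp. A minimal sketch of that rendering, assuming a record with the same field names as the raw LLM response (the `render_coding_table` helper and the example record values are hypothetical):

```python
from datetime import datetime, timezone

# Hypothetical record, using the field names seen in the raw LLM response.
record = {
    "id": "ytc_ExampleId",
    "responsibility": "unclear",
    "reasoning": "unclear",
    "policy": "unclear",
    "emotion": "unclear",
}

def render_coding_table(rec, coded_at):
    """Render one coded comment as a two-column markdown table."""
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),  # timestamp comes from the coding run, not the record
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)

print(render_coding_table(record, datetime.now(timezone.utc).isoformat()))
```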
Raw LLM Response
[{"id":"ytc_UgycLcKfDPJ0OcNaiit4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzoBBHI5EhYG6_6fGt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy2dnUF9auUWTAWOrh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxMjS5l0W0bz6wBsDd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyLpQKUwMhVnnHXVQl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyG5JOx9UUMJB7kW7B4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyKwbdP3imPRYrJbkl4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxMKa3SyJF0IMpHR954AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwMYEJ5iEj2PeOUxOV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz9yzUlH1isEinOJHd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"}]
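To support lookup by comment ID, the raw response can be parsed and indexed. A minimal sketch, assuming the JSON array shape above; the allowed category values are only those observed in this response, not the full codebook, and invalid values fall back to "unclear" rather than failing:

```python
import json

# Category values observed in the response above; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "user", "government", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "unclear"},
}

def index_codings(raw_json):
    """Parse the model output and index records by comment ID, coercing
    any out-of-codebook value to 'unclear'."""
    by_id = {}
    for rec in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                rec[dim] = "unclear"
        by_id[rec["id"]] = rec
    return by_id

# Illustrative payload with a fabricated short ID, not one of the real records.
raw = '[{"id":"ytc_A","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]'
codings = index_codings(raw)
print(codings["ytc_A"]["emotion"])  # outrage
```

Coercing rather than raising keeps a single malformed record from sinking the whole batch, which matches how the "unclear" values surface in the coding-result table.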