Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
main problem with your thesis is, ai is far from cheap, they keep sinking in more and more resources building LLMs and theyre gonna keep sinking more and more, first come the production costs from stuff like materials for production, maintenance etc., then comes the elecetricty, sheer amount of energy required to power their shit requires literal power plants, which are also expensive to maintain and build and keep stripping resources into this self consuming abomination, and last is the information. it cannot be improved itself without external input, once its fed all we know and it tries to seek new things all it will see is either too abstract or just recycled nonsense that they keep feeding the masses as "information",

its already starting to inbreed, just look at the flood of ai slop that keep using the same art style, or sora videos that use the same camera effects, or the shitty crt footage and fake artifacts to masquarade as real, its starting to poison itself with bullshit we force others to consume, its destroying the planet and itself,

if they have any common sense they should use ai for one thing and one thing only as a "tool" not as a "mechanic" or a "doctor", also if it can replace everyone, why not immediately replace directors and ceos, after all theyre the ones who are the most expensive to keep
YouTube · Viral AI Reaction · 2025-11-25T00:5…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugyoe7t-w9TwT5noEKR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw-LNIdU9LJi-ILkrx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw0rZvTF49xf9fx6fp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugya9798QF7P-JV3sv94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgytAqkMXXZKblG59Cl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwE9BYwNiNaWa9RD2x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz4Qem09MGw2D0TTph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzinAYY4adG4KGj_Nd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugw8wcdY2OJnVpiLfGN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxntYLjzu3CokfdPgx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
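A minimal sketch of how a response like the one above could be inspected programmatically: parse the JSON array, look up one comment by its id, and check that every expected coding dimension is present. The `coding_for` helper and the `REQUIRED_DIMENSIONS` set are illustrative assumptions, not part of the coding tool; the sample payload is an excerpt of the raw response shown above.

```python
import json

# Excerpt of the raw LLM response above (two of the ten coded comments).
raw_response = """[
  {"id":"ytc_Ugw-LNIdU9LJi-ILkrx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgytAqkMXXZKblG59Cl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]"""

# The four coding dimensions shown in the result table (hypothetical name).
REQUIRED_DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def coding_for(comment_id, raw):
    """Return the coded dimensions for one comment id, validating shape."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            missing = REQUIRED_DIMENSIONS - entry.keys()
            if missing:
                raise ValueError(f"missing dimensions: {sorted(missing)}")
            return {k: entry[k] for k in sorted(REQUIRED_DIMENSIONS)}
    raise KeyError(comment_id)

print(coding_for("ytc_Ugw-LNIdU9LJi-ILkrx4AaABAg", raw_response))
```

Validating the shape before trusting the coding is the point: a model can return well-formed JSON that silently drops a dimension, and a lookup like this surfaces that as an error instead of a blank cell in the result table.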