Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Relax, it's just hype, at least so far. They know their Ponzi scheme is on the brink of collapse, so they are trying to scare people into investing in AI too... but not a single one of them mentions the problems. AI is not really AI: AI means artificial intelligence, so where is the intelligence? These are just LLMs, essentially prediction tools trained on trillions of answers. Because they have no intellect, they make too many mistakes to be used fully or to replace whole workers. Secondly, examples like self-driving taxis sound great, but we have drivers not just for driving but for taking responsibility when bad things happen; a few fatal car accidents and that self-driving company will drive itself to jail... Not to mention that, by all the studies, AI (or rather, LLMs) does not deliver more productivity, because real people spend extra time checking whether the model has made a mistake that could cost a company millions. An LLM is essentially a jack of all trades, master of none: great for low-level stuff, analysis, etc., but replacing all jobs? All of that will burn down long before it replaces all humans. That may change if they invent a new model that has real intellect, or at least actively improves by itself; for now they have to train the model, and it does not get better on its own...
Source: youtube · AI Jobs · 2026-02-26T23:2…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       virtue
Policy          none
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugz-TJPlSmVvAkbNEAB4AaABAg", "responsibility": "developer",  "reasoning": "virtue",           "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_UgxBEN3pttRTFcPuAbV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_UgxhGJaHbyCGTVw4n1B4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "ban",           "emotion": "fear"},
  {"id": "ytc_Ugwy0QLqdMkPT9TftNx4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgzY7pPHO_00JnQn-8Z4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "unclear",       "emotion": "mixed"},
  {"id": "ytc_UgwJhfB_1v8h_y93FYZ4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "regulate",      "emotion": "resignation"},
  {"id": "ytc_UgzzXWBTiCGHSktr2OJ4AaABAg", "responsibility": "developer",  "reasoning": "virtue",           "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_Ugxf0t6zSTi0vY9Kc8V4AaABAg", "responsibility": "developer",  "reasoning": "consequentialist", "policy": "liability",     "emotion": "approval"},
  {"id": "ytc_UgwjtS6AND8hcnFkkMN4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "indifference"},
  {"id": "ytc_UgxJrUMtbp5lEnB-AbV4AaABAg", "responsibility": "developer",  "reasoning": "virtue",           "policy": "none",          "emotion": "mixed"}
]
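The raw response above is a JSON array, one object per coded comment. A minimal sketch of how to inspect it programmatically, assuming the response text is available as a string (the variable name `raw` and the two-record excerpt here are illustrative, not part of the original pipeline):

```python
import json

# Hypothetical excerpt of a raw LLM response: a JSON array of coded comments.
raw = """[
  {"id": "ytc_Ugz-TJPlSmVvAkbNEAB4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxBEN3pttRTFcPuAbV4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

# Parse the array and index the records by comment id for O(1) lookup.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Pull the coded dimensions for one comment.
coded = by_id["ytc_Ugz-TJPlSmVvAkbNEAB4AaABAg"]
for dimension in ("responsibility", "reasoning", "policy", "emotion"):
    print(f"{dimension}: {coded[dimension]}")
```

For the first comment this prints the same four dimension values shown in the Coding Result table above (responsibility: developer, reasoning: virtue, policy: none, emotion: outrage).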