Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This scenario is unlikely to happen. AI systems are already making numerous mistakes, and operations frequently crash due to their significant power requirements, which are neither free nor cheap. The quality of information provided to these systems relies heavily on human input, and it is crucial to continuously supply fresh and updated information. As a result, decisions made by AI may not always be accurate, as every context presents a unique perspective rather than a straightforward answer. The main reason for dynamic pricing on groceries and everyday shopping items is to support the operation of massive AI facilities running 24/7, which is not sustainable. The AI bubble will likely burst before companies allow AI to replace human staff. Once that happens, companies may revert to employing more humans and using traditional methods such as pen and paper, old devices like faxes, or even speaking over the telephone with another person. Interestingly, the last major technological shift occurred in 1984 when, for the first time, every employee had access to their own PC. This was also the year when software like Lotus 1-2-3 was introduced to simplify tasks. Additionally, the film "Terminator" was released, highlighting themes that resonate with the potential challenges of our modern era. We should use technology as a tool rather than becoming obsessed with it.
Source: YouTube, "Viral AI Reaction", 2025-11-23T12:1…, ♥ 35
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          industry_self
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugyhz81-vY2IqXHkz_R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzr_UUybmygqhKzwuh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy8GzGyi2UJyVbDKCh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw6HHi1PSfFgD0CRGx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyQztNGLjPJjlTHEvB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx-rxMFuLiB-qmi3sp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxF9r7eihPaVJ63oCF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzNL79La_b7h29jk6d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugwna6Z20GYQyEIkD8d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwkzqkqHOZvIj-Bzsd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
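When inspecting raw model output like the array above, it helps to parse it and index the codings by comment id, checking that every entry carries all four coding dimensions. The sketch below is a minimal illustration (the `parse_codings` helper and the `REQUIRED_FIELDS` set are assumptions for this example, not part of the tool); it uses two entries copied from the response above.

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = '''[
 {"id":"ytc_Ugyhz81-vY2IqXHkz_R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzNL79La_b7h29jk6d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]'''

# Every coding must carry the comment id plus the four dimensions
# shown in the result table (responsibility, reasoning, policy, emotion).
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw_json: str) -> dict:
    """Parse a raw LLM coding response and index entries by comment id,
    raising ValueError if any entry is missing a required field."""
    codings = {}
    for entry in json.loads(raw_json):
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            raise ValueError(f"{entry.get('id', '?')}: missing {sorted(missing)}")
        codings[entry["id"]] = {k: v for k, v in entry.items() if k != "id"}
    return codings

codings = parse_codings(raw)
print(codings["ytc_UgzNL79La_b7h29jk6d4AaABAg"]["policy"])  # -> industry_self
```

Indexing by id makes it easy to cross-check a single comment's coding (as in the result table above) against the raw batch response it came from.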