Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We'd need to decommodify housing and implement a UBI. If we don't people will literally starve, live on the streets, and go extinct. Personally, I'd do art because I feel like it. We can still have human driven content creation just because. And jobs in space will likely become more prominent over time. In the meantime, we'd have more ability to travel, or build a family. Even if AI could cook my food for me, I'd cook just because and have a bot cook for me occasionally. I think these things can coexist and I don't think we need to or even should tie a personal sense of selfworth to work per se. I root my sense of self worth in learning and exploring the existence I happen to exist within. But that's just me, I suppose. 😂😅 I value freedom to explore topics and learn instruments. But I'm not particularly worried about sharing what I create except with those close to me. But I'd like more time and less worry to focus on the idea rattling in my mind and the art I'd like to create. 😅 Money and time are huge limiting factors at times. But I read textbooks (that I rip from dokumen. 😂😅) and watch educational documentaries about different sciences and nature often.
youtube Cross-Cultural 2025-10-28T06:4…
Coding Result
Dimension        Value
Responsibility   government
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgyJEBcaNt7H53vSJ-p4AaABAg", "responsibility": "none",       "reasoning": "consequentialist", "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgwCFCL9rYF6VI2bIq54AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "none",          "emotion": "fear"},
  {"id": "ytc_UgzpiuiEb3jh1bAMAY14AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "unclear",       "emotion": "fear"},
  {"id": "ytc_UgymzrwoHGsf8iGGmA54AaABAg", "responsibility": "developer",  "reasoning": "deontological",    "policy": "liability",     "emotion": "outrage"},
  {"id": "ytc_UgzGbAbmLgUmhs9GFA14AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "none",          "emotion": "fear"},
  {"id": "ytc_UgxC3CfM3bTj_buH-JF4AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_Ugyf_XGzOCiIq6Cxewt4AaABAg", "responsibility": "company",    "reasoning": "virtue",           "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxJPUjjCT7x9vn7xVR4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgwHDRRph3B2G3oigAV4AaABAg", "responsibility": "ai_itself",  "reasoning": "mixed",            "policy": "none",          "emotion": "mixed"},
  {"id": "ytc_Ugzs3VJObHbGzah5C6N4AaABAg", "responsibility": "none",       "reasoning": "unclear",          "policy": "none",          "emotion": "mixed"}
]
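The raw response above is a JSON array in which each record codes one comment along four dimensions. A minimal sketch of how such output could be parsed and sanity-checked, assuming a closed vocabulary per dimension inferred only from the values visible in this batch (the `parse_codings` helper and the `ALLOWED` sets are illustrative, not part of the actual pipeline):

```python
import json

# Assumed per-dimension vocabularies, inferred from the records shown above.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear", "liability", "regulate", "industry_self"},
    "emotion": {"approval", "fear", "outrage", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes
    fall inside the assumed vocabulary for every dimension."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

raw = (
    '[{"id":"ytc_example1","responsibility":"government",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"},'
    '{"id":"ytc_example2","responsibility":"government",'
    '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]'
)
valid = parse_codings(raw)
# The second record is dropped: "ban" is outside the assumed policy vocabulary.
print([rec["id"] for rec in valid])
```

Dropping (rather than repairing) out-of-vocabulary records keeps the check simple; a production pipeline would more likely log and re-prompt for them.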