Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Suppose humans rebel instead; then there's another problem (this rebellion takes place in a future where AI has already won and taken over our jobs). Since we are not playing along with them, and we have already failed at resisting, our only option is to run away and start over in a rural area. Most cultivated land is likely already taken, so we'd have to start with land they haven't touched yet, create a new currency (it would likely start with bartering), and adapt to the new life. We'd have to make our own food, since we couldn't even afford McDonald's by ourselves, maybe not even with government assistance, so we'd have to learn how to farm, hunt, and fish. Luckily, some people still know how to do that. We could also use other skills, though some skills are more valuable than others. However, we'd have trouble adapting, and failing to adapt would have grave consequences, since we'd now have to be self-reliant. Plus, our natural resources are not as abundant or as safe as before, and we'd need solutions for that. Suppose we manage to adapt and build our own civilization. Now we'd have to fight the owners' AI once they realize they've emptied out their resources through overconsumption, having taken no proper measures to keep a balance with the environment, much as they destroyed the balance of the economy by taking away people's livelihoods. And since our land is already cultivated, it would be cheaper for them to take it than to cultivate unused land, since they love cutting costs. This is unavoidable and will definitely happen sooner or later, unless they learn to keep the natural balance by managing their resources properly, unlike what they did with our economy.
youtube 2024-11-26T01:1…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgwtC5j-ao0kMgr2E7F4AaABAg.ABRDYtpwzYnACQkmfDosys", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzQ2DFD7pYN_GGTJKl4AaABAg.ABHfRGE_NqPABHiducYcgW", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwXXb_ZVfwB-JGHV194AaABAg.AAjF7Fm1ZSlAAxGNPfqmA0", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgyASDDghgWjmV_CAbZ4AaABAg.AAfhA8EZDIuABY3Pvvmtot", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgyASDDghgWjmV_CAbZ4AaABAg.AAfhA8EZDIuAEqe3W34uuP", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgyASDDghgWjmV_CAbZ4AaABAg.AAfhA8EZDIuAEql3FlRFq2", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_Ugz26ZNzzc1dB_tOhXF4AaABAg.AABOfqFqnLxAAebN5RQNLj", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugz26ZNzzc1dB_tOhXF4AaABAg.AABOfqFqnLxACjH_vo2ijH", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgyCEwJafVSHWf7f0Ul4AaABAg.AA0v8uvXlXfAAR9WIS3Jtf", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgxwfomRPpYOydoq_ql4AaABAg.ATzYbHPfTrtAU-Pq43fWjW", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
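To inspect how a single comment was coded, the raw response can be parsed as a JSON array and searched by comment id. The sketch below is a hypothetical helper (the function name `coding_for` is not part of the tool); it uses two records copied from the response above, abbreviated for readability.

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment.
# Two records copied verbatim from the response above (list abbreviated).
raw = '''[
  {"id": "ytr_UgzQ2DFD7pYN_GGTJKl4AaABAg.ABHfRGE_NqPABHiducYcgW",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwXXb_ZVfwB-JGHV194AaABAg.AAjF7Fm1ZSlAAxGNPfqmA0",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]'''

def coding_for(raw_response: str, comment_id: str):
    """Return the coded dimensions for one comment id, or None if absent."""
    records = json.loads(raw_response)
    return next((r for r in records if r["id"] == comment_id), None)

coded = coding_for(raw, "ytr_UgzQ2DFD7pYN_GGTJKl4AaABAg.ABHfRGE_NqPABHiducYcgW")
print(coded["emotion"])  # fear
```

Matching the record for the comment above against the coding-result table is a quick sanity check that the parsed values (responsibility, reasoning, policy, emotion) line up with what was stored.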