Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The main problem found from replacing human laborers with A.I seems to be that people are scared to lose their income. To quell this problem, A.I would have to be implemented onto the work force en masse and then place in a system to give citizens income based on a different variable, whether they're married, have children etc. To give them a decent amount of money to live on. Where would this money come from? The nigh on unlimited production value that comes from millions of worker bots finding, converting, exporting and using resources for the betterment of man kind. A division in government would be established to maintain the robots and give out programs to fit a need. After the focus on hard labor and agriculture has been removed from our general worries as a collective, more thought can be invested in the education systems and government. So that basic human rights globally can be put into check and we can keep on progressing. On a side note: Careers would be generally scientific or creative, this is because everyone would have more room to think about science and culture. Artists across the spectrum (excuse the pun) would be paid more and so would scientists. Maybe the income system could relate to how much on average a person gives back to the community, country or world for that matter. So that real human feats of intelligence and creativity can be rewarded. You couldn't expect a robot to be a master chef or a theoretical physicist could you?
youtube 2016-06-09T11:1… ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          liability
Emotion         resignation
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugi0VpRQcJ-dlXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiNbJ86LvBRq3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugh1IEiVJXdTL3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UghLxJuzfSM6WngCoAEC","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UggsqieRRiXa53gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugg4GbRqmgOic3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UghxMbj5aY2YSHgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UghdOZ7MnsH1QngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjiENPkdW5qpHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UghmqNgxlJSGC3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
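The raw response above is a JSON array with one object per coded comment, each carrying the comment `id` plus one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response can be parsed and indexed by comment id, using two rows taken from the response above (the parsing code itself is an illustration, not part of the original tool):

```python
import json

# Raw batch response from the model: one object per coded comment,
# with the comment id plus one value per coding dimension.
raw = """[
  {"id": "ytc_Ugg4GbRqmgOic3gCoAEC", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability",
   "emotion": "resignation"},
  {"id": "ytc_UggsqieRRiXa53gCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate",
   "emotion": "fear"}
]"""

# Index the codings by comment id for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up the coding shown in the table above by its comment id.
coded = codings["ytc_Ugg4GbRqmgOic3gCoAEC"]
print(coded["emotion"])  # resignation
```

Indexing by `id` is what lets each comment's coded dimensions be displayed next to the comment itself, as in the table above.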