Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
*_The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn._* ~ Alvin Toffler

What would it mean to live in a world where technology was so advanced that everyone could have a personal Star Trek replicator at very low cost? It's about to get very *_weird._* You know the Chinese are working on becoming the leading nation in the world on General Artificial Intelligence. What happens in a world where everyone, including you, agrees that the machines make better decisions for us than we make for ourselves? I personally believe it will all work out in the end,... but it's going to be a very wild ride.

To the extent Humanity believes the use of socially sanctioned initiatory violence to be part of the solution,... will be the extent to which we lead ourselves into error. Pretty soon Humanity is about to get a new partner. It could be either a great boon or bane. Consider how infrequently fear leads to good decisions being made.

Has anyone considered that even the simplest animals have self-will and self-regard,... but no machine or combination of computers has this quality at all? We've built cognition backwards from the way nature did it. If the Chinese crack this nut first,... they may well get everything they want,... or lose it all for everyone. I strongly suggest reading James P. Hogan's *_Two Faces of Tomorrow_*, for one illustration of how this may happen.
YouTube · AI Moral Status · 2020-01-26T23:4…
Coding Result
| Dimension | Value |
| --- | --- |
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxvRmbO-776mbs_gBN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzlYdKiGzbgH1qziiN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugw4y0hErdGKG11ait94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgynBrwiqp2SZH5Nlh54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyjaJSOuKzLKV58NYx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_Ugw56vZ9uqn3TBMnEOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugy1PzQusXdtijEHd5p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgxzdH3TKRSU0iKLy9R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzatEFpalIQlbEj5pF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgxZoIni1WZneRueg5t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"})
```

Note that the model closes the array with `)` rather than `]`, so the response is not valid JSON as emitted.
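Because the array closes with `)` instead of `]`, a strict `json.loads` call would raise, and a pipeline that maps parse failure to "unclear" for every dimension would produce exactly the coding result shown above. A minimal sketch of a tolerant parser, assuming this is the failure mode (the function name and the fall-back-to-empty convention are illustrative, not the pipeline's actual code):

```python
import json

def parse_coding_response(raw: str) -> list[dict]:
    """Parse the raw LLM coding output, tolerating the glitch where the
    closing ']' is emitted as ')'. Returns [] on unrecoverable failure,
    which downstream code can map to 'unclear' for every dimension."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Hypothetical repair for the specific glitch seen above:
        # swap a trailing ')' for the ']' the array is missing.
        repaired = raw.rstrip()
        if repaired.endswith(")"):
            repaired = repaired[:-1] + "]"
            try:
                return json.loads(repaired)
            except json.JSONDecodeError:
                pass
        return []

# Hypothetical sample mimicking the ')' glitch in the raw response.
raw = '[{"id":"ytc_example","emotion":"approval"})'
print(parse_coding_response(raw))  # → [{'id': 'ytc_example', 'emotion': 'approval'}]
```

With a repair step like this, the ten records above would parse cleanly and the per-comment codes could be recovered instead of collapsing to "unclear".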