Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Since we are conscious and we are intelligent and we call these models that we train on human data Artificial intelligence. Why not call it Artificial Consciousness? I don't know what that would entail since I don't really care about the "asteroid" as you seem to imply AI is for us. Since I believe everything we call AI should just be a tool and it shouldn't be our goal to make a AI loose the "Artificial" part from Artificial Consciousness more for moral reasons as well as as everyone likes to point out, ai might want to end humanity if the very worst scenario in every Sci-Fi novel or movie comes true. Vis-à-vis Skynet. But I'm probably wrong. And so are you most likely. Because neither of us actually has the slightest idea of how to build an AI like GPT let alone something many times (like at least 5 times) more complex like an AGI. And since we don't know how to make one. It doesn't make either of us qualified to talk about it. But take the discovery of the atom and nuclear energy. Some scientist thought that if you exploded a nuke that the explosion would keep going and destroy the whole world. Luckily whoever said that was wrong. I don't want to know what the future brings. I'm just hoping that ill be accepting of it.
youtube AI Moral Status 2023-08-20T21:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          liability
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugypjv3bQ2Tz6_WpGpl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxz8U1BSVaQ54S54eB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyiOKIlotGd3U-H54N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw9zvAR2Zt7r1nO_4d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwadnjdnaiJXIk7-Zx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw6NBuAW8DASm5TgeJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwSR5nKfd2aD4v_3uR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzvVkHeFCMHKtXmkZN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxYh7MjLz_uAMCxhit4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzXX08fCS8uf74lRvl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
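A minimal sketch of how a batched response like the one above can be parsed back into per-comment codes. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come directly from the response; the lookup helper itself is illustrative, not part of the pipeline shown here.

```python
import json

# Raw LLM response: a JSON array with one record per coded comment
# (abridged to two records from the batch above for illustration).
raw_response = """[
  {"id":"ytc_Ugw9zvAR2Zt7r1nO_4d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgwSR5nKfd2aD4v_3uR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

records = json.loads(raw_response)

# Index the batch by comment id so one comment's codes can be looked up,
# e.g. to fill the Dimension/Value table shown above.
by_id = {record["id"]: record for record in records}

codes = by_id["ytc_Ugw9zvAR2Zt7r1nO_4d4AaABAg"]
print(codes["responsibility"], codes["reasoning"], codes["policy"], codes["emotion"])
# → developer deontological liability resignation
```

Indexing by `id` rather than by list position keeps the lookup stable even if the model returns the records in a different order than the comments were submitted.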