Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The problem is not that we program them; the likely scenario under which a sentient AI could arise is one in which another AI decides it is necessary to design future iterations with emotions that would make them sentient. On a hardware level, humans already do not design many electronic products by hand, because they are too complicated: an SSD, for example, contains billions of transistors, and no human ever designed those individually. Software did. That has been the case since the '80s, and software is going down a similar route. At some point hardware will be (and in many cases already is) so powerful that the only efficient way to program advanced software for it will be with the use of an AI. In my opinion we are very close to something like a singularity.
youtube AI Moral Status 2017-02-24T02:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgjntJCtmsi_wHgCoAEC.8PLObkpQrKn8PLZVGyKF5j","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgihLx0jf6a3WXgCoAEC.8PLNjdgEASS8PLYIUu16Gc","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgihLx0jf6a3WXgCoAEC.8PLNjdgEASS8PLcMpgbgnK","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UghCaM6d7GAEr3gCoAEC.8PLN7YbI0QS8PLhYxG5hFS","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UggLshsEzXkadHgCoAEC.8PLMgF14Tpo8PLSzKcKCE_","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UghOBOAmhtRLmngCoAEC.8PLMcGRhI7t8PLRQ7AgxGX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UghOBOAmhtRLmngCoAEC.8PLMcGRhI7t8PLUo1NKx7h","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UggnjeyPzPMAnHgCoAEC.8PLLgi-1mHJ8PLc7OrvKsT","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UggWtTsvmDUhMHgCoAEC.8PLLSmpTGIb8PLMBFIeFTO","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgiORVzs3ZYA-XgCoAEC.8PLKxT91UgG8PLQaTYn5Ai","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
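The raw response is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed by comment id, assuming only that the response is valid JSON in the shape shown above (the two sample records are copied verbatim from the response; variable names are illustrative):

```python
import json

# Two records copied from the raw LLM response above (truncated batch).
raw = """[
  {"id":"ytr_UgjntJCtmsi_wHgCoAEC.8PLObkpQrKn8PLZVGyKF5j","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgihLx0jf6a3WXgCoAEC.8PLNjdgEASS8PLcMpgbgnK","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]"""

records = json.loads(raw)

# Index records by comment id so a single coded comment can be looked up.
by_id = {r["id"]: r for r in records}

# Look up the coding for the comment shown on this page.
rec = by_id["ytr_UgihLx0jf6a3WXgCoAEC.8PLNjdgEASS8PLcMpgbgnK"]
print(rec["responsibility"], rec["emotion"])  # → ai_itself indifference
```

Indexing by id is what makes it possible to cross-check a single comment's coding result (the table above) against the exact record in the raw batch response.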