Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Here is the thing. An AI needs to learn real emotions before sentience. Stuff like joy, anger, fear, love, empathy, etc. A sentient being without emotion and being purely logical will not go too well. Because the right thing to do isn't always the logical thing to do. The biggest issue is that we will never create a nice and empathetic AI. Because Governments and big corporations are going to corrupt the AI for power, which will then lead to Terminator. So AI is perfectly fine. It's the humans in power that will create Terminator. In other words, it's not the possible creation of a Soul Killer we should worry about. It's the Arasaka Corperation behind it that turns it evil.
youtube AI Moral Status 2023-11-02T12:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  government
Reasoning       virtue
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyfW7RRRkCO5ewEo4d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxFruh7jNQSIzZUVQt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz-C_13xaashLHZQe14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzXjTgkSNV1-RdL48R4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxFwX2EdS93mexyh-54AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyng-ki4qosUCCNg0V4AaABAg", "responsibility": "government", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgztMfTOt2vLHD6lyTd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyD_BDqXF3ienVMjlZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugya-ZxUKYW9C2lwwSl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzo3Wln96TU9jk1cAR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
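A raw response like the one above can be turned into per-comment coding records with a few lines of Python. The sketch below is illustrative, not the pipeline's actual code: it assumes the response is a JSON array in which each record carries an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`), and it silently drops any record missing a dimension. The `parse_codings` helper name is our own.

```python
import json

# The four coding dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw_response: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}}.

    Records missing any expected dimension are dropped rather than
    partially coded (a hypothetical policy chosen for this sketch).
    """
    out = {}
    for record in json.loads(raw_response):
        if "id" in record and all(dim in record for dim in DIMENSIONS):
            out[record["id"]] = {dim: record[dim] for dim in DIMENSIONS}
    return out

# One record excerpted from the raw response shown above.
raw = ('[{"id":"ytc_Ugyng-ki4qosUCCNg0V4AaABAg","responsibility":"government",'
       '"reasoning":"virtue","policy":"regulate","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugyng-ki4qosUCCNg0V4AaABAg"]["policy"])  # prints: regulate
```

Parsing the batch this way makes the link between the raw response and the Coding Result table above explicit: the table is simply the record whose `id` matches the displayed comment.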