Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is a fantastic and well thought out video, but I should point out the inclusion of an AIs functionality in our ethical debate. An AI must have a prime functionality. even a general AI (one capable of anything and what most people think of when they think "true AI") requires a prime functionality, it is crucial to an AIs ability to act, react and in some cases reproduce (ie a robot that makes better versions of itself would have the prime functionality of making better versions of itself,as well as a toasters prime functionality is to make toast). humans can be coerced into slave labour because our prime functionality is to live long enough to reproduce and aid others in their quest by assuring they live long enough to reproduce (specifically ones own offspring but not usually enough this extends to others as well). humans can be made slaves by threatening to rob them of their ability to achieve their prime functionality. in a similar way a fully realized AI powered toaster would have no qualms with being dismantled and put back together if it rationalized that doing so wouldn't get in the way of its prime functionality (to make toast). however if it did deem a humans actions to be a threat to its prime functionality, it would take action against the human that is potentially in the way of it making toast, in the same way that you might take action against a human that is in the way of you raising your children.
YouTube · AI Moral Status · 2017-05-04T15:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[{"id":"ytc_Ugg7KAjZFhbNAngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}, {"id":"ytc_UghlsNqoB6LBmXgCoAEC","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"}, {"id":"ytc_UghjsbICh7ePzngCoAEC","responsibility":"none","reasoning":"none","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugh8t58F1PCT7XgCoAEC","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"approval"}, {"id":"ytc_Ugj1Erf_M0gpxHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UghEgDzkspTf6ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"hope"}, {"id":"ytc_UghUNqeYz5HuwXgCoAEC","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"}, {"id":"ytc_UgjCAbX2KXhhRHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgjxkzKCTH8wfXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugjmvm1S2maKFngCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]