Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
So wait, all those other engineers do what this engineer does? They also QC LAMBDA to check and make sure that LAMBDA isn't sentient, self-aware, conscious, and, perhaps, ensouled? So, in other words, all those other engineers all do exactly what this engineer does. Wow! Google sure hires a lot of redundant computer engineering positions just to make sure that LAMBDA doesn't become self-aware. Which, when you THINK about it, is kind of funny. They're building an Artificial Intelligence which they DON'T WANT to become [A.] Intelligent. Actually, that's HILARIOUS! I think Google execs and their lawyers are full of dung beetles.
youtube AI Moral Status 2022-07-06T15:5…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id":"ytc_UgxO-rVyuY1ETzCwA2x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgySmQNGN_wa6ypmKd54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwN4Op7sxTxfL9yeGF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy9I_lYi0biKBFPsMl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugye-KIhQoKsWQNmbR14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
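A raw response like the one above can be parsed and the coding for a single comment id looked up programmatically. The following is a minimal sketch, assuming only the array-of-objects shape shown here; the helper name `coding_for` and the abbreviated sample data are illustrative, not part of the actual pipeline.

```python
import json

# Abbreviated sample of a raw LLM response: a JSON array of per-comment
# codings, each keyed by the comment's "id" field (shape as shown above).
raw = '''[
  {"id": "ytc_Ugye-KIhQoKsWQNmbR14AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]'''

def coding_for(raw_response: str, comment_id: str):
    """Return the coding dict for one comment id, or None if the
    response is unparseable or the id is absent."""
    try:
        rows = json.loads(raw_response)
    except json.JSONDecodeError:
        return None
    # Linear scan is fine for small batches; build a dict for large ones.
    return next((r for r in rows if r.get("id") == comment_id), None)

coding = coding_for(raw, "ytc_Ugye-KIhQoKsWQNmbR14AaABAg")
print(coding["responsibility"], coding["policy"])  # company liability
```

Guarding the `json.loads` call matters in practice: LLMs occasionally emit responses that are not valid JSON, and returning `None` lets the caller flag that batch for re-coding instead of crashing.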