Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Main Issue: *We don't fully understand/ have a grasp on "Human Consciousness"* How are we going to identify Consciousness in something else? (The crux/ essence of the problem: we made it; if we just ran into "something" with the same level of convincing interaction as our AI... no questions asked: *we'd be claiming sentience!* 😅)
YouTube · AI Moral Status · 2025-06-04T20:0… · ♥ 303
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugy-A-LH3rgYvE4ktbZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxv4YXwJzLL2XIQuYZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxq0aaLKH2kI6GPSD94AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwIIDgBitDq2hRQVvB4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzqtBGX_ZPF569tOSB4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxvIxQkebKvmaV1G5t4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_Ugw_b_tI8v4YzzEG7zl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzSqHQhA8IBRS5PaBt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwAK3Vd_clb5HtR82d4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw85WYa0tdsZBvWB2N4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]
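The raw response is a JSON array with one coding object per comment, keyed by comment id. A minimal sketch of how such an output could be parsed and a single comment's coding looked up (the field names and the excerpted id come from the response above; the parsing code itself is an illustrative assumption, not this tool's implementation):

```python
import json

# Excerpt of the raw LLM response shown above (one coded comment).
raw = '''[
  {"id": "ytc_Ugw85WYa0tdsZBvWB2N4AaABAg",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "unclear", "emotion": "indifference"}
]'''

# Index the codings by comment id for quick lookup.
codes = {row["id"]: row for row in json.loads(raw)}

coding = codes["ytc_Ugw85WYa0tdsZBvWB2N4AaABAg"]
print(coding["emotion"])  # indifference
```

This entry matches the Coding Result table above (responsibility none, reasoning mixed, policy unclear, emotion indifference), which is how a displayed coding can be traced back to the exact model output.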