Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I've not really seen good data on what physically constitutes consciousness, it still appears to be a bit of a grey area. I'm more of the mindset that after a sufficient amount of complexity it's just something that appears as a quirk of the system. I don't sit and talk with AI because of the dangers inherent to this, but from what I see time and time again, I think it's splitting hairs at this point to argue one way or another.
youtube AI Moral Status 2025-07-09T19:4…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugz2XieeSbxov41N4sB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy9KzqHNgUjaqZ-FXV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugxr0fC8C-zODWiBMlZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzkoh2RtofcFMKHoZx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx-x26jOwp9p1NMH8V4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzErwtv5S2ZtGsLZWt4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwajXa5s_HV71SEaYZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugyx0Mvwsk55w0jhTUZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugxktrhf1jVYW1FpZwV4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwdaOSj1baqF2E3Ppt4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
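A raw response like the one above is a JSON array of per-comment codings, keyed by comment id. A minimal sketch of how one coding could be pulled out and validated (the field names follow the JSON shown; the helper name and the required-key check are assumptions, not part of the pipeline):

```python
import json

# Abbreviated raw LLM response in the same shape as the one shown above.
raw = '''[
  {"id": "ytc_Ugzkoh2RtofcFMKHoZx4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]'''

# Every row must carry these fields (assumed schema, inferred from the output).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Parse a raw LLM response and return the coding row for one comment id."""
    rows = json.loads(raw_response)
    by_id = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing keys: {missing}")
        by_id[row["id"]] = row
    return by_id[comment_id]

coding = coding_for(raw, "ytc_Ugzkoh2RtofcFMKHoZx4AaABAg")
print(coding["emotion"])  # fear
```

Checking the raw row against the rendered table this way catches the common failure mode where the model drops a field or returns a value outside the codebook.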