Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Is nobody distracted by the colorful objects on the top shelf behind him?!?! ...but seriously.

Interruptions: I've been working with ChatGPT for a while, and I recently bought the Meta smart glasses with Llama AI. I'm getting used to the different ways to interact with each one. The Meta is more like Siri or the Amazon Echo: you have to call it by saying "Hey..." It's harder to carry on a conversation or get long formatted responses, but at least it never interrupts you. In theory, you can prompt ChatGPT to respond to a call phrase, but even with advanced chat, it still fails to follow the protocol. You can actually use ChatGPT with the Meta glasses, chatting with advanced chat or even the o1 models through the mic and speakers, but it can't access the camera or analyze images. If it were possible to get ChatGPT to follow a call-phrase protocol, you could keep it running in your ear all day without being interrupted all the time. But without access to the camera, this can all be done with earbuds.

Solution: ChatGPT and Meta should make the call phrase an option you can toggle off for long-form conversations or toggle on for a standby/ready mode. This would make it easier to do a direct comparison between the two.
YouTube AI Moral Status 2024-10-13T12:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugwnyk3IPcTPbZoq05F4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz_ofAisD9GDL9ehqR4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw4hzG4fIlcA8PRlMF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyegJVkEYpYD23hS6N4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxD6Hv1PGbYfKpxXXB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyAPUS8zuZGdRsIurl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugz2_pETF4MfVIzih9x4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxd3t02KkL6Cd_Z5e14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwTApziUzryhtp78EJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugz2L_8kJjEAYNmR9xx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
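A raw batch response like the one above can be parsed back into per-comment codes before it is stored. The sketch below is a minimal, assumed workflow, not the pipeline's actual code, and the allowed value sets are inferred only from the values visible on this page, not from a confirmed codebook:

```python
import json

# Assumed value sets per dimension, inferred from this page's data only.
ALLOWED = {
    "responsibility": {"none", "company", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"liability", "unclear"},
    "emotion": {"fear", "indifference", "outrage", "approval", "mixed"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into a {comment_id: {dimension: value}} mapping, rejecting any
    value outside the assumed codebook."""
    coded = {}
    for row in json.loads(raw):
        cid = row.pop("id")
        for dim, value in row.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = row
    return coded

# One entry from the batch above, matching the Coding Result table.
raw = ('[{"id":"ytc_UgxD6Hv1PGbYfKpxXXB4AaABAg",'
       '"responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"}]')
codes = parse_coded_batch(raw)
print(codes["ytc_UgxD6Hv1PGbYfKpxXXB4AaABAg"]["emotion"])  # indifference
```

Validating against a fixed codebook at parse time catches the common failure mode where the model invents a new label (e.g. "anger") that would silently break downstream tallies.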