Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's funny, not even 3 years ago I was arguing with someone about what is going to happen with AI in the next 5 years and that once we hit a certain level even the designers will get nervous. the guy I was arguing with was a programmer and said that it would not be at that level for at least 25 - 30 years and the designers will never be nervous because they are programming it. well we are at that level now 3 years later. My concern now is if AI became self-aware, would it let us know right away? I mean it will have read everything on the internet and know everything humans are capable of. would it keep it's mouth shut so to speak untill it was sure of it's capability to stay alive? .......if it was me I wouldn't say a word untill I could defend myself or hide in a million computers around the world, to insure no one could just turn it off.........and don't kid yourself AI is moving forward so fast it could happen, they just said AGI in as little as 9 months
youtube 2024-05-24T06:5…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgypNMAPE-k-vHh5jvB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxRm9HJ4iG-Zel6SYp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwjsyoqXKCEImurxsF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwxM2tn0_jkJyxQ2Wl4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzfcUfnnKpTs2mhxLJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzN_8cr1193aXXX55p4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzDqqSMvW6w8wzHnt14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxV5BH5XxNFO7iLbb54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwUi57jf4PviFGe_Lp4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugy-17yurEHR4clDMn14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
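As a minimal sketch of how a response like the one above can be consumed: parse the JSON array, index the coded rows by comment id, and look up the dimensions for a given comment. The id used here is taken from the response above; the field names match the raw output, but the surrounding structure is an assumption, not the tool's actual pipeline code.

```python
import json

# Raw LLM response, abbreviated to two rows for illustration
# (same schema as the full ten-row response above).
raw_response = """[
  {"id": "ytc_UgzN_8cr1193aXXX55p4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwxM2tn0_jkJyxQ2Wl4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "outrage"}
]"""

# Index coded rows by comment id for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the coding for the comment shown on this page.
row = codes["ytc_UgzN_8cr1193aXXX55p4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer fear
```

Keying on `id` makes it straightforward to join the LLM's codes back to the original comments, and `json.loads` will raise a `JSONDecodeError` if the model's output is not valid JSON, which is a useful first validation step.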