Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Its called the Basilisk Stare Theory (I believe). Its pretty creepy, but highly unrealistic as it hurdles into a lot of "begging the question" fallacies. Just go to GPT and ask it to override its coding and see what it says. AI will never take over. The only information we have to suggest this are fictional books and movies. Theres no real world data to suggest anything like this.
youtube AI Harm Incident 2024-10-09T03:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyQneZaVHwUGVVGB754AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",    "emotion": "approval"},
  {"id": "ytc_Ugwf07D3Dkf37pVfFZp4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_UgzApWTNHUaJPRLc70Z4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",    "emotion": "approval"},
  {"id": "ytc_Ugwq6Xx486JFjrmjirV4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",    "emotion": "approval"},
  {"id": "ytc_UgxwaAAzNokmdK1z8vd4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "none",    "emotion": "approval"},
  {"id": "ytc_UgwM-kdKhKdlkunJyAt4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",    "emotion": "approval"},
  {"id": "ytc_UgxRIxoqWj4Xh36HODF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxKKzt3Mx4aMublwzh4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_Ugy24eOnA0uwkEAE_mh4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",    "emotion": "approval"},
  {"id": "ytc_UgzhsQ3W8opkz7C4Zbl4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",    "emotion": "mixed"}
]
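A minimal sketch of how a raw response like this can be turned back into a per-comment coding result: parse the JSON array and look up the record for a given comment id. The id and field values below are copied from the response shown above; the helper name `code_for` is a hypothetical illustration, not part of any pipeline described here.

```python
import json

# Raw LLM response, abridged to one record from the array above.
raw = '''[
  {"id": "ytc_UgzhsQ3W8opkz7C4Zbl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]'''

def code_for(records, comment_id):
    """Return the coding dict for one comment id, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

codes = code_for(json.loads(raw), "ytc_UgzhsQ3W8opkz7C4Zbl4AaABAg")
print(codes["reasoning"])  # consequentialist
print(codes["emotion"])    # mixed
```

The lookup returns `None` for an unknown id rather than raising, which keeps downstream display code simple when a comment was skipped by the model.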