Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Lex asks insightful questions and effectively invites Roman Y to participate in the convo and explore the threat from AI. Considering this interview (and book) and the input of Eliezer Yudkowski, humanity is counting on the people developing these systems to save their own kids' lives and therefore our lives. We may be about to come face to face with the Fermi paradox.
youtube 2024-06-27T18:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzVMpoQTwl77oyyzK94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyVTzGqDVa6Gocdp_N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwTmXflsrZvOqsydQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxqBv7kY4-LnkdKFu94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz2XUIFHC_UVxXR1GZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz8ctBEM7ir0D9WzlV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxXtSwI8t76z5xC7jJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugz7u0pBS5mp3_3BOZJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugx8HzK8h1vc-HEe2Ul4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzAJQUv7UPQjENtRep4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
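The raw response above is a JSON array of per-comment code assignments. A minimal Python sketch of how such a batch could be parsed and validated is shown below. The `ALLOWED` sets are an assumption inferred only from the values visible in this output; the actual codebook may define additional categories, and `parse_batch` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Assumed allowed codes per dimension, inferred from the values seen in
# this raw response; the real codebook may contain more categories.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "liability", "unclear", "none"},
    "emotion": {"fear", "outrage", "mixed", "approval"},
}


def parse_batch(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: codes}, checking codes."""
    coded = {}
    for rec in json.loads(raw):
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim} code {value!r}")
        coded[rec["id"]] = codes
    return coded


# One record from the response above, as a smoke test.
raw = ('[{"id":"ytc_UgxXtSwI8t76z5xC7jJ4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]')
result = parse_batch(raw)
print(result["ytc_UgxXtSwI8t76z5xC7jJ4AaABAg"]["policy"])  # → regulate
```

Keeping comment IDs as dictionary keys makes it straightforward to join each coding result back to the original comment text, as in the "Coding Result" panel above.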