Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
All A.I. would have to do to kill us all is to wait. What is 10,000 years to an a.i.?
youtube AI Governance 2023-07-07T19:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyTWcgVTyuJvIQmCAF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxvdIc6jNhRxhYKyXp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxwKIADm0Q2kdJ3A5d4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzIheuyhx5pAAXDsaV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzWxg3rj5UsRyxki2J4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwG9VsNQ00Dmi-Muwp4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyksiKCb9qLHhCPb5h4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzj5IkXh-LlViiGRiJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxKQ1yUjlnX3gzE-5h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwGhOaeZbWlT-J4svh4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
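Since the model codes comments in batches, the entry for a given comment must be pulled out of the array by its id. A minimal sketch of that lookup, assuming the raw response is valid JSON and that `lookup_coding` and the two-entry `raw_response` here are illustrative names, not part of the actual pipeline:

```python
import json

# Abbreviated raw batch response: a JSON array where each element
# codes one comment along the four dimensions shown in the table above.
raw_response = '''[
  {"id": "ytc_UgyTWcgVTyuJvIQmCAF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzIheuyhx5pAAXDsaV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str) -> dict:
    """Parse a raw LLM batch response and return the coding for one comment."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            # Keep only the four coding dimensions, in a fixed order.
            return {d: entry[d] for d in DIMENSIONS}
    raise KeyError(f"no coding found for {comment_id}")

coding = lookup_coding(raw_response, "ytc_UgzIheuyhx5pAAXDsaV4AaABAg")
print(coding)
# {'responsibility': 'ai_itself', 'reasoning': 'consequentialist', 'policy': 'unclear', 'emotion': 'fear'}
```

Filtering down to the fixed dimension tuple guards against the model emitting extra keys, and the `KeyError` makes a missing comment id fail loudly rather than silently.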