Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Dan Brown's 2017 novel, Origin, fictionalizes the invention of a 2-story "supercomputer" which can access all historical data and interpret it through an agent named "Winston". Today's quantum computers with control and cooling systems can be as large as 9 cubic meters (7x7x7 feet). Our AIs also have names; e.g., Siri, Alexa, Cortana, Bixby, etc. The goal of Winston was to determine "Where did we come from; why are we here and where are we going?" What are the goals of today's AI? Winston also had an automatic erase feature which was triggered by the release of its goals and the death of its inventor. Have any auto erase features been installed in today's AI? If not, why not?
youtube AI Jobs 2025-12-03T22:4…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxLmoPls3jkOTQFnNt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxyC05p0lfCwYbyj4B4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyVXKDdiKa7ojp86eR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwqc1KqUHkrUAY4e114AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzvodrN-5KPufQElWF4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwVQYmgIy6CR_9mKWp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugz9X8D_0vu0-SLT9oZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx-95kSpGUUkG6ItdZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyfUgT1C8GHo0jUSVh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx6SRm4mc5ANsX4tkx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
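The raw response above is a JSON array with one coding record per comment, keyed by comment id. A minimal sketch of how such a response could be parsed to look up the coding for a single comment (the helper name `coding_for` and the abbreviated two-record payload are illustrative assumptions, not part of the original pipeline):

```python
import json

# Abbreviated example payload in the same shape as the raw response above
# (two records shown for brevity; the real batch contains ten).
raw_response = '''[
  {"id": "ytc_UgxLmoPls3jkOTQFnNt4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxyC05p0lfCwYbyj4B4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]'''

def coding_for(comment_id, response_text):
    """Return the coding dict for one comment id, or None if absent/unparseable."""
    try:
        records = json.loads(response_text)
    except json.JSONDecodeError:
        return None  # malformed model output: flag for manual review instead
    return next((r for r in records if r.get("id") == comment_id), None)

result = coding_for("ytc_UgxyC05p0lfCwYbyj4B4AaABAg", raw_response)
print(result["emotion"])  # -> indifference
```

Guarding the `json.loads` call matters here because a raw LLM response is not guaranteed to be valid JSON; returning `None` lets the caller route the batch to manual inspection.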