Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Absolute nonsense! 😂. There's no way on Earth that these predictions are going to come true in anything like the timeline proposed here. One AI 'brain' will not be building its own servers and designing its own chips in order to simply make itself obsolete. Nor will humans be sitting on their bottoms having a picnic every day and watching box sets while AI brain(s) operate bulldozers to extract rare earth minerals to manufacture 'new brain'. AI won't be building cars, making new tooling (and maintaining those tools with spares and oiling them). It won't be running TSMC and making super chips. Nor is it possible for a fleet of future warships, planes and tanks to be designed, built and tested super-fast! The energy infrastructure alone required to do all this would be off the chart. The notion that humans would be so stupid as to build rockets and a massive, inconceivable space ship in order that malevolent AI code can explore the universe is frankly ludicrous. If any of that comes true even within the next 100 years it would be miraculous. Add 100 years to the timeline I might buy 50% of this dream 😮
youtube AI Governance 2025-08-03T21:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
{"id":"ytc_Ugw8bo98hbm2kDYhxR14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyM9WSB6qtyZa-QI0R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwoP7V8ctjiaGvPawF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2Gc7yuNbtqZV4d-p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzea5k0n-h9ZAwcihZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzVn5uSBg9vENreBSV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyt_5Fdn3CPpePutMB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyC35tQOQ_30F_m4IJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzjrkFx9LKpx0sAWSB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxWU1_DDHaF378hqGB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
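The Coding Result table is populated from a batch response of this shape. A minimal sketch, assuming the response is a JSON array of per-comment objects with the fields shown above, of parsing it and looking up one comment's coding (the sample array here is a two-object excerpt, not the full response):

```python
import json

# Excerpt of a raw LLM coding response: a JSON array of per-comment codings.
# Ids and field names mirror the response format shown above; this is a
# two-object illustrative sample, not the full batch.
raw = '''[
{"id":"ytc_Ugx2Gc7yuNbtqZV4d-p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzVn5uSBg9vENreBSV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]'''

# Index the codings by comment id so a single comment's dimensions
# (responsibility, reasoning, policy, emotion) can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_Ugx2Gc7yuNbtqZV4d-p4AaABAg"]
print(coding["emotion"])  # outrage
```

Indexing by `id` makes it cheap to join the LLM's output back to the original comments, and a missing id surfaces immediately as a `KeyError` rather than a silent mismatch.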