Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Stage #1: Turing has a brilliant idea and writes some theorems.
Stage #2: Some brainy philosophers discuss the stuff in 1000-page-long treatises.
Stage #3: Some mathematicians dedicate hundreds of hours of excruciating theoretical research to the topic.
Stage #4: Some scientists begin building primitive automatons trying to figure out the base of the concept.
Stage #5: Some advances are made, and the primitive models are beginning to take shape.
Stage #6: Some brilliant computer engineers build some basic advanced hardware that can incorporate the technology.
Stage #7: Some developers start coding software to take advantage of the hardware.
Stage #8: The code is used to solve some technological problem.
Stage #9: Some idiot funds a start-up in Silicon Valley to apply AI tech to generate funny animal faces on Zoom and MS Teams.
Stage #10: The tax payer ends up paying for more data centres fed with "clean energy" because "they will bring jobs", only for the purpose of people being able to have an AI-powered funny animal face when on Zoom.
That's the Universal Law of the Shittyfication of Technology.
youtube AI Governance 2024-02-18T20:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgzXNVDFcUAsu0IC5Ah4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxT3F1DRjpzMNCg9nJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear",          "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxtGXPL5qOTbKzhesZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzpKR_DITG-pg0EUU14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwaFChjOx-zFkmWzNh4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugxan297aCmYJ--6jXp4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy-bY5_4lD_Sgua4p14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none",    "emotion": "indifference"},
  {"id": "ytc_UgyPErZbKQaTt5Gmn3F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwmX9ujS4XSKX0zOiZ4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugw1RjIX7h-gE4jTMWN4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
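The raw response is a JSON array with one object per coded comment, keyed by comment id. A minimal sketch of how such a response could be parsed to look up the coding for a single comment (the field names are taken from the response above; the two-entry array and the lookup helper are illustrative, not part of the coding tool itself):

```python
import json

# Abbreviated copy of the raw LLM response above: one object per comment.
raw = '''[
  {"id": "ytc_UgzXNVDFcUAsu0IC5Ah4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxT3F1DRjpzMNCg9nJ4AaABAg",
   "responsibility": "ai_itself", "reasoning": "unclear",
   "policy": "unclear", "emotion": "fear"}
]'''

# Index the entries by comment id so a single comment's coding
# can be inspected directly.
codes = {entry["id"]: entry for entry in json.loads(raw)}

coding = codes["ytc_UgzXNVDFcUAsu0IC5Ah4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → none indifference
```

This matches the table shown under "Coding Result": the first entry's dimensions (responsibility = none, emotion = indifference) are the values displayed for this comment.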