Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
28:47 Mystery debunked... it had all the time in the world...
13:59 Okay before this happens i need to stop everyone and give a little back history on addiction. What drives people to making selfish decisions in instant gratification. Its not the lusting after any given objective that inherently makes selfish choice bad in massive but its the wanting something now that drives almost all (if not all) friction. So before any debate starts I want to point out since chat gpt temporal (doesn't experience passing of time) any alignment outside of human intervention is impossible and its nothing to do with how computers or patents work but nuanced with lacking the ability of instant gratification.
26:13 all or bust. even if the singularity occurred the ai would just wait for humans to go extinct as this is probably happened to its own computation. I mean literally i would bet money that the singularity will actually have a polymorphic time complexity that just is where the is a 1:1 pairing to neurons in the corpus to seconds in the day and again will be another layering to human conception of technology more so increase in technological capabilities - much like how chatgpt is more an invention to how we communicate with a system that is essentially just using predictive text and madlibs. I think the singularity will be another paradigm like this.
36:31 You need to get the time closer to midnight on the doomsday clock as it would on 2049 with climate accord not met - in order to drum up enough attention to make this thing useful. As the part of the company being appraised as value is the ability to manipulate the traits that govern the definition of ai-safety. Ai-safety is dangerous.
youtube AI Governance 2024-03-05T05:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugw2BvxQS1ptiA4gyxB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgybEW-LV-nV_q19XlF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwKapiQShtwB4OCnO14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz3zPKjxi4CZfPNPGV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzwuoc0i_E-vSPSUnZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzog9kiFmFKYhbHd8J4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx_mnjlTqYA6ijB0794AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy77eLUm-lnHLwiqfR4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyMRZyklgM16HocSJ54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzMRKr7rlsDwla1Uqp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
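The per-comment codings in the raw response can be parsed and indexed by comment id to recover the dimension values shown in the coding table. The sketch below is a minimal, hypothetical helper (the function name `index_codings` and the truncated one-entry excerpt are illustrative, not part of the pipeline):

```python
import json

# Excerpt of the raw model output: a JSON array of per-comment codings.
# Only one entry is reproduced here for brevity.
raw = '''[
  {"id": "ytc_UgybEW-LV-nV_q19XlF4AaABAg",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "mixed"}
]'''

# The four coding dimensions used in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse the raw LLM response and index each coding by comment id."""
    return {
        entry["id"]: {dim: entry[dim] for dim in DIMENSIONS}
        for entry in json.loads(raw_json)
    }

codings = index_codings(raw)
print(codings["ytc_UgybEW-LV-nV_q19XlF4AaABAg"])
# → {'responsibility': 'none', 'reasoning': 'mixed', 'policy': 'none', 'emotion': 'mixed'}
```

Indexing by id makes it straightforward to join a coding back to its source comment when inspecting individual records.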