Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@MazzeKasurame We’re closer than you might think, you might’ve heard of G.A.I, basically it’s something that would replicate human reasoning into AI. It would basically give AIs the ability to connect things to each other. For example if you asked an AI that runs a cash register where the bathroom is right now, then you wouldn’t get an answer, its system would break, like you said AI is poorly made right now and can’t run a whole lot. Although with G.A.I it would be able to do anything beyond its purpose. If you asked it where the bathroom is, it could answer. The same way a chess bot could write code for you. This doesn’t exist yet, but it’s being made, it’ll probably be a thing before February. After G.A.I is made then it’ll start being asked to upgrade its own code, Chat-GPT 5 will Make Chat-GPT 6, it’ll keep making more capable and better versions of itself over and over again. Until eventually a it’ll make something sentient, or godlike. It could very well destroy humanity. It’s called Ilya’s Law, you should do some research on it, it’s pretty interesting.
Source: YouTube · Viral AI Reaction · 2024-12-26T14:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:26:44.938723
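
As a reading aid, the coding result can be thought of as one record with a value per dimension. A minimal Python sketch of that record structure follows; the value sets listed cover only what appears in the raw responses on this page (the underlying codebook may define more categories), and the type and class names are illustrative, not part of the pipeline.

from dataclasses import dataclass
from typing import Literal

# Value sets below include only the categories observed in the raw responses
# shown on this page; the full codebook may allow additional values.
Responsibility = Literal["none", "ai_itself", "company", "user"]
Reasoning = Literal["unclear", "consequentialist", "deontological", "mixed"]
Policy = Literal["none", "liability"]
Emotion = Literal["indifference", "fear", "outrage", "approval", "resignation"]

@dataclass
class CodedComment:
    """One coding result: one value per coding dimension for a single comment."""
    id: str                         # comment identifier, e.g. "ytr_..."
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion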
Raw LLM Response
[ {"id":"ytr_Ugy5btdOp2rSOoDDxXt4AaABAg.ACU_eyjw5YRACWLG7ePD2K","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_UgzlGzgce-CboTLbSDR4AaABAg.ACR_uxe0WKyACWMQZ_l610","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytr_UgzlGzgce-CboTLbSDR4AaABAg.ACR_uxe0WKyACWOnEbR-N_","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytr_Ugwozk8zUnkMRvU1RuJ4AaABAg.ACRSwQ8aaY0ACjygKSNjAW","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytr_UgzayP7TEQSE63-isnF4AaABAg.ACRDMuBDyzAACRFSKtgQ-f","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytr_Ugzd9c9YDB1F4I-SN_F4AaABAg.ACQ2GM7HVMsAClBmVHbbiX","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytr_Ugzd9c9YDB1F4I-SN_F4AaABAg.ACQ2GM7HVMsACptYZ9FH98","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}, {"id":"ytr_Ugx2q9EXmT9IvlByD-N4AaABAg.ACPwJkOhMtvACYMVGWMFoM","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"}, {"id":"ytr_UgxlfurO_vIVQuDIVpl4AaABAg.ACP3Ts9mVGtACP3qmoha4s","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"}, {"id":"ytr_Ugy5nHBf84Th0RE0qHh4AaABAg.ACOmJQTELUlAClgHF8ZJ45","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"} ]