Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@SamTheHedgehogTUOS Well depends on us Humans
Think of Intelligence this way
There comes a point where sentience develops into sapience
Sentience basically means having a consciousness
That includes Animals
Higher sentient beings includes Humans, Chimps etc
But Sapience is different, it involves learning, improving and predicting etc
This is a property shown by no animals
Only humans
Now AI right now isn't sapient
But if we keep on improving it, in the near future it will be
It will be like us humans, it can decide what is right or wrong, it will want rights, it will want love etc
Now do we need to fear that? Depends on how we treat _that_ future sapient AI
As that AI is just like humans, if we treat it right, it will live with us in harmony
If not, frankly speaking, we stand no chance
There's two futures for Humanity
1. We kill ourselves (Pollution, War and ofc AI, that will be our fault, not AI's)
2. We act peaceful and just, AI will bring us thousands of years into the future
Essentially, it's not the AI you should be afraid of
It's the Humans
When it comes to replacing us
YES, it will replace us (in far future ofc)
But think of it this way, if AI will do everything, and we'll be still in control
Wouldn't it be an eternal vacation for humans
As I said, the only thing that will mess up our future is ourselves
A gun doesn't kill a person, a person does
Source: youtube · Viral AI Reaction · 2025-10-12T12:3…
Coding Result
Responsibility: none
Reasoning: consequentialist
Policy: none
Emotion: approval
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgxJFBCu7asM03ZpPIh4AaABAg.AH7hCwC_u29AKzkL-kcutD","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxJFBCu7asM03ZpPIh4AaABAg.AH7hCwC_u29AL7ZRQMKhuH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgyCrWKBRrPEksMVeeN4AaABAg.AH7Yt91RGk_AKfYd3ygnNK","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyCrWKBRrPEksMVeeN4AaABAg.AH7Yt91RGk_AOAueGcA0Ie","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyRO4ANdScY54fECmp4AaABAg.AH6qsH5Vap8AH76MOPV3bx","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgyRO4ANdScY54fECmp4AaABAg.AH6qsH5Vap8AH79DBSv3lh","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgyRO4ANdScY54fECmp4AaABAg.AH6qsH5Vap8AH79gVZFLCN","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyRO4ANdScY54fECmp4AaABAg.AH6qsH5Vap8AH7A_GOhv6e","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy0Tm0kbFRsN2WkENV4AaABAg.AH5sBkY4BKkAH76l5cfzbh","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxqdpSsvRqcoOB09Ux4AaABAg.AH4_SLh59bzAH4pO-qTTVd","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
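Since the raw response is a JSON array pairing each comment id with values for the four coding dimensions, the per-comment coding shown above can be recovered by indexing the array by id. A minimal sketch, assuming the schema seen in the raw response; the id `ytr_abc` and the `lookup` helper are hypothetical illustrations, not part of the actual pipeline:

```python
import json

# Hypothetical raw LLM response following the observed schema:
# one object per coded comment, keyed by a comment id.
raw = """[
  {"id": "ytr_abc", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Index the array by comment id for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(comment_id):
    """Return the four coded dimensions for one comment, or None if absent."""
    row = codings.get(comment_id)
    if row is None:
        return None
    return {dim: row[dim] for dim in DIMENSIONS}

print(lookup("ytr_abc"))
```

Keying on the id rather than scanning the list makes it easy to cross-check each comment's displayed coding result against the exact model output.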