Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The thing is… I go back, look at what happened, and then build something together from there. Like a scientist. And AI doesn’t have the capacity to do that. To hold multiple instances of something and reason something together. Whenever I play with chatGPt I try to have two separate conversations at the same time and involve each other and it never understands. I can say, “magic beans grow 3 sizes in one day when held.” Then I can say, “I have 3 friends, but one of them can double magic beans sizes every day.” But if I tell it, “I give each of my friends 2 beans, what are the sizes of each.” ChatGPT has no clue what that means. I stopped 6 minutes in because this became wildly speculative without base proof. Why do I bring this up? Because that’s how you need to navigate life. You need to build a case up to it. And your proof so far has been, “we’ve all chatted and it seems almost lifelike.” But you haven’t shown it and in my experience it isn’t lifelike. So my experience, nothing past that can be taken as genuine. Because it isn’t the case.
youtube AI Moral Status 2024-06-06T02:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           unclear
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgxWtrGTSruwq6FL-z14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugzw673DBhG_jw-9YSl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyYRZdMJNKZBSLprkd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugy7ImoNCSW5HYc1Rad4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgxiUl4Jssd_wxUdjpB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgzR7p5ivW5dunfEhv94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugy4emBvbpCCRRo1V8x4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyGsbsbZ4eMGckHjWd4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgzYoOiZadWUv8Bau3N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgztGu4-CPlaivDOPrl4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"liability","emotion":"unclear"}]
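As a minimal sketch of how such a raw response can be inspected programmatically (assuming, as in the output above, that the model always returns a JSON array of objects keyed by a comment `id`; the helper name `code_for` is hypothetical):

```python
import json

# A trimmed example of the raw LLM response shown above: a JSON array with
# one coding object per comment, using the same dimensions as the table
# (responsibility, reasoning, policy, emotion).
raw = (
    '[{"id":"ytc_UgxWtrGTSruwq6FL-z14AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]'
)

def code_for(raw_response, comment_id):
    """Return the coding dict for one comment id, or None if it is absent."""
    rows = json.loads(raw_response)
    return next((r for r in rows if r["id"] == comment_id), None)

row = code_for(raw, "ytc_UgxWtrGTSruwq6FL-z14AaABAg")
print(row["emotion"])  # indifference
```

A lookup by id like this makes it easy to cross-check the rendered table above against the exact model output for any coded comment.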