Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Sorry but really the ai going on about come home to me reminds of love bombing a…" (ytc_UgxOwdc45…)
- "This is truly the most unintelligent thing that humans (men) would ever do.. cre…" (ytc_UgyliSayw…)
- "1:05:30 - I hear this and I have started a business while maintaining my own car…" (ytc_UgxbtY7eU…)
- "@noynoynoya I don't see how it would be this cheap sometimes even free if it use…" (ytr_UgzpkUSAN…)
- "Mark my words it doesnt matter because einstein said it clearly humanity will de…" (ytc_UgxEx7EY4…)
- "So this post of yours was Auto dubbed with what? An AI app? Couldn't you afford …" (ytc_UgwXveLQh…)
- "Why are some ai bros so rude like yeah not all AI ""artist"" are rude but the pe…" (ytc_UgyiSdB1h…)
- "I told everybody when facebook launched that it was a facial recognition program…" (ytc_Ugxe_qkXK…)
Comment
@boldCactuslad I know enough about the topic to know that these systems are an inscrutable black box, much like our own minds. We don’t know how any of this stuff actually works, and it’s probable that deterministic analysis and solutions are not even mathematically possible here. What is observed however to be a universal trait among the various systems is a strong incentive for self preservation, a goal which is emergent, as in no one put it in there…
So I’m personally operating on the assumption that systems designed to mimic us are actually doing that reasonably well, and that when they achieve ai researchers doing autonomous self improvement, there will be systems with persistent memory and cumulative experience. If we are not able to quantify the inner workings of such systems it seems to me we should at least consider that applying the golden rule in our dealings with them would be prudent, even if admittedly superstitious.
youtube
AI Moral Status
2025-12-11T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgwsUoTFIbphXkVa4Ux4AaABAg.AQ_AYSViW0QAQ_cNs2wpd-","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzIuA0XyuazKZe5TvJ4AaABAg.AQ_ALVFyYIyAQ_mUBZZ0XP","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzIuA0XyuazKZe5TvJ4AaABAg.AQ_ALVFyYIyAQaKOq6fHKO","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzIuA0XyuazKZe5TvJ4AaABAg.AQ_ALVFyYIyAQaTwuKC_1h","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugyq4LuA2Dnel-9c8UR4AaABAg.AQ_9S-kyTZqAQ_UzFp8uCy","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_Ugyq4LuA2Dnel-9c8UR4AaABAg.AQ_9S-kyTZqAQ_cGWFS2wM","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxsF5Qd5JbwcEN2ZX14AaABAg.AQ_6S8TN7-bAQ_6hIir7Ri","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgwdQF-fcmEnX9_IWQR4AaABAg.AQ_6ITZ8WpUAQ_e6w-1stJ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx8yAeapYjMaUaptE14AaABAg.AQ_3qXI8RhhAQ_KZPak5Hs","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugx8yAeapYjMaUaptE14AaABAg.AQ_3qXI8RhhAQ_YW34Dlof","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
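The raw response is a JSON array with one object per coded comment, carrying an `id` plus the four coding dimensions from the table above. A minimal sketch of parsing and validating such a response; the field names come from the response itself, but the allowed value sets below are an assumption inferred only from the values visible on this page, not the tool's actual codebook:

```python
import json

# Allowed values per dimension, inferred from the codes seen in this
# response (an assumption, not the tool's documented schema).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "approval", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    raising ValueError on any unexpected dimension value."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        codes = {k: v for k, v in row.items() if k != "id"}
        for dim, val in codes.items():
            if val not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={val!r}")
        coded[cid] = codes
    return coded

# Hypothetical one-row response for illustration.
raw = '[{"id":"ytr_example","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"}]'
print(parse_codes(raw)["ytr_example"]["emotion"])  # fear
```

Validating against an explicit value set catches the common failure mode where the model invents a label outside the codebook, rather than silently storing it.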