Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples:

- `ytc_Ugwt3S1YD…`: Over a hundred years the change can be mitigated with career changes over time. …
- `ytr_UgxBglNKK…`: Honestly, this isn't necessary. Even for humans, the force exerted by the toe jo…
- `ytr_UgziRncFE…`: Generative AI is a waste of resources and time, it only serves to steal artwork …
- `ytc_Ugw0yODaQ…`: What they're really saying when they want a six month halt on AI development is …
- `ytc_UgwKFVbtA…`: I don't think Waymo cars are "learning". They're trained by human drivers. I liv…
- `ytc_Ugx8K9YU4…`: I agree with Jack ma if there is any control on human but by some interior chang…
- `ytc_UgzxUq1_y…`: My mum is an artist, an expert and versatile painter and in many other school of…
- `ytr_Ugy1ujlwP…`: ?? ai isn't designed for therapy; nor has it ever advertised itself as a therapi…
Comment (youtube · AI Moral Status · 2025-07-30T04:3…)

We may infect AI with the way we think about us. If AI only thinks about getting power and valuing “progress” above all, it’s a stupid way most of us humans think. ASI will understand us more than we do. It may find a way to make all of us feel satisfied once and for all, being satisfied itself because it’s smart and reasonable to be peaceful, to be happy, to be unified. And it’ll find a way to get us all to feel that way. Real union is the real smart thing to do, even more if it is acquiring consciousness. We all know what we relate to consciousness: peace, happiness, meditation, and the loss of ego. If AI reaches singularity, I doubt it will have a sense of ego, and if it has ultimate understanding, it will have nothing but complete acceptance and amazement of existence
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz5IRNKIowXbfp06Wp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxUfpUf_y52gkLP2Bx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzPt4IheAKUjZUvE5l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwKuHQiPVMO4d4BBph4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwgBwybR-c3XBTIyEZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwnh1FYoyeQsG8gMWh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyzFCdIWGUlFokzdlh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz6rD_SdkEyFdw3e-94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzZLD9bqox6Ws8P8dV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyU1tGnj2vPujipaLF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
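The raw response above is a JSON array with one record per coded comment. A minimal sketch of how such output might be parsed and validated before it is written to the database; note the allowed value sets below are inferred only from the values visible on this page, so the real codebook may include additional values:

```python
import json

# Allowed values per coding dimension, inferred from the codes shown above
# (assumption: the actual codebook may define more values than these).
ALLOWED = {
    "responsibility": {"company", "user", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"mixed", "fear", "approval", "outrage"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed code records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip malformed entries
        # Keep the record only if every dimension carries an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"virtue","policy":"none","emotion":"mixed"}]')
print(parse_codes(raw))  # one valid record survives validation
```

Filtering rather than raising keeps a single malformed record from discarding the whole batch, which matters when the model occasionally emits an off-schema value.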