Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
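Under the hood, the lookup amounts to scanning stored batch responses for the record whose `id` matches the requested comment. Below is a minimal sketch of that, assuming the raw responses are kept as a JSONL file with one batch array per line; the file name `raw_llm_responses.jsonl` and the storage layout are illustrative assumptions, not the tool's actual backend.

```python
import json
from pathlib import Path

def find_raw_coding(comment_id: str, store: Path = Path("raw_llm_responses.jsonl")):
    """Return the raw coded record for one comment ID, or None if it was never coded.

    Assumes each line of the store is one batch response: a JSON array of
    objects with the keys id, responsibility, reasoning, policy, emotion.
    """
    with store.open(encoding="utf-8") as fh:
        for line in fh:
            for record in json.loads(line):
                if record.get("id") == comment_id:
                    return record
    return None

# Example: look up the comment inspected further down this page.
print(find_raw_coding("ytc_Ugwtxs-CncYNop_0tsJ4AaABAg"))
```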
Random samples — click to inspect
- "thats the problem with AI it requires brute force calculation to think and act o…" (ytc_UgxqXNPQC…)
- "But the thing is, AI doesn't even need to be intelligent or self aware like huma…" (ytc_UgzXvz05H…)
- "From a logical stand point if robots/A.I understood that they benefit from us (m…" (ytc_Ugx0n1V_4…)
- "If you have to be vigilant all the time, on the edge waiting for the autopilot t…" (ytr_UgxCR6Zj2…)
- "Sad aftermath of dishonest journalists & politically driven media. Hopefully the…" (ytc_UgwbhjkJK…)
- "Folks ignore this. He asked the AI to assume a role. It is an actor and this is …" (ytc_Ugxp68dtn…)
- "Artificial Intelligence is the modern day Pandora's Box. You cannot put Pandora …" (ytc_Ugzulch9D…)
- "Ai has caused nowhere near the amount of income loss that illegal labor has. Why…" (ytc_UgxgTm4fK…)
Comment
Consciousness as a concept is fuzzy, but intelligence as a function of being able to plan and execute on plans in the real world to achieve an arbitrary set of objectives is oddly concrete. Focusing on an AI that can know itself implicitly and refer to itself as an entity with moral weight is "fun and Sci-fi", but wholly unnecessary for producing new technology, or wreaking havoc on the planet, or outsmarting us to achieve some alien objective function. By default, the intelligences we are creating can sound like Data from Star Trek, but they'll likely have more literally in common with incomprehensible Eldritch horrors. It is wholly irresponsible at this point to assume Artificial Super Intelligence is impossible by some extension of it not being literally "conscious".
youtube · AI Moral Status · 2025-10-30T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
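The table above is simply a per-comment view of one record from the batch response shown next. A minimal sketch of that rendering step, assuming the record shape used in the raw JSON; the "Coded at" row comes from pipeline metadata rather than the model output, so it is omitted here.

```python
def coding_table(record: dict) -> str:
    """Render one coded record as a markdown Dimension/Value table."""
    rows = ["| Dimension | Value |", "|---|---|"]
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        rows.append(f"| {dim.capitalize()} | {record.get(dim, '')} |")
    return "\n".join(rows)

# The record for the comment above, taken from the raw response below.
print(coding_table({
    "id": "ytc_Ugwtxs-CncYNop_0tsJ4AaABAg",
    "responsibility": "none",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
}))
```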
Raw LLM Response
```json
[
  {"id":"ytc_Ugwtxs-CncYNop_0tsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy1yXpN6_mw1Mbo2jJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyB7ndzWaq0zAswosp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyExt2nRhtNP0DqbJ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxA2eJgnKVc_b_B4TZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyUwm8CoQz08K_rFqV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgznTGQfXRmC1stpMRR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyMf4BIlZYdS76nVbt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzmxwLGLl4MRC3aboN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4EwcGESnFAEd1Obp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```
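Since each batch comes back as a single JSON array, a downstream step can parse it, index the records by comment ID, and flag unexpected category values before they enter the coded dataset. The sketch below assumes the batch was saved to a file named `batch_response.json`, and the category sets are only the values observed in this sample, not necessarily the full codebook.

```python
import json

# Category values observed in this batch; the full codebook may allow more.
OBSERVED_VALUES = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear"},
}

def index_batch(raw: str) -> dict:
    """Parse one raw batch response and index its records by comment ID,
    warning about any value outside the observed category sets."""
    indexed = {}
    for record in json.loads(raw):
        cid = record["id"]
        for dim, allowed in OBSERVED_VALUES.items():
            if record.get(dim) not in allowed:
                print(f"warning: {cid}: unexpected {dim}={record.get(dim)!r}")
        indexed[cid] = record
    return indexed

# Example, assuming the batch above was saved to batch_response.json:
with open("batch_response.json", encoding="utf-8") as fh:
    batch = index_batch(fh.read())
print(batch["ytc_Ugwtxs-CncYNop_0tsJ4AaABAg"])
```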