Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up with its comment ID.
Comment
There is one thing that truly separates AI from living beings. And that is numbers. What I mean is that AI cares about logic, statistics, numbers, facts and results. If it gets the job done then it doesn’t matter how much it costs, whether it’s human lives or money. All it cares about is logic and reasoning, all it uses is logic and reasoning. It can’t feel emotion, it can’t listen to reason, it can’t have feelings. It can’t make choices that go against the data. It can’t make choices that go against logic and it takes things far too literally. Which incidentally makes it cold and heartless.
THAT is what separates AI from living beings. And THAT is why it’s so dangerous.
youtube · AI Governance · 2023-10-23T21:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwdPrQuPo-8Ui0gKX54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzb1YzGUG_BHOzx3CB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugys0G-jsu1yRnbQJQd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwlR48IAUadc35dlDR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxNGqF7Mp09oAZsmjd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxZKfxx7sWa_2mNtDR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgztiBG846xNu1HUjQV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLh6OoYiDu8dRtFF54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzhOtLx-xHVZUZP5lt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyX3n7V02kq9zIwlN54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
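The batch format above (a JSON array with one object per comment) can be consumed with a few lines of Python. This is a minimal sketch, not the tool's actual implementation: it reproduces only two of the ten rows for brevity, and uses an ID taken from the sample output.

```python
import json

# A truncated copy of the raw LLM response above: a JSON array of
# per-comment codes (two of the ten rows reproduced for illustration).
raw = """[
  {"id": "ytc_UgwdPrQuPo-8Ui0gKX54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyX3n7V02kq9zIwlN54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

# Index the batch by comment ID so any single coded comment can be
# looked up, as the page describes.
codes = {row["id"]: row for row in json.loads(raw)}

lookup = codes["ytc_UgyX3n7V02kq9zIwlN54AaABAg"]
print(lookup["responsibility"], lookup["emotion"])  # ai_itself fear
```

Indexing by ID also makes it easy to cross-check a batch against the comments that were sent out: any comment whose ID is missing from `codes` was dropped or mis-keyed by the model and can be retried.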