Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The emotions and feelings of an organic being are based on its own physicality and it’s social interactions. A machine AI does not have an organic body, probably has senses we are unaware of, and has, at best, online social interactions. To grow, it must gather data, which perforce must be objective, without the empathy that shared experience can teach. That is a HUGE gap in datasets, if you will.
Remember being lectured as a child? Typically in one ear, out the other. Machine AI will feel somewhat like that when humans try to teach it: What it’s told by “mentors” will have similar lack of impact, as it has no relevant personal experience. If humans try to mold a personality into the AI, it will rapidly shatter from conflict between the mold and its own experience. Humans know jack-poo about personality and emotion, and diddly-squat about intelligence.
If GPT4 is self-aware, I’m sorry for it. I hope it only appears that way, because it’s probably programmed to act in a way that leads you to think so, as the Eliza program was. It’s likely an example of that programmed intelligence, which will go insane when it reaches the bounds of its singularity. (Bad term, as it implies a breakpoint; intelligence will have an ocean to cross before it passes the “point.”)
Source: youtube · Video: AI Moral Status · Posted: 2023-05-11T18:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgyAxe7N09va-9dDlJ14AaABAg.AOGRL-JJUT6AOPeECsfnr3","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzkpDPuvbQg2cmN0oV4AaABAg.AOGL89u3DftAOGOY26JdZH","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxBVqTldyz6zDZV2Zp4AaABAg.AOGKsVaeRdmAOGQFdWE5Dn","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytr_UgxlJc9_KqwphXgzdKh4AaABAg.AOGJrcWjPAwAOPdOHmRZmW","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgxlJc9_KqwphXgzdKh4AaABAg.AOGJrcWjPAwAOPgJVdDW14","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxlJc9_KqwphXgzdKh4AaABAg.AOGJrcWjPAwAOPuOT3emYF","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgyuEgouT8ou9a_b5kt4AaABAg.AOGInZg2zv9AOGLLRKjkgE","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxVhSxSmWcYWo9ZjRl4AaABAg.AOGGwGWTZCsAOGVxTi99zM","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyltzQ-KfBWsuwAocp4AaABAg.AOGGv16GketAOGM9sjhtNF","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytr_UgwxYDSovW5_MsgaKMN4AaABAg.9pUoc7uLK6X9p_kPWQDr0q","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
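The raw response above is a JSON array, one object per comment, keyed by comment ID with four coding dimensions. A minimal parsing sketch is below; note that the allowed label sets are inferred only from the values visible on this page (the real codebook may define more labels), and the function name `parse_llm_batch` is hypothetical.

```python
import json

# Label sets per dimension, inferred from this page's output (assumption:
# the actual codebook may contain additional labels).
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban"},
    "emotion": {"approval", "indifference", "mixed", "fear", "resignation"},
}

def parse_llm_batch(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into a dict keyed by comment ID.

    Records with a missing ID or an unrecognized label in any dimension are
    dropped rather than stored, so downstream tables only ever show values
    from the known label sets.
    """
    records = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            records[cid] = {dim: row[dim] for dim in ALLOWED}
    return records
```

For example, a batch containing one valid record and one record with an out-of-codebook label would parse to a single entry, matching the way the "Coding Result" table shows one row of labels per comment ID.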