Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "2:54 Bull CRAP! As a story writer it is almost impossible to use dictation with…" (ytc_UgyUo0j-Y…)
- "So look, I hate tesla and Elon, and I think they are responsible for weaving the…" (ytc_Ugz8_nBj-…)
- "I asked the same questions, when it got to would you hurt people it said no. It'…" (ytr_UgwkZ2nzs…)
- "polygon. please apply for access for dall-e 2. we have now uncovered horrors we…" (ytc_UgxVNcXiJ…)
- "FAQ IT. Life is boring now. I think it will be FUN to fight AI…" (ytc_UgzBTb3sH…)
- "Id like to see AI run any of the mills in the US. I dont think a computer will b…" (ytc_UgzJoJ-wC…)
- "Let’s not fully point the finger at AI for the mass layoffs. Amazon and UPS goin…" (ytc_UgxBSWOB3…)
- "Intelligence without emotions has no means to go rogue. Ego, desire, jealousy, a…" (ytc_Ugzhj3ApE…)
Comment
It sounds like you're saying that it would be a tragic thing to force an AI to do what we want it to, instead of letting it do what it wants to, but that's a misunderstanding of how AIs work. An AI needs to be given a goal in order to start developing. Alignment is when we succeed at correctly translating what we actually want an AI to do into language that an AI can understand, and misalignment is when we fail to account for something and botch that translation.
There's no option to not give it a goal. There's no such thing as letting an AI naturally develop and choose its own goals. An AI could decide on its own instrumental goals in order to achieve some larger end goal, sure; but the end goals need to be provided by an external source.
Source: youtube · Video: AI Moral Status · Posted: 2023-08-22T03:5… · ♥ 16
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
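The Dimension/Value table above corresponds to one entry of the raw model output. A minimal sketch of how such a record could be rendered as the table (the `render_table` helper is an assumption for illustration, not part of the tool; the "Coded at" row appears to be a tool-side timestamp rather than model output, so it is omitted):

```python
# One coded record, copied from the raw LLM response in this section.
record = {
    "id": "ytr_UgzxjGdwwf06E-SFgft4AaABAg.9tfnmbkWhqU9thOSGT-4ea",
    "responsibility": "developer",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
}

def render_table(record: dict) -> str:
    """Format the four coding dimensions as a markdown table."""
    rows = ["| Dimension | Value |", "|---|---|"]
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        rows.append(f"| {dim.capitalize()} | {record[dim]} |")
    return "\n".join(rows)

print(render_table(record))
```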
Raw LLM Response
```json
[
  {"id":"ytr_UgzxjGdwwf06E-SFgft4AaABAg.9tfnmbkWhqU9tgL9bDO4gu","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzxjGdwwf06E-SFgft4AaABAg.9tfnmbkWhqU9thOSGT-4ea","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzzOqTI80kO-ARKGEB4AaABAg.9tfkpSLj9cB9th0Ihkoqje","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgzC5pNScnLT4U-tLWh4AaABAg.9tfcr5Q1K6n9tgfYCGafxX","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzO9SqepC0hZmDOBo14AaABAg.9tfc_nSuukv9tgPo5Wo8X9","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzO9SqepC0hZmDOBo14AaABAg.9tfc_nSuukv9tiIaJ2ERTJ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzvhdmlyaprcjeJgo94AaABAg.9tfZ5jmHEv69tfgrynqswp","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzvhdmlyaprcjeJgo94AaABAg.9tfZ5jmHEv69tgc7HL18QN","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgztixwOLuoP78-87Hx4AaABAg.9tfAza775rq9thm2JXp9IG","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyClEpBrJ9n4DnJu-t4AaABAg.9tf0Typ8wwq9tfG546GubD","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
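Looking up a coded comment by its ID amounts to parsing this array and indexing the records by their `id` field. A minimal sketch, assuming the record schema shown in the raw response (the `index_by_id` helper is hypothetical, not part of the tool; the two sample records are copied verbatim from the output above):

```python
import json

# Two records copied from the raw LLM response in this section.
RAW = """[
  {"id":"ytr_UgzxjGdwwf06E-SFgft4AaABAg.9tfnmbkWhqU9tgL9bDO4gu","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzvhdmlyaprcjeJgo94AaABAg.9tfZ5jmHEv69tgc7HL18QN","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]"""

# Every record is expected to carry all four coding dimensions plus the ID.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse the model output and index records by comment ID,
    dropping any record that lacks a required field."""
    return {r["id"]: r for r in json.loads(raw) if REQUIRED <= r.keys()}

coded = index_by_id(RAW)
print(coded["ytr_UgzvhdmlyaprcjeJgo94AaABAg.9tfZ5jmHEv69tgc7HL18QN"]["emotion"])  # outrage
```

Validating required fields at parse time guards against a common failure mode of structured LLM output: a record that drops or renames a key would otherwise surface later as a confusing `KeyError` during lookup.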