Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
Random samples

- `ytc_UgxU8GKV5…`: Never mind, this dude is about as honest as the media in general. He thinks the …
- `ytc_Ugyv5Kodb…`: For what purpose would someone skip the safety? It's not like the safety would h…
- `ytc_UgzYK9J1S…`: You're a professional artist if you can re-draw an artwork scene in a different …
- `ytc_UgyjgmKIn…`: Professor Hinton identifies the threat better than almost anyone alive, and I re…
- `ytc_UgxAinng-…`: Just wait till they combine them with Energetically Autonomous Tactical Robot (E…
- `ytr_UgzY-outV…`: A "yes man" is a problem when the thing or idea it agrees and goes along with is…
- `ytc_UgwIWsHI6…`: Could be that the amount of power needed to run AI will finish us off by complet…
- `ytc_UgxfWpYZ0…`: I'm someone who the "it helps disabled people" AI simps would probably point to.…
Comment
In the training set, there are jokes. In the llm algorithm, there is a level of randomness while picking the next word. This makes it able to take jokes and give it a spin, making them new jokes. Same as any other AI statement. They take your words, run them through a bunch of maths. This leads them to a region. Yes, region. Where they fetch new words.
What region? The llm maths is made up of tensors. Tensors are matrices and matrices are vectors which are lists of numbers. A 3d position is a list of 3 numbers, x, y and z. Put any numbers there and you get a point in 3d space. These numbers are your words. The llm uses lists of numbers with millions of numbers in them. All though you cannot see such a position, it is equivalent mathematically. So your prompt fetches words from a Mathematical region. The jokes are also in such a region. It is very cool and amazing, but it is just maths.
Platform: youtube · Video: AI Moral Status · Posted: 2025-07-09T15:2… · ♥ 15
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgzyiNhUzcNJA7j93Pp4AaABAg.AKM_3MTVLcLAKMg4R_9CV5","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzyiNhUzcNJA7j93Pp4AaABAg.AKM_3MTVLcLAKMm2qSvx4O","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytr_UgySjha1pNoOdBmCIIV4AaABAg.AKMZjmut2E7AKMcrSCPixL","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgySjha1pNoOdBmCIIV4AaABAg.AKMZjmut2E7AKMhEpZwUFG","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugz7ciIV90QP-y79ned4AaABAg.AKMZS958uN3AKN0B1D8Y-I","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgzTf8OaVWjrhbvJpHR4AaABAg.AKLa4i7Vac8AKMeIuOfSGS","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyWwK8lNXJEt3Pl-1h4AaABAg.AKKrfxka-MkAQcyQrMP5ec","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgzDHTnDY08FLwUvM1x4AaABAg.AKKnpF8V3C4AKKyxAdIZlc","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzDHTnDY08FLwUvM1x4AaABAg.AKKnpF8V3C4AKMaeADVdhz","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzDHTnDY08FLwUvM1x4AaABAg.AKKnpF8V3C4AKMbfcYVNYZ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
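Looking up one comment's coding inside a raw batch response like the one above amounts to parsing the JSON array and matching on `id`. A minimal sketch, assuming only the record structure visible in this dump (the function name `lookup_coding` and the two-record sample are illustrative, not part of the pipeline itself):

```python
import json

# Two records reproduced verbatim from the raw batch response above.
RAW_RESPONSE = """
[
 {"id":"ytr_UgzyiNhUzcNJA7j93Pp4AaABAg.AKM_3MTVLcLAKMg4R_9CV5","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
 {"id":"ytr_UgySjha1pNoOdBmCIIV4AaABAg.AKMZjmut2E7AKMcrSCPixL","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID.

    Returns None when the ID is absent, since a batch may omit comments
    the model failed to code.
    """
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(
    RAW_RESPONSE,
    "ytr_UgySjha1pNoOdBmCIIV4AaABAg.AKMZjmut2E7AKMcrSCPixL",
)
print(coding["policy"])  # → regulate
```

Matching on the full ID (rather than a prefix) matters here because the truncated IDs shown in the sample list are display shortenings, not the stored keys.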