Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxaVKco7…: "i saw a giveaway for ai art on twitter and apparently some people are using them…"
- ytc_Ugyzh1sG7…: "I don't think the AI malfunctioned when it comes to becoming MechaHitler(even th…"
- ytc_Ugz52uVgg…: "I just got an AI generated ad for Chat-GPT midway through this. This is crazy…"
- ytc_Ugxg41rYX…: "3:27 is just proof that AI "art" can never be as good as actual art. The generat…"
- ytc_UgxbvX1TP…: "This will be the new model for future generations or parents will simple have th…"
- ytc_UgwxFm1zH…: "People are questioning this? It is so obviously AI not just the consistency of u…"
- ytc_UgzmUUeRJ…: "I am a Tesla owner and a Tesla investor and will be for the rest of my life. I c…"
- ytr_UgwXgbLLa…: "Yeah, you can Google the articles yourself. So, what even is the point of these …"
Comment
How they talk about them in this episode is a bit odd/buzzwordy. Basically, when you use ChatGPT and give it a sentence like "I am a person", the sentence is "tokenized" into an input for the model. A really bad tokenizer would make each character a token, so the message above would be 13 tokens (including spaces).
"Get ahold of these tokens", as they say in the video, is odd because it makes it sound like tokens are pre-generated. Each LLM has its own associated tokenizer, and where you run the conversion between "I am a person" and its tokenized form, the pass through the model, and the output detokenization is where cost could be lowered.
When you send a message to ChatGPT it runs its tokenizer on your input, and that processing runs on some hardware. So tokens are "limited" because the computation has to happen somewhere, and tokenization algorithms can be more or less efficient.
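The "really bad" character-level tokenizer the comment describes can be sketched in a few lines; the function names here are illustrative, not any real library's API:

```python
# Worst-case tokenizer from the comment: every character, including
# spaces, becomes its own token.

def char_tokenize(text: str) -> list[str]:
    """Character-level tokenization: one token per character."""
    return list(text)

def char_detokenize(tokens: list[str]) -> str:
    """Inverse step: join the tokens back into the original string."""
    return "".join(tokens)

sentence = "I am a person"
tokens = char_tokenize(sentence)
print(len(tokens))  # 13 tokens, spaces included, as the comment says
assert char_detokenize(tokens) == sentence  # round-trip is lossless
```

Real tokenizers (e.g. BPE-based ones) merge frequent character sequences into single tokens, which is why the same sentence usually costs far fewer tokens in practice.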
youtube · AI Governance · 2026-04-22T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugw74p-SZWXhLmxc8tx4AaABAg.AVrz6YaJ9_LAVsZ_cneu1O","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw74p-SZWXhLmxc8tx4AaABAg.AVrz6YaJ9_LAVtBm1oSq5q","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwOm45dO_GAdfVgNQx4AaABAg.AVrZJAemr9VAVst0rq6CI-","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugyho4YAo19NPbyQ-wt4AaABAg.AVrR7e1QWsOAVrfAKKueRY","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugyho4YAo19NPbyQ-wt4AaABAg.AVrR7e1QWsOAVrkvh1QBdY","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyaoldSKD4I6ePNLqR4AaABAg.AVrNNBxi0BDAVrjhQyjwXI","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyfPdsytKUVZ6h6EXx4AaABAg.AVrL5aBaoIWAVs-hqoNNbI","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgyfPdsytKUVZ6h6EXx4AaABAg.AVrL5aBaoIWAVs2oL7SWna","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugx3s9ARYK8prO-gkWN4AaABAg.AVrKlv9UZ_eAVupGvsTl-H","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugx3s9ARYK8prO-gkWN4AaABAg.AVrKlv9UZ_eAVvCEl8zZ9B","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
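A response in this shape can be loaded with the standard `json` module and indexed by comment ID, which is how a "look up by comment ID" view could be backed. This is a minimal sketch; the two records use hypothetical IDs, while the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response above:

```python
import json
from collections import Counter

# Hypothetical raw LLM response in the same shape as the one above.
raw = """
[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "indifference"},
  {"id": "ytr_example2", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability",
   "emotion": "outrage"}
]
"""

codes = json.loads(raw)                      # list of per-comment codings
by_id = {row["id"]: row for row in codes}    # enables lookup by comment ID
emotion_counts = Counter(row["emotion"] for row in codes)

print(by_id["ytr_example2"]["policy"])       # liability
print(emotion_counts["indifference"])        # 1
```

In practice you would also want to validate that every record carries all five fields and that each value falls within the codebook's allowed categories before tallying.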