Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by its comment ID.
Random samples:
- "AI's indexing all the youtube videos, so it knows you're planning behind its bac…" (ytc_Ugw-3p5z4…)
- "Art is a form of expression. Ai don't have emotions, meaning the "art" it makes …" (ytc_UgzDzm-n7…)
- "the singularity is 20ish years away. yall need to stop worrying about AI. just …" (ytr_Ugxn76Pwh…)
- "Look, when I wanted to be an actor, but AI is going to replace that, I wanted to…" (ytc_UgzSXEhCA…)
- "The difference is when something bad goes wrong in an airplane, your chances of …" (ytr_UgxC9wr1g…)
- "I would rethink your comment on what AI intended purpose of support was / is mea…" (ytc_Ugw8-j7kx…)
- "Imagine how much money we'd save if we replaced every Federal government employe…" (ytc_UgwubbeB8…)
- "@vince7416 I attest it myself, FedEx "customer service" was non existent, AI dri…" (ytr_Ugw4i_DLz…)
Comment
Current LLMs just predict the next most probable token. They seem to be intelligent, but I don't think you can compare them with human intelligence. Currently, they have no trigger or intent to do anything. They don't run on their own and try to achieve anything. Why should that change? They might become better in terms of finding better solutions to our problems, but I can't see a Superintelligence somewhere near. Besides that - Mr. Yampolskiy seems so smart and he is so deep into that topic - I will start to reconsider my opinion. My prediction is: AI will hit a wall soon and we achieve only minor improvements to their problemsolving abilities with newer models. We will need less and less computing power to achieve the same level, but I believe, Superintelligence is still 100 years away...
youtube · AI Governance · 2025-09-06T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzxXh7xyQngFi2qpN94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzJv25P4290mILANm54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxAxoY7FO7pKHnQ43t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfVuTFJ55VMAQr0AR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyKTuoInCLWhRDO0FZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwTftWHjhiKR8HlosR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwIyjGPOUZ18zB9Ual4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw8n8G1U_7vYS0tHF14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy8U6hUaxqaxjzk_ZB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyzzuN_Q0Lz5iglQMp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
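A response like the one above can be checked programmatically before its codes are stored. The sketch below is a minimal, hypothetical validator: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response shown here, but the allowed value sets are only those observed in this sample — the tool's full codebook may define more categories.

```python
import json

# Value sets observed in the raw response above (assumption: the real
# codebook may contain additional categories for each dimension).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "mixed"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Usage with a hypothetical single-record response:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(parse_raw_response(raw)[0]["emotion"])  # fear
```

Rejecting malformed records at this stage keeps the coded-comment table (like the "Coding Result" above) free of values outside the coding scheme.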