Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytc_UgyHtBGUJ…: "Lol I will create my own community with the help of AI. By the way this doesn't …"
- ytc_UgwxzkYTc…: "NOTHING HAS CHANGED, if you just use Android Auto and not actually try to learn…"
- ytc_UgwHon8eZ…: "It's not just AI it's offshoring. The same number of layoffs is the same number …"
- ytc_Ugyh-25U7…: "AI is the new black, and I'm fatigued already. I'm just waiting for the self hat…"
- ytc_UgyTQNhUJ…: "Here’s what’ll happen to AI art: AI draws the “Art.” The AI learns, and takes m…"
- ytr_UgxcPa0H8…: "This argument holds no merit at all. Taking inspiration is not the same as a mac…"
- ytc_UgzW_dcD-…: "AI won't live our lives for us, so it won't replace us. There is something about…"
- ytc_Ugx2b5RIu…: "CHATGPT lies. I asked it to check something, and it gave me an answer. Then I ch…"
Comment
I don't think super intelligence will ever happen. Based on current architecture AI cannot be creative. It cannot develop its own biases, will or survival instincts.
It just can't. It's a literal impossibility.
And without those it cannot be superintelligent.
It is literally just saying 'this pattern of words which includes 'CEO'' elicits this pattern of words in response with x degree of certainty'.
It's incredibly sophisticated but that is all it is doing.
It is absolutely limited by the data you put into it.
What if surgeons or lawyers simply decline to put their most cutting edge data into AI platforms?
The AI platforms will go out of date.
The Use case, both positive and negative, is unleashing human creativity by dealing with the mundane.
youtube · AI Governance · 2025-07-29T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx7LUxS61ndNp1KsGd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwrnqG7bf-H0ChdG2N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyNzlfaf3R3PLIiCHB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgybPC58HmW4v4j-O6F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzOM2VQu2FwyR7J_fp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzivkIANy2OsWwvMMJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz3TQ28uvug40Fc0oJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwqpjCEAnPRGajKrzt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwJUckVNZbkdZIR7AN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwL0NX3RvAs8BscIKJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
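The raw response above is a JSON array with one object per coded comment. A minimal sketch of validating and indexing such a response by comment ID; the allowed vocabulary below is inferred only from the values visible in this dump (the real codebook may define additional categories), and `parse_llm_response` is a hypothetical helper, not part of the tool shown:

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# Assumption: the real codebook may permit more categories than these.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "indifference", "mixed", "outrage", "unclear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting any value outside the allowed vocabulary."""
    coded = {}
    for entry in json.loads(raw):
        codes = {dim: entry[dim] for dim in ALLOWED}
        bad = {d: v for d, v in codes.items() if v not in ALLOWED[d]}
        if bad:
            raise ValueError(f"{entry['id']}: unexpected codes {bad}")
        coded[entry["id"]] = codes
    return coded

# One entry copied verbatim from the response above.
raw = ('[{"id":"ytc_Ugx7LUxS61ndNp1KsGd4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"approval"}]')
codes = parse_llm_response(raw)["ytc_Ugx7LUxS61ndNp1KsGd4AaABAg"]
print(codes["emotion"])  # approval
```

Validating against a fixed vocabulary catches the common failure mode where the model invents an off-codebook label, so bad codes fail loudly instead of silently entering the coded dataset.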