Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
LLMs are "very good" at generating text, but cannot know what is depicted in an image. Diffusion models are "very good" at creating images or videos, but they have no concept of the meaning of words. Other, completely different programs have been able to puzzle together protein shapes or play chess, but they cannot speak/write or create images. An LLM can be linked with a diffusion model and image recognition software to "create and recognise images", but all the parts are detached from each other. Not to mention that all of those parts are pretty bad at what they do and terrible at working together. I see no reason at all why an AGI should pop up anytime soon. Nothing that exists now even gives us substantial reason to think AGI is at all possible with current tech.
youtube · AI Moral Status · 2025-10-31T17:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxxymUfKqjpi9sTcu14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzXLtbFlreioGXT1Ot4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxu-ZYDcNKu_8SCRip4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwotKJwFt0L1l2tTyl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgweIOcZTSJaU9FIX0p4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy51Nto5PFoFu4eWmF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz-Nm9DTHQyJEKUWPN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwfbTS8Pu304Bz1vlB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy4SAWrRpzU541fE-Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzcGuGOHX5Sx4Zybp14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
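The raw response is a JSON array of per-comment codings, one object per comment ID, with four coded dimensions (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a response could be parsed and looked up by comment ID; the category sets below are inferred only from the values visible in this sample (the full codebook may differ), and the function names are illustrative, not part of any real tool:

```python
import json

# Category values observed in this sample response; the actual
# codebook may contain additional values (assumption).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "distributed", "user"},
    "reasoning": {"unclear", "consequentialist", "mixed", "virtue", "deontological"},
    "policy": {"none"},
    "emotion": {"indifference", "resignation", "fear", "approval", "outrage"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) into an id -> coding map."""
    records = json.loads(raw)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

def flag_unknown_values(coding: dict) -> list:
    """Return (dimension, value) pairs outside the observed vocabulary,
    useful for spotting responses where the model invented a category."""
    return [(dim, val) for dim, val in coding.items()
            if dim in OBSERVED and val not in OBSERVED[dim]]

raw = ('[{"id":"ytc_UgxxymUfKqjpi9sTcu14AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"resignation"}]')
by_id = index_codings(raw)
coding = by_id["ytc_UgxxymUfKqjpi9sTcu14AaABAg"]
print(coding["emotion"])          # resignation
print(flag_unknown_values(coding))  # []
```

Indexing by ID first makes the "inspect the exact model output for any coded comment" lookup a single dictionary access rather than a scan of the array.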