Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Notice at 12:40 the robot is about to unleash his blueprint or sum and is interr…" (ytc_UgyZuEAHH…)
- "But automation will significantly reduce the cost of goods and services to near …" (ytc_UgyBy0avH…)
- "Everybody is talking about AI like "it took our jobs". Remember that these "jobs…" (ytc_Ugz_lQV9d…)
- "I'm sorry to hear that you found the content disturbing. If you have any specifi…" (ytr_UgwhRnAHS…)
- "Ai art isn't even an art, it's a duplicate of existing art, it's a filter for an…" (ytc_UgyrZxz1s…)
- "Please enlighten me why I’m wrong. But this seems like a stupid reason to sue? O…" (rdc_jchudsf)
- "What you think how they are able to run computers for free versions? They get th…" (ytc_UgyQv2HSh…)
- "How about Europe stopping import from China if it's so disruptive? How about con…" (rdc_gx72dpe)
Comment
As someone ignorant on the topic, one of the things that recently bothered me was hearing as these models are getting more complicated the rate at which they "hallucinate" their answers is increasing. Constructing seems a more apt discription than halucinating to me. They are prediction models, sure the more complex they become the more indistinguishable from consciousness they will seem, however the deepest questions of their understanding will remain. Will these models ever act on their own motivations? Or will they forever remain mirrors that can only ever inertly reflect back the world in shallow glimpses. An incomplete picture of our reality based on what is shone upon them. I don't think they will, and the real danger that "AI" poses comes from us believing that they do or will. That we start building our societies according to the hallucinations of these fancy prediction algorithms that may never be capable of understanding - or perhaps even truly comtemplating - what it is to be consciouss and all that entails.
Source: youtube | AI Moral Status | 2025-05-21T07:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyYnQkitDutHDCEtZp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy7r0QHGwshyOSFy_d4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy9WiRCfnDocJrVaWl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwTJwsMQC1oSOAwiG14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugx3SExi9pTwaFyLPyp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwHI-h_KkqCyceEXiZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw959HpxKJDumTwHOV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxEW_JMXHXGpfzN69p4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy964l7N5wYjfuXNhF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy2lWCPZGPgdnrHQbR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"}
]
```
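The raw response is a JSON array with one object per comment, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed by comment ID, assuming the schema shown above (the variable names are illustrative, not part of the tool):

```python
import json

# Raw model output: a JSON array of per-comment codes.
# Two entries are reproduced here from the response above.
raw = """[
  {"id": "ytc_UgyYnQkitDutHDCEtZp4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy2lWCPZGPgdnrHQbR4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"}
]"""

# Index the codes by comment ID so a single comment's coding
# (as shown in the "Coding Result" table) can be looked up directly.
codes = {row["id"]: row for row in json.loads(raw)}

print(codes["ytc_UgyYnQkitDutHDCEtZp4AaABAg"]["emotion"])  # indifference
```

The dictionary comprehension mirrors the "Look up by comment ID" view: one parse of the raw response, then constant-time lookup per comment.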