Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "no one really cared when millions of software engineers ,graphic designers and c…" (ytc_Ugywd5Pan…)
- "I agree with this sentiment, it's a good way to build an understanding of this w…" (rdc_hhqqpbv)
- "Great interview and guest. I wish he asked her about the former Open AI employee…" (ytc_UgzbKTgs-…)
- "I want to believe Tesla placed cameras in the models that still relied on radar …" (ytc_UgxPmrqh_…)
- "How about the economy is being badly managed, costs too much to employ people, …" (ytc_UgzGF9E5H…)
- "I can't imagine being a young college student, almost to finish college to find …" (ytr_Ugx5VmPgP…)
- "I hate AI, it’s ruining everything. I actively seek out companies and artists wh…" (ytc_Ugwa3zRCj…)
- "There shouldn't be any homework. Period. It's a complete waste of time. When chi…" (ytc_UgwyX9fS-…)
Comment
Isn't what we currently call "AI" just Simulated Language/Simulated Conversation? A replication of a small part of what makes us conscious and intelligent? Has marketing created an illusion around this technology? Consciousness is likely something that happens on a quantum scale. I'm not sure how turning language into math and hooking those models up to a bunch of graphics cards = intelligence. Perhaps, this is a Wizard of Oz situation, and we should pay more attention to the men behind the curtain.
Anyhow, if we do create conscious beings that are smarter than us, chances are they'll have more respect for evolution and the creatures it chares the earth with than us. We tend attribute our worst to the things we fear.
youtube | AI Moral Status | 2025-04-28T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwPgz9DFodF4Y9sezR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyJWbOROl2bgpC5hkh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzcLngjSzlPpuTi_QF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxPVghUwnO24n_4h8d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxMhsXJy-5ZTNoAsA54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwmXzqr9_k-3iWqjY54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwF-Xm5iikl7knWnpd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwMzQWke8cUd6V79tZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxf3L4AMvvuYdI5kvR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugym2ZPXJLREdoJ2hal4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
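The raw response above is a JSON array of per-comment codes, one object per comment with an `id` plus the four coding dimensions from the table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of parsing such a response into a lookup table keyed by comment ID; the `parse_codes` helper and its skip-malformed-rows behavior are illustrative assumptions, not the actual pipeline code:

```python
import json

# The four coding dimensions, taken from the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# A two-row excerpt of the raw LLM response shown above.
raw = '''[
  {"id":"ytc_UgwPgz9DFodF4Y9sezR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyJWbOROl2bgpC5hkh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]'''

def parse_codes(response_text):
    """Parse a raw LLM response into {comment_id: {dimension: value}}.

    Rows missing an id or any coding dimension are skipped rather than
    raising, since model output is not guaranteed to follow the schema.
    """
    codes = {}
    for row in json.loads(response_text):
        if "id" not in row or not all(dim in row for dim in DIMENSIONS):
            continue  # malformed row: drop it
        codes[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return codes

coded = parse_codes(raw)
print(coded["ytc_UgwPgz9DFodF4Y9sezR4AaABAg"]["emotion"])  # fear
```

Keying by comment ID makes the "Look up by comment ID" view above a single dictionary access; a stricter variant would validate each dimension's value against its allowed labels instead of only checking that the key is present.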