Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:
- "This punk really typed shit into a prompt so that an AI machine could copy more …" (ytc_UgzB226_Q…)
- "Robots are just a conduit from humans. . Robots are not self efficient or intell…" (ytc_UgyQXxKCd…)
- "Counterpoint using your words at 7:57 "something that moves that fast". In other…" (ytc_UgxOZpo5q…)
- "Like us (according to Judeochristian beliefs,) AI is made in the creator's image…" (ytc_Ugyxt6ki3…)
- "That last one reminds me of that Waiter robot in Korea that hurled itself down a…" (ytc_UgwAQutyv…)
- "But AI has replaced the entry level devs. I think people need to start more busi…" (ytc_UgzXenYsS…)
- "AI makes no decision on its own but it does respond to inputs using complex algo…" (ytc_Ugw57UzGd…)
- "If humans go away the electricity stops and computers shut down and AI also is d…" (ytc_UgyXqUoFN…)
Comment
> Quick thought. The AI hype is overblown, true AGI is far far away, and we will reach it, but a 'singularity' will be, just super neato... it won't change much. Here's why. 1) We already have a hyperintelligent 'being' capable of doing more than any single human ever could. It's called groups of humans + tech.... and there are 8 billion of us. 2. 'intelligence' doesn't matter in the way you think it does. Every advancement that isn't some closed game like chess comes not from a 'eureka idea', but from being able to test 1000s of eureka ideas, and the limiting factor is information from the environment, not internal brilliant thoughts. The entire history of human invention has shown us this.
youtube · AI Moral Status · 2024-09-24T02:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
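
The four dimensions above come from the coding schema applied to every comment. As a minimal sketch of how a downstream check might enforce that schema, the snippet below validates one coded record against the value sets observed on this page; the project's full codebook may define additional values, so these sets are assumptions, not the authoritative schema.

```python
# Allowed values per dimension, inferred only from records visible on
# this page; the real codebook may permit more labels.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "user"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"resignation", "fear", "indifference", "mixed", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The coded record from the table above passes the check.
print(validate({
    "responsibility": "none",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "resignation",
}))  # -> []
```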
Raw LLM Response
[
{"id":"ytc_Ugx-17zBD9PPTnmE-XJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyn8z-udDlArud08JZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyb-rFCj8wxTuv9nd14AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyw55so76JUKv24Rk94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw-74qRlLBap09DQud4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzddMQA4eXgYIYYUi54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwhAPXEE7kibgEPKKB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwsCkbv1op737DQngJ4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzO0lEzEzeNeCnSyhV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxs9ECfEiN42IkoUV14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
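
Because the raw response is a JSON array keyed by comment ID, the lookup described at the top of this page can be reproduced in a few lines of Python. This is a sketch under assumptions: the file name `raw_llm_response.json` is hypothetical, so substitute wherever the pipeline actually stores its raw responses.

```python
import json

# Hypothetical file holding one raw LLM response (a JSON array like the
# one shown above).
with open("raw_llm_response.json", encoding="utf-8") as f:
    rows = json.load(f)

# Index the coded rows by comment ID for O(1) lookup.
by_id = {row["id"]: row for row in rows}

record = by_id["ytc_UgzO0lEzEzeNeCnSyhV4AaABAg"]
print(record["emotion"])  # "resignation", as in the Coding Result table
```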