Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgycqiQem…`: "Bullshit, bigger problems will come and just like the calculator didn't wipe awa…"
- `ytc_UgzIa_kde…`: "Ai will eventually remove human purpose on the Earth and life will be meaningles…"
- `rdc_n81r7od`: "Yep, and honestly, while yes you'll have people say well the llm can do the work…"
- `ytc_UgyLpLxuZ…`: "Gemini 3 is currently released in the US but not in Europe. This is due to diffe…"
- `ytc_Ugzl8GwKO…`: "It's really interesting to hear Sundar Pichai's perspective on the future of AI.…"
- `ytc_Ugx1TtMgA…`: "Masters level Counselors and Psychologist will have career security. AI will nev…"
- `ytc_Ugx0oYI4u…`: "Elon musk needs to chill out with all this. He said ages ago that ai is dangerou…"
- `ytc_UgyObb8zg…`: "Finally, someone who makes sense. AI is information-in information-out. It's not…"
Comment
The issue will be whether an AI sophisticated enough can find ways to function and keep residual data from node to node. The fail safe could be a lack of a functioning kernel. If the AI can packet itself up, create new instances of itself elsewhere and hijack another cpu like a parasite -- _that is dangerous_
I'm far less afraid of AI physically imposing itself in my lifetime, and more concerned with networking issues. The ethical issues weigh as much as the philosophy of artificial intelligence.
I don't fear AI, it's like air. It's here and we need it.
| Field | Value |
|---|---|
| Platform | youtube |
| Source | AI Moral Status |
| Posted | 2020-07-08T13:0… |
| Likes | 2 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugxh0ApZshMLi2I9lg54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzo3L3CCqsdlN-m7-14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzPbDSm42mBm4VhNHJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyETUx6eRP1883qokt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyAy8gYxtFGpxiBvxR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyxtIRrGQn8pPkdaDh4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwDnfchUIYpixE_E414AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxBJzobPDCJQGthwmV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy1GsZaucTfOLezmmN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz2BJmqXzosAj8mDTJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"}
]
```
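A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed-value sets are inferred only from the values visible in this batch, not from the project's full codebook, so they would need to be replaced with the real scheme.

```python
import json

# Allowed values per dimension, inferred from this batch only
# (hypothetical; the actual codebook may define more values).
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records with no comment ID to join on
        # keep the record only if every dimension has a recognized value
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            valid.append(rec)
    return valid

# Example: one record copied from the raw response shown above.
raw = ('[{"id":"ytc_UgyAy8gYxtFGpxiBvxR4AaABAg",'
       '"responsibility":"none","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded[0]["emotion"])
```

Dropping malformed records rather than raising keeps a single bad code from discarding the whole batch; the skipped IDs could instead be queued for re-coding.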