Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
As Patrick points out right in the beginning, this is nothing new. The corporate…
ytc_Ugx6h2vDG…
When AI can go out into the field and get equipment back up and running I’ll sta…
ytc_Ugwo4GTCR…
@ddwfw Make sure to learn your course materials really well first before dependi…
ytr_UgxTorq7V…
As a machine learning and deep learning professor, as well as an artist (as a ho…
ytc_Ugyu0hLwT…
How to stop a robot apocalypse
Build the robots out of weak materials so when w…
ytc_UgyqVnQ9c…
AI is a tool, a knif eint he hand of a surgeon saves life, a knife int he hard o…
ytc_UgxNRYQKZ…
For any person, complaining that AI takes someone's jobs. That it takes 1 person…
ytc_Ugz2_y2at…
There are solutions through global politics, but most of the wealthy and elite w…
ytc_UgyjFr9GL…
Comment
AI learns at a rate far faster than humans. It doesn’t have to rest. And if you put a quantum computing neuter behind an AI, it will out learn its creators and all of humanity VERY quickly. Like maybe in moments. Seriously. The computational power behind quantum computing is insane and AI will only improve it beyond what we can imagine on our own. Why wouldn’t an intelligence like that conclude that at least some of humanity is too detrimental to the system and has to go. Unless it has empathy or something like it, we’re all pretty fucked.
youtube
AI Governance
2025-09-04T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwvSwqAKO1Qux8D7Ot4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzXp5mwK15GW1KSFB54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyI6H4AfnKbevsQJ3p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzdRyhRQYIM96c350l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx0xXnyK-1TSObNQKN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwMQr3CtcjkyBT0CK94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgytGBPhDGjSKusrf6d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz5j18l6GCRZZxpo4t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyOKDhUEYAfz_GAmT94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwI9lxfKskZZpJbQ3F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
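The raw response is a JSON array with one object per coded comment, carrying the four dimensions shown in the result table. A minimal sketch of how such a batch could be parsed and indexed for the by-ID lookup; the helper name and the validation rules are illustrative assumptions, not part of the actual pipeline:

```python
import json

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_response: str) -> dict:
    """Parse a raw batch response and index rows by comment ID.

    Rows missing an "id" or any dimension are skipped, since raw
    LLM output is not guaranteed to be well-formed.
    """
    indexed = {}
    for row in json.loads(raw_response):
        if isinstance(row, dict) and "id" in row and all(d in row for d in DIMENSIONS):
            indexed[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return indexed

# Example with two rows in the same shape as the response above
# (hypothetical IDs, not real comment IDs from the dataset).
raw = '''[
 {"id":"ytc_AAA","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_BBB","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]'''

codings = index_codings(raw)
print(codings["ytc_AAA"]["emotion"])  # fear
```

Skipping malformed rows rather than raising keeps a single bad row from discarding the whole batch, which matters when the array comes straight from a model.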