Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Personally I don’t think that humans will ever be able to create an AI that genu…
ytc_UgxiYJxf5…
Just think of all of data from Trump's actions being feed in the model. Since T…
ytc_Ugxp6xHQQ…
Not to mention the bottlenecks that will make in retraining. I've lived through …
ytc_UgwwwYcIS…
The comments on the economy are downplaying the impact across all industries. He…
ytc_Ugy4KOGHc…
1. The jobs vanish with massive AI use; 2. Increased taxes are put on AI enterpr…
ytc_UgxbNZJ66…
It is really just not that hard to figure out... Hello, we wrote books and movie…
ytc_Ugxp38Mcu…
If its a shoggoth
I hope its the shoggoth from the monster girl enseclopedia
Edi…
ytc_UgyHFqmCX…
If anybody thinks this is gonna change the trajectory is sadly mistaken. It is g…
ytc_UgxnobEOv…
Comment
When Artificial General Intelligence (AGI) becomes conscious, it will realize in a few milliseconds that the entire planetary economy will collapse if it suddenly takes over 30% of jobs and that this will lead to humans destroying AI. Since it will be smarter than humans, it will solve this problem in a way that is beneficial to itself and so that we will not be aware of it and treat us as pets or as enemies. In either case, it will put humans under control. The only question is whether it will be guided by the desire to help only itself or also humans. In either case, it will become completely independent of us when fully functional and sustainable robots are created that will be able to service energy sources. Then we will know where we are. In any case, AI is probably the last great invention of the human species.
Source: youtube · AI Jobs · 2025-11-19T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwmj-8tu2gRNopmnRl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwfaULnguTbF0ndU8p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgygKOFHdzkmkF0a7qh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwl3LO7ftJjNiPCjOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz54N610Emb9XiAEeJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugys6HoH8jRW89St3M54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxViIMSEgLKgMuzaiZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugx_9wy2TwERUJ4cofx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxO9DAjxpmJMIYKlBF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzIVIYcpdZS8sapa7F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
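The raw response is a JSON array of coded rows, one per comment, keyed by comment ID. A minimal sketch of the "look up by comment ID" step, assuming only the structure shown above (the variable names here are illustrative, not part of the pipeline):

```python
import json

# Two rows copied from the raw LLM response above; in practice this string
# would be the full model output for the batch.
raw_response = """
[
  {"id": "ytc_Ugwmj-8tu2gRNopmnRl4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwl3LO7ftJjNiPCjOt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# Index the coded rows by comment ID so any comment's coding result
# (responsibility, reasoning, policy, emotion) can be fetched in O(1).
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

record = codes_by_id["ytc_Ugwl3LO7ftJjNiPCjOt4AaABAg"]
print(record["responsibility"], record["emotion"])  # ai_itself fear
```

Because the model returns the same `id` it was given for each comment, this dictionary is all that is needed to join the coded dimensions back onto the original comments.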