Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its comment ID, or inspect one of the random samples below.
- "Unbelievable honestly, what's next? Using my brain to create images, music, vide…" (ytc_UgwUQGyZ2…)
- "3:55 An interesting analogy I can make here to a different video game would be t…" (ytc_UgxUvNQnV…)
- "guys my buddy with no experience just launched an app all written by AI, he has …" (ytc_UgzcmccVo…)
- "My biggest issue is AI artists is all they care about is \"efficiency\". They just…" (ytc_UgzCrsZQI…)
- "Honestly im an artist myself and especially when it comes to architecture ai is…" (ytc_UgyRwyTts…)
- "Maybe Jason can paint a landscape that will blow your socks off with a canvas an…" (ytc_UgxYsGtzc…)
- "The “Black box” idea isn’t real, it’s a byproduct of someone who runs data table…" (ytc_UgzLw_oUU…)
- "Born too late to be a viking / Born too early to explore space / Born just in time t…" (ytc_Ugydn61qq…)
Comment
No human muscles needed, no inteligence needed... AI robots will repair themselves, so no humans needed at all. What will be humans good for? AI will not have to exterminate us, we will replace ourselves with AI, become homeless and die out eventually, or live like rats. We are basically working hard at exterminating ourselves. What is this good for? Thinking this will be used for good of all humanity is naive, this will serve only for small group of people at the top.
Source: youtube · Category: Cross-Cultural · Posted: 2025-10-03T06:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx5AKpPPNUr3WImrCR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyBzmJtNM1D-rMKP314AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx95yPWzTZBjNf39yl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxLO3LNdTy85dsU2Jt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwzCmskRU3w-6GNVv14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugykqejru57pOEbgHLF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwSq3EI71MV2TKSByl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwbff8OwTfd5MleC_N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw-XNHRaAM04VJDFmx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
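The lookup by comment ID amounts to parsing the raw batched response and indexing the resulting array by its `id` field. A minimal sketch in Python, assuming the JSON structure shown above (the two-row batch here is an illustrative excerpt, not the full response):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment ID.
# Structure mirrors the batch shown above; this two-item sample is illustrative.
raw_response = """
[
  {"id": "ytc_Ugx5AKpPPNUr3WImrCR4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxLO3LNdTy85dsU2Jt4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# Index the batch by comment ID so any coded comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgxLO3LNdTy85dsU2Jt4AaABAg"]
print(code["policy"], code["emotion"])  # -> ban outrage
```

Indexing once up front makes repeated lookups O(1), which matters when cross-referencing many coded comments against their raw model output.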