Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Its how the oil companies operate. They all have projects shared with others to …
rdc_oi356es
If you would make good movies like in the 80s this would´t be a problem. if Ai c…
ytc_Ugz81MVet…
The down side of human design involving AI, is that the human GENOME, is Contami…
ytc_UgyZ1JWpT…
Did you see the first robot give the crazy one the go signal?!?? Whoa bruh!…
ytc_UgyGBQvH4…
Well, something did change - GPT-5 is a MoE of tiny expert models, way too small…
ytc_UgwkeTvW3…
How the hell are these people comparing AI and digital drawing?????? For AI you …
ytc_UgyL6ds7Q…
Even if social media will try to block people from posting online about all the …
ytc_UgxYHYpio…
Let’s be honest…. The first thing most of you were thinking was something sexual…
ytc_UgxzukZV-…
Comment
Maybe the Matrix too. If the machines took over, we humans would have no use to them, maybe only as a living battery. Though I'm not sure why they didn't just figure out fusion (or nuclear) power, or wave power, or something, I am not sure why they needed human battery farms in that movie. I think the original script made more sense (before studio executives changed it to make it more simpler for audiences). The original concept by the Wachowskis was that the machines used human brains for their collective processing power (as a massive neural network or supercomputer) to run the Matrix simulation itself. The human brain, in this context, was an incredibly efficient and powerful processor that the machines could not replicate. And Elon Musk might be heading down that route with his neuralink. Beware.
youtube
Cross-Cultural
2025-11-11T22:4…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgwWf_vMgcwgjq3HzlN4AaABAg.AP4aZVJ79p6AP5UGPGDzq2","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytr_Ugzbg3ZDAE5D5YSJCp54AaABAg.AP4Tbi_vrPSAP7h96scrpo","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugzbg3ZDAE5D5YSJCp54AaABAg.AP4Tbi_vrPSAP8638-_FS_","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugzbg3ZDAE5D5YSJCp54AaABAg.AP4Tbi_vrPSAPAE6p_8AlQ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugwz57HVGDFHjxv3pKZ4AaABAg.AP3qVPtUBLvAP3rLdtlR-4","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgxXIN49kQ2b92FzJoV4AaABAg.AP0sQnO0SZwAP70WDW3lVH","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgwT19YFnHtu_As5pJd4AaABAg.AOvWxPxwmLvAOw-S8J5m-G","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyfLLzcefHmQLSo7Fd4AaABAg.AOvGsKS7f6pAOvSi2Owg_z","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugw_PzwZR0Wrfepf8Kx4AaABAg.AOsgTnjrQc1AP4j5gNIWQB","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_Ugw_PzwZR0Wrfepf8Kx4AaABAg.AOsgTnjrQc1APPEp7R3HG3","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
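The raw response is a JSON array of per-comment codings, one object per comment ID. Below is a minimal sketch of the "look up by comment ID" step: parse the array and index it by `id`. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response above; the comment IDs in the sketch are shortened placeholders, not real IDs.

```python
import json

# A raw LLM response shaped like the one above: a JSON array of codings.
# The IDs here are illustrative placeholders.
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_example2", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
"""

# Parse the array and index the codings by comment ID for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID."""
    return codings[comment_id]

print(lookup("ytr_example2")["emotion"])  # fear
```

Indexing once into a dict keeps each subsequent ID lookup constant-time, which matters when a batch response covers many comments.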