Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Totally agree. This AI is the WRONG AI. Just more of the same as we have.…
ytc_UgyKS-NcG…
–once a legend sayed the world will be destroyed in 2026.....
And that's I think…
ytc_UgysC_ibt…
AI technology is being used to track and attack innocent Americans- why aren’t y…
ytc_Ugzd8qawF…
Forgive me if someone has already suggested this, but isn't it entirely possible…
ytc_UgzMbmdeO…
I refuse to use AI chatbots. I’d rather be alone than infatuated with a program.…
ytc_UgwJP-_W1…
With Ai mimicking deep human interaction, this just screams of automated spycraf…
ytc_UgygoCEFv…
Who's gonna make sure that a Chinese or North Korean AI aligns with human rights…
ytc_UgxqVxXqX…
America Middle class has been impoverished for years now, living paycheck to pay…
ytr_Ugylzk5U-…
Comment
Another long ad for "AI." No, LLM's aren't nowhere near general or even intelligent. Calling them AI is misleading. All these companies trying to replace workers with LLM's are already regretting it and they will regret it even more once the true costs of LLM's come through. Remember, OpenAI is still not profitable and even if everyone would switch to their premium $200 subscription would still not be profitable. It would have to be 10x that. I don't think that anyone would pay that much for an LLM that will hallucinate and make flawed decisions based on those hallucinations.
youtube
Viral AI Reaction
2025-11-24T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxIan7qHKHFIpjQLjB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyfph6yKb4wZtz_MXB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwkynk-nkFtvrb97qV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxzO4aN7mbAf16Swd54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy3WSZuXv0-9Fa_6od4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwTMyO33GyNZCHoCp54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzYY-19ZEF3ETq-ZOh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzBLQHdLklLjrOmYAN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwH106BfUhvtL3VISd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw1ylPONNqVknbJpO94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]
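A batch response like the one above can be parsed and sanity-checked before the rows are written back to the coding table. The sketch below is a minimal, hypothetical validator: the allowed label sets are inferred only from the values visible in this sample (they are not the full codebook), and the shortened `ytc_example*` IDs stand in for real comment IDs. Note that the original response terminated the array with `)` instead of `]`, which a strict JSON parser rejects, so a check like this catches such malformed batches early.

```python
import json

# Allowed labels per dimension, inferred from this sample only
# (assumption: the real codebook may define more values).
DIMENSIONS = {
    "responsibility": {"unclear", "ai_itself", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"unclear", "fear", "mixed", "approval", "outrage", "resignation"},
}


def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject rows with unknown labels."""
    rows = json.loads(raw)  # raises ValueError on malformed JSON, e.g. a stray ')'
    for row in rows:
        for dim, allowed in DIMENSIONS.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows


# Example batch with shortened, hypothetical IDs.
raw = """[
 {"id": "ytc_example1", "responsibility": "unclear", "reasoning": "unclear",
  "policy": "unclear", "emotion": "fear"},
 {"id": "ytc_example2", "responsibility": "company", "reasoning": "deontological",
  "policy": "regulate", "emotion": "outrage"}
]"""

coded = validate_batch(raw)
print(len(coded))  # 2
```

Rejecting the whole batch on a single bad label is a deliberate choice here: it forces a re-run of the coder rather than silently storing rows the downstream analysis cannot aggregate.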