Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@jotobrosmusic3928 so everyone has ideas right that's the basic human thing ever…" (ytr_UgwhrPi1C…)
- "All this stuff where we train a computer to do one job that a human can do as we…" (ytc_UgzfMV_S_…)
- "Truly we need to organize around this and push back because Trump is allowing th…" (ytr_Ugy0MjyED…)
- "The only way to help is to reduce population. And it’s the only thing we’re not…" (rdc_emo81g4)
- "I think the creators of deepfake technology need to be held accountable. There i…" (ytc_Ugx3gvsfR…)
- "Correction: There are no arrests that they admit were attributed to AI hits. In …" (ytc_Ugxjtm7ub…)
- "Thanks for watching! I was really trying to keep the under the hood explanatio…" (ytr_Ugwyzc_8C…)
- "Just like SkyNet, ChatGPT and Bard shall become self conscious and self-aware an…" (ytc_UgzJcq7LD…)
Comment
they are right , long term => everyone replaced by AI, but they are way way way over-optimisitc in their predictions. Even Kurzweil himself was wrong : 1) Kurzweil said self driving cars will operate on the streets by 2020 => didn't happen. Reasons: AI not intelligent enough, sensors are very expensive,. manufacturing requires ultra precision -> elevated cost; 2) Nanobots in bloodstream by 2020 => didn't happen. We barely finishing with genomics currently. 3) Kurzweil predicted AGI will be reached by 2020, this was back in 1990s-2000s , didn't happen. We only got LLM (predict the next token + chain of thought algorithm)
Bottom line: the trend is gigantic, but because of its size, it develops very slowly on human-level scale. And once human labor is replaced by robots, there will be humans controlling tons of robots , again creating demand for jobs because someone has to be responsible for them.
youtube · Viral AI Reaction · 2026-04-24T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxVrwdMmudno9rl0d94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy1KiiApipIQj7EHwJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPK4SrqYzm9MDR_u54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzjFR7sW6GZ-1SXs-94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzHM_OW1mUeGQPqLgl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxO0DAZEsnvPxz37d14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx_0eWvHTwSSXmJBjt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyUCCA6h15zOZE2xip4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"sadness"},
{"id":"ytc_UgzOqYD1S9P_HiIqVUl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzZEMcN5P4sYBzlFMt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"}]
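A response like the one above can be sanity-checked before it is stored. The sketch below assumes the value set for each dimension is exactly what appears in these samples; the real codebook may allow more values, so `ALLOWED` is an assumption, not the project's schema.

```python
import json
from collections import Counter

# Assumed value sets, inferred only from the samples visible on this page.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"indifference", "fear", "mixed", "approval", "outrage", "sadness"},
}

def validate_batch(raw: str) -> Counter:
    """Parse a raw LLM response (a JSON array of coded records),
    reject unknown dimension values, and tally values per dimension."""
    records = json.loads(raw)
    counts: Counter = Counter()
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec[dim]
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
            counts[(dim, value)] += 1
    return counts
```

Tallying per dimension also gives a quick distribution check, e.g. how many comments were coded `responsibility=distributed` in a batch.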