Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
LLM is a bruteforce approach. You got current emergence effect because you fed decades of public Internet data collected and labeled into cloud computing infra, and powered it with energy equivalent of dedicated power plant.
Next emergence effect may very well require power input of the whole globe, and all sensory data collected (and not even labeled) which doesn't event exist yet. And then may be you hope for Gen AI effect.
Meanwhile, we got humans working with brains consuming electricity on the level of a light bulb (ok, and nutrients from bio-chemical reactions of feeding) already having Gen AI capabilities.
To sum up: LLM approach and linear brute force scaling happening right now is a dead end.
Source: youtube · Topic: AI Jobs · Posted: 2025-11-18T19:5… · ♥ 58
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz8dk5g50z2TEoZ21l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyAxzBe5ra_cSCoQGx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyWc4SHjiQEv1KhKb14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyZ3A03Bi0hAaHBqw14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwASeR1nIJRq_lSjcF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxuKo53ZP8c3QkkGmV4AaABAg","responsibility":"elite","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwwPos9M0slXLT_nFh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwmfKn9OiN3FgNOPU94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw9Bzho-OamMXKhFm94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugztite_u8gThriWXfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```