Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I am predicting that AI based on LLM architecture will never achieve AGI. No amount of scaling of current architectures will lead to AGI. This brute force scaling approach is brain-dead and will soon plateau. A new architecture is needed to achieve AGI. Understanding and intelligence is more profound than LLM scientists comprehend. Tech bros are high on their own hype.
Source: youtube · Posted: 2025-07-22T12:3… · ♥ 9
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugxz9NZDlIw9ZDEtxdd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzJUxgDlLZ-XAbJSRV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyFrnx-7fOUUje6R6R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzI_JHeHkU51f2iljZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxa3QqsW-Jv0Z6D99R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyT2Y_0NdD5ssWXcIp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxSJzmx5Y9ASp5gpYx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx5m86V0KykSfqACRt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgygPHyMa7Ka22kwneV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxXDRZFTpMQDZ-QOXR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
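A batch response like the one above has to be parsed and checked before the codes land in the dataset. Below is a minimal validation sketch in Python. The allowed-value sets are an assumption inferred only from the labels visible on this page (the full codebook may define more); the `validate_batch` helper and `sample` data are illustrative, with the two sample records copied from the response above.

```python
import json

# Allowed values per dimension. ASSUMPTION: inferred from labels seen in the
# outputs on this page; the real codebook may allow additional values.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "liability"},
    "emotion": {"outrage", "fear", "mixed", "resignation", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a comment ID plus a legal value on each dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Two records from the response above.
sample = '''[
{"id":"ytc_Ugx5m86V0KykSfqACRt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxXDRZFTpMQDZ-QOXR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]'''

print(len(validate_batch(sample)))  # → 2: both records pass
```

Records that fail validation are simply dropped here; a production coder would more likely log them and re-queue the comment for another model pass.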