Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment (youtube · AI Governance · 2025-12-06T04:0…)

> I seriously cannot comprehend how ANYONE can even being to predict a date for AGI. On the one hand...the BIGGEST unanswered question in science RIGHT NOW is: What is consciousness. And the AI community is literally saying..."We will answer that question within 5 years." And it is CERTAIN that THAT is exactly what they are saying... because an LLM CANNOT instantiate human identity WITHOUT somehow duplicating the meaning of consciousness (or the consciousness of meaning...or whatever it is that describes whatever is occurring in these fundamental areas of reality that nobody has any understanding of).
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzqKVz_1i3pxWepUQJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxpKZW557AxzA_bJih4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzpz8nFlj8b8tq2Pbt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwHI3YS3HB1MBlB8Ut4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxkT33H6fUbd4oINk94AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugy5LwgiHd-s0IHnmot4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwz-finEnPgcSVT6J94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzJexL1SdjrYowVtAV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxwGFyBRLEhWOUxykB4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwJfTI7NT6MbjZU9tR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
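A raw response in this shape can be checked before it is loaded into the coding table. Below is a minimal validation sketch in Python; the allowed values per dimension are only those observed in the sample above (the real codebook may define more), and `validate_coding` is a hypothetical helper, not part of any existing pipeline.

```python
import json

# Allowed values per dimension, as observed in the sample response above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"unclear", "government", "ai_itself", "company", "user"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue", "contractualist"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"outrage", "fear", "indifference", "approval", "resignation"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook.

    Raises ValueError on a missing id or an out-of-vocabulary dimension value,
    so malformed model output fails loudly instead of polluting the results.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Usage with a single (hypothetical) record:
sample = ('[{"id":"ytc_example","responsibility":"company",'
          '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_coding(sample)))  # prints 1: one record, all dimensions valid
```

Validating at ingest time keeps the "unclear" fallback meaningful: any label outside the codebook is rejected rather than silently stored alongside legitimate codes.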