Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgwJg-sg4…`: You know what. All I needed to watch was iRobot to understand that AI being high…
- `ytc_Ugzceppbk…`: LLMs "clump" words together that are used together. The best, truthful answers …
- `ytc_UgyUte-Ru…`: you were right about AI thinking hands ate supposed to look like this and if yo…
- `ytc_UgwrADplB…`: In a future dominated by AI, there is no code. AI is the app. AI is the OS. AI i…
- `ytr_UgwebYR2Q…`: It seems like the robot's design might have caught you off guard! It's interesti…
- `ytr_UgyPOoCYx…`: Obviusly if the question is, it it ai or human they wont give you a picture of a…
- `ytc_Ugwzuy6-1…`: Mr. O’Brian’s personal perspective adds favorable value and understanding to me.…
- `rdc_lz5nzau`: I saw a meme on r/ProgrammerHumor recently which describes my feelings toward th…
Comment
All leading AI researchers and labs explicitly acknowledge that LLMs almost certain wont become AGI.
This does NOT make them a dead end. All of the progress made on LLMs is extraordinarily useful for future non-LLM AI development. LLMs have paved the way for whatever comes next, and so are absolutely not a dead end.
AI haters complain about AI slop, but the real "AI slop" are these simple minded anti-AI videos that are so incorrect but lapped up by legions of clippy pfp dummies who have an irrational hatred of anything to do with AI.
youtube
2026-01-26T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzlEX1w3yvGnqlbSit4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzsYn-baN-vNCzxjnt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBfZJT-UmJKWBx9LB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy_v7KHAhkY6dTBN3Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwnZy7whAUA_tuatlF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_U2D2QMa3l3cN1Jt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzObe7Q9Tlpnzl9rm54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQ-p3ZCUWJ_4BcOoZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyMHRovwGoEmGmiYCJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxISdJQfke3IZvow2R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
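The raw model output above is a JSON array with one coding record per comment, each carrying the four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of consuming such a response — the field names are taken from the JSON shown, while the idea of dropping incomplete records is an assumption, not necessarily what the tool itself does:

```python
import json

# Two records copied from the raw LLM response above.
raw_response = """[
 {"id":"ytc_UgzQ-p3ZCUWJ_4BcOoZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgyMHRovwGoEmGmiYCJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

# The four coding dimensions present in every record shown above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict]:
    """Parse the model output and index records by comment ID,
    skipping any record missing one of the four dimensions
    (an assumed validation step, not part of the original tool)."""
    records = json.loads(raw)
    return {
        r["id"]: {d: r[d] for d in DIMENSIONS}
        for r in records
        if all(d in r for d in DIMENSIONS)
    }

codings = index_codings(raw_response)
print(codings["ytc_UgzQ-p3ZCUWJ_4BcOoZ4AaABAg"]["emotion"])  # → outrage
```

Indexing by ID mirrors the "Look up by comment ID" view: once parsed, any coded comment's dimensions can be retrieved directly from the dictionary.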