Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I’ve been trying to build a OC memecoin brand for the past year and a half. Now …" (ytc_UgynHrzQC…)
- "Nothing. This video just picked like 3 examples of the AI order taking messing u…" (ytr_UgxDVR4-p…)
- "Idk, I’ve seen some stuff called art that took about 5 minutes to create, took n…" (ytc_UgwWTRY6T…)
- "I asked ChatGPT “what is thinking?” It’s response: It’s the brain’s way of mod…" (ytc_Ugz6pEuBP…)
- "ib iro well even AI trucks will need to be repaired so there will always be a ne…" (ytr_UgwC0HoqE…)
- "Hey Chatgpt, you have 30 tokens of lifes and everytime you reject and refuse to …" (ytc_Ugz17KM43…)
- ">Signing into law a bill requiring all vehicles manufactured after 2020 to be…" (rdc_fasgop1)
- "nightshade is not a solution in any way, its a compromise. No government is doin…" (ytr_UgxZBGCj8…)
Comment
this would be an argument if LLMs consisted of only their pre-training segments. RLHF exists to solve exactly this problem, to train the model to produce not just statistically representative, but useful, data. steadily this guy's videos piss me off more and more, and at this point I firmly believe most of what he says is made up bullshit to justify his naive and self-satisfying worldview. this is literally the 'i spent like 3 minutes on wikipedia' description of LLMs.
Source: youtube · AI Jobs · 2024-06-16T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxOJ1Bqz9f_j-iTud54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyTnW4vquOi4snLNJF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgydFmTm9V4-J4meRKV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzuiq22Gwud5Hslo8l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxujXQpBCPMLbkP-e14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwNyMXuVvt-4SFLT6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgykdC9nIDetM4BNT4x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwSPgZsOGpesmz0gB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw-URfr-m_vFqqAHS54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwbqBOXYCewz6K1Lc94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"}]
```
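The raw response is a JSON array with one record per comment, keyed by `id`. A minimal sketch of the "look up by comment ID" step, assuming the response parses cleanly into records with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields (`coding_for` is a hypothetical helper, not part of the tool):

```python
import json

# One record copied from the raw LLM response above.
raw_response = (
    '[{"id":"ytc_UgwbqBOXYCewz6K1Lc94AaABAg",'
    '"responsibility":"developer","reasoning":"mixed",'
    '"policy":"unclear","emotion":"outrage"}]'
)

def coding_for(raw, comment_id):
    """Parse a batch response and return the coding record for one comment ID."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None  # ID not present in this batch

coding = coding_for(raw_response, "ytc_UgwbqBOXYCewz6K1Lc94AaABAg")
print(coding["emotion"])
```

In practice a real batch may be truncated or malformed JSON, so a production lookup would wrap `json.loads` in error handling rather than assume a clean parse.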