Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
im the middleground of coders, and AI has been very helpful for me to be more productive in the areas where i already thrive, i catch instantly when the ai is feeding me slop and when its not, so its been helpful for me, but i 100% think exactly like you say, this is not intelligence, and will never lead to AGI, the AI of today is just a very large index of data and well placed weights that matches your question with the highest ranked answer, its like a super advanced search engine that can pin point with accuracy what you are looking for, if it has the information that is, if it doesn't have the information it falls back and feeds you slop. TDLR: i believe AI crash is coming in 2026-2027, AI only helps users who are already well versed in their fields to accomplish something faster than before.
youtube · AI Responsibility · 2025-10-04T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id": "ytc_UgzLktVZGPqixTdk0nR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwsouBVTRDJ441l7W94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxrLXl_y1cl3Pt_h9t4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxWKiywf5cAQYT-owB4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzA3HCuNfoPJXa1Bvp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgyWvJ9MGarj-x6TNkx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgysJasIFASOjfVRC5d4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx-GscrcP8WRAwHa894AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyNQYRnAdRhfWoNXSx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugw27QxlpyZN59ocpbB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"}
]

(Note: the model's original output closed the array with ")" instead of "]"; corrected above.)
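To work with raw responses like this programmatically, a minimal Python sketch (field names taken from the response above; the function and sample IDs are hypothetical, and it repairs the stray trailing ")" seen in this output before parsing):

```python
import json

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes} form.

    Repairs the common malformation seen above, where the model closes
    the JSON array with ")" instead of "]".
    """
    raw = raw.strip()
    if raw.endswith(")"):            # stray parenthesis instead of "]"
        raw = raw[:-1] + "]"
    records = json.loads(raw)
    # Key each record by its comment id, dropping the "id" field itself.
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

# Illustrative sample shaped like the response above (IDs shortened),
# including the trailing ")" malformation:
sample = ('[{"id":"ytc_AAA","responsibility":"company","reasoning":"deontological",'
          '"policy":"liability","emotion":"outrage"},'
          '{"id":"ytc_BBB","responsibility":"none","reasoning":"mixed",'
          '"policy":"none","emotion":"approval"})')

codes = parse_codes(sample)
print(codes["ytc_AAA"]["emotion"])  # → outrage
```

Coded comments can then be joined to the dimension table (Responsibility, Reasoning, Policy, Emotion) by comment ID.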