Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "@Shiori0404 yea cuz making the ai is EASY What about people who want a certain…" (`ytr_UgwBCvPVO…`)
- "AI doing what a man can do will be no different than what man can do by mind and…" (`ytc_Ugzd975JS…`)
- "These guys building ai, the ones who own and or run the companies and the govern…" (`ytc_UgxO99ptt…`)
- "Great video! I agree that AI is a tool rather than a replacement, and coding wil…" (`ytc_UgynXOBY0…`)
- "@MeepMorp-n3z yes. its just a digital picture. theres plenty of them on google i…" (`ytr_Ugx6-lurg…`)
- "The framing of this video misrepresents the nature of language in human-computer…" (`ytc_UgzDO3mNQ…`)
- "Only man who calls himself the most advanced of all lifeforms is stupid enough t…" (`ytc_Ugzkj7a6r…`)
- "He needs to do this. Canadian household debt is at an all time high, and the pan…" (`rdc_fn5kp24`)
Comment

> When I use a LLM for engineering problems, it seems they have not inputed equations and "laws" that appear in my engineering textbooks. They just don't have all this info. And the problem of 30% inaccurate responses forces me to use 3 different models for comparison on problems.. I am sad that phrases in LM seem to come from popular literature and internet posts, rather than textbooks.

Source: youtube · 2026-02-06T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
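Each coded record carries the four dimensions shown in the table above. A minimal validation sketch follows; note that the allowed value sets are assumptions inferred from the samples displayed on this page, not a documented schema:

```python
# Value sets inferred from the samples on this page, NOT a documented schema.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "indifference", "mixed", "resignation"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs that fall outside the observed value sets."""
    return [
        (dim, record.get(dim))
        for dim, allowed in ALLOWED.items()
        if record.get(dim) not in allowed
    ]

# The record shown in the Coding Result table above.
example = {
    "id": "ytc_UgypehuOg61jYmua30J4AaABAg",
    "responsibility": "developer",
    "reasoning": "consequentialist",
    "policy": "liability",
    "emotion": "resignation",
}
print(validate(example))  # [] — every dimension is within the observed sets
```

A non-empty return value flags dimensions the model coded with an unexpected label, which is a cheap guard before the records are stored.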
Raw LLM Response
```json
[
  {"id":"ytc_UgxF8JBtSW_ZnnkT8Zh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyo708vngfXsRhg9ax4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwnIqWfXs-RnIENVaZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxOBpPx4uHHI1vidCt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxRQ8ZjaLEQYp4R8RR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxzeGQLinLJMO7c3a94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyArDcxhymD_mJVA4Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgypehuOg61jYmua30J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgxhcFgQXdkWFh7cxcZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx2bTwyX5xyN60rJRd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
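The raw response is a JSON array with one object per comment, so looking up a coded comment by its ID reduces to parsing the batch and indexing on the `id` field. A minimal sketch, assuming the response text is available as a string (`RAW_RESPONSE` below reuses two entries from the array above):

```python
import json

# Two entries copied verbatim from the raw LLM response above.
RAW_RESPONSE = """
[
  {"id":"ytc_UgypehuOg61jYmua30J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_Ugx2bTwyX5xyN60rJRd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a batch response and key each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(RAW_RESPONSE)
record = codes["ytc_UgypehuOg61jYmua30J4AaABAg"]
print(record["responsibility"], record["emotion"])  # developer resignation
```

Indexing once and reusing the dict keeps per-comment lookup O(1), which matters when the same batch response backs many "inspect" clicks.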