Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples — click to inspect

- "Is the true spirit of the anti christ. I believed that for a long time now an…" (`ytc_UgzpJ5IyN…`)
- "1800's - photography isn't real art. 1900's - digital art isn't real art. 2000…" (`ytc_UgzvikU_r…`)
- "The problem for Africa and Haiti is that the good, patriotic leaders always seem…" (`rdc_ibdbxr8`)
- "Taxes are illegal anyway; I don’t think this will actually be nearly as righteou…" (`rdc_fw0c2lu`)
- "My high-school going son is 'addicted' to ChatGpt while doing his homeowrk. So m…" (`rdc_mtmieiv`)
- "The term artificial intelligent has been thrown around way too much. Unless the …" (`ytc_Ugwr_2ACl…`)
- "@disorderandregression9278 It won't because of public image, but still not a gre…" (`ytr_Ugwcu4aKT…`)
- "Should absolutely be illegal to use autopilot in a car, Tesla should pursue this…" (`ytc_UgzZwABMz…`)
Comment
LLMs like ChatGPT don’t do logic; they don’t think or evaluate or calculate the odds of anything. They only spit out the words with the highest odds of following in the model. They tell you the average of everything that has been said, with a bit of a slant for progressive writings partly because progressivism dominates academia. Logic and programming is added as functions on top of the LLM layer.
Source: youtube · Posted: 2025-12-31T20:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgxzT75prsC3RHDX3Up4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugw0gommpWpeK9FBAFp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgyF77gPmhnFxXVLD3N4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugx3jfmuCuuUDkR21nh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgxcpbZIhWrgsM6kLCZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugy8GfTyOqN54xd5CiZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgzkejXdPcpDvCCRRo54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgzSp7leLtK1cmxA0sN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgzqJBKVaKFjQAEzmbN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
 {"id":"ytc_UgyrL7NOWjoyinALFhN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]
```
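The raw response is a JSON array with one object per comment: the comment `id` plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response can be parsed and indexed by comment ID — the field names and the two example rows are taken from the response above; nothing else about the pipeline is assumed:

```python
import json

# Two rows excerpted verbatim from the raw LLM response shown above.
raw_response = '''[
 {"id":"ytc_Ugw0gommpWpeK9FBAFp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgyrL7NOWjoyinALFhN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]'''

# Index the batch by comment id so a single comment's codes can be looked up.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coded dimensions for the comment displayed above.
row = codes["ytc_Ugw0gommpWpeK9FBAFp4AaABAg"]
print(row["reasoning"], row["emotion"])  # mixed indifference
```

The same lookup is what populates the "Coding Result" table: the row whose `id` matches the displayed comment supplies the four dimension values.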