Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- When you leave your self driving Taxi... another Person or Group one can use it.… (`ytc_UgzRaZobZ…`)
- @laurentiuvladutmanea Near perfect is not perfect, but regardless, the idea an A… (`ytr_Ugw1RwJSy…`)
- It’s good to see that some main stream outlets aren’t completely bought into the… (`ytc_UgyaGy5_4…`)
- idk mixed fellings about this one i think throwing cameras and paintbrushes toge… (`ytc_UgzzhAh21…`)
- accidentz happen, esp. in proto-types! remember 1st flying machines from Ireland… (`ytc_UgywDqdIh…`)
- @SkullFrunk67 It is messed up. It's the same as people that see for example war… (`ytr_Ugyt9us_5…`)
- Not good when ROBOTS have the ability to fire weapons!! Not cool..Dangerous! Did… (`ytc_Ugw4aUI22…`)
- "AI model collapse" was brushed over so quickly. Models are finished, packaged p… (`ytc_UgymTdHJs…`)
Comment

> *THIS IS FANTASY* You are not getting super intelligence with LLM's its not happening and its Cambridge University saying that - not me the guy on the comments
> EDIT: They don't HAVE actual creativity, the closest they get is hallucinations and the better you get at stopping them from hallucinating the less they have creative-like abilities.
> So the irony is the better they get the more LIMITED they get at moving toward super intelligence or AGI. A GOOD trustworthy LLM is the furthest LLM from AGI. It is an inherent contradiction in the model.

youtube · AI Jobs · 2025-11-18T19:3… · ♥ 378
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
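Each coded comment carries the same four dimensions shown in the table above. A minimal validation sketch, assuming the allowed values are exactly those that appear on this page (the real codebook may define more categories; `ALLOWED` and `validate` are illustrative names, not part of the tool):

```python
# Hedged sketch: allowed value sets are inferred only from codes visible on
# this page; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "company", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "industry_self", "unclear"},
    "emotion": {"mixed", "outrage", "approval", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value falls outside ALLOWED."""
    return [dim for dim, values in ALLOWED.items()
            if record.get(dim) not in values]

# The coding result from the table above passes; a bad value is flagged.
coded = {"responsibility": "none", "reasoning": "consequentialist",
         "policy": "none", "emotion": "mixed"}
print(validate(coded))  # → []
```

A check like this is useful before storing a batch, since an LLM coder can drift outside the schema.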
Raw LLM Response
```json
[
  {"id":"ytc_Ugwz-loKL5Mi6HGcLB14AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz331mIwIbF8FNvlw54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwyGSIPRaZdjS9eetN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzHXvd06SEe8jqK8mZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzfImFYBVBFprpJzCF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugy-K0S9RmcWPitcTZF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz_9CpNawPuS3z0sQ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxvoJHoPxE8PQr2zJt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxhiSZopX1-b8OqBD54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzeRi8FwQ2Q-Dk6Fpp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
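The by-ID lookup this page offers can be backed directly by the raw response: parse the JSON array and index the records by their `id` field. A minimal sketch, using two records copied from the response above (the helper name `index_by_id` is illustrative, not part of the tool):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_Ugwz-loKL5Mi6HGcLB14AaABAg", "responsibility": "company",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzHXvd06SEe8jqK8mZ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def index_by_id(payload: str) -> dict[str, dict]:
    """Parse a raw model response and map each coded comment ID to its codes."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
record = codes["ytc_UgzHXvd06SEe8jqK8mZ4AaABAg"]
print(record["policy"], record["emotion"])  # → ban outrage
```

Indexing once and looking up per comment keeps the inspect view O(1) per query instead of rescanning the array.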