Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

| Comment preview | Comment ID |
|---|---|
| Uh problem........the driver gets out and says I'm alive....how do we know that … | ytc_Ugz_J7u18… |
| Im waiting for Nelly to drop a Remix to his song E.I but this time call it A.I … | ytc_Ugx7qXS-Q… |
| The day I'm told by a customer to shut up and put the fries in the bag, when I w… | ytc_UgzzDRpLv… |
| If something outsmarts you, it does not matter whether you call the thought "ori… | ytr_UgxbRWaoP… |
| Its when they take credit for something the ai made for them. Straight up preten… | ytc_UgzKg5eGh… |
| So as far as I know, chinese cars are mainly sold on the chinese market, so to t… | rdc_mda03bw |
| I would love to hear an exploration of how ChatGPT would actually arrive at its … | ytc_Ugy9c9bUv… |
| That would be great. We will be going back to the good old no AI days… | ytc_UgyMOSkoh… |
Comment
Work in automation and one thing I've learned about logical systems is that they are senescent and rigid in a world where there's too much change. Right now AI is not at the point where it can self reflect or have enough data points for proper executive functions. It's not that AI is phasing out jobs but merely shifting the labour market. Software programmers are in less demand to create but rather more in demand to augment. The reckonning is coming for these shortsighted companies. They'll have to shell out big bucks to pay programmers to reverse engineer and recompile AI code. It's harder than you think to parse through millions of lines of nondescript code and make sense of it.
youtube · AI Jobs · 2026-02-08T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz-0tUImsHPOW0Whn54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxig-ilfcKHJZwsAPh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwFdFyTVfcOe-DkV2h4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgxwVza5EhO28fyQrW14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwCvwMCosAQWzNdDkB4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzLNDaq0uc1nDPRDON4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyrcLUqFrNkga8xXMl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw1bhKxt1kHXxVus6N4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzL51axPPGjx0kogYZ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugz4RqFHUdu2mSQGKO14AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]
```
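Since the raw LLM response is a JSON array of per-comment codes, looking a comment up by its ID amounts to parsing the array and indexing it by the `id` field. A minimal sketch using only the standard library (the two rows shown are copied from the response above; any response with the same shape would work):

```python
import json

# Two sample rows copied from the raw LLM response above; a real
# response would contain one object per coded comment in the batch.
raw = """[
  {"id": "ytc_Ugz-0tUImsHPOW0Whn54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw1bhKxt1kHXxVus6N4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]"""

# Build a lookup table keyed by comment ID.
codes = {row["id"]: row for row in json.loads(raw)}

# Inspect the coded dimensions for one comment.
print(codes["ytc_Ugw1bhKxt1kHXxVus6N4AaABAg"]["emotion"])  # outrage
```

If the same ID ever appeared twice in one response, the dict comprehension would silently keep the last occurrence, so a duplicate check may be worth adding before trusting the lookup.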