Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Interesting interview, he's obviously a very intelligent man and he's does highl…" (`ytc_UgwXZvX28…`)
- "Sophia, the AI robot, was created by humans with the intention of advancing tech…" (`ytr_UgxYgVzrw…`)
- "And not to mention that AI usually uses irl and anime art styles, while ppl arou…" (`ytc_UgyjTyt7n…`)
- "Love how CNBC also write articles that AI is not overhyped and now saying that…" (`ytc_UgzVLw6Am…`)
- "Human: Now give me my weapon back / Robot: How about I test it on you first…" (`ytc_UgwD7ze_R…`)
- "I already swore that I'd never use AI on my visual novel. Proud to say I have an…" (`ytc_Ugw3uOY3A…`)
- "We will have to kill more children for these morons to admit that driverless car…" (`ytr_Ugy8D6WDc…`)
- "When you create art, you're using accumulated experience and your fingers to rea…" (`ytc_UgxbsVz02…`)
Comment
I asked AI a question today (involving creating an acronym) and it gave me an incorrect answer. No biggie, it happens, so I pointed out it's mistake to it, and it said basically:
Why yes, that is wrong! Here is the correct answer.
... and it gave me the same wrong answer again...
;-)
Not saying AI can't be helpful and won't "eventually" get there, but I think it will take a LOT longer than some people think for complete, complex, and accurate solutions.
And for some things, it might not be worth it. i.e. There are better things to throw AI at...
youtube · AI Jobs · 2025-04-01T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgzozStM6shXRvLAvDN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyzOOVo4etqlzFx3_R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6Cg7hqyQCDJB34VF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzDb4UpUzCdiDiFI2Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7dyPJc6owJEaRVod4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKAyzZ_Qufk_0QEyF4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzYTz_Ls68SEkJsSJt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz3VZW3ivkSkCTsKDd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyZCNRSztT5Coo1tvZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwg7iylEpj-t46a2kd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
```
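The raw response above is a JSON array with one coding object per comment, keyed by comment ID and carrying the four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response can be parsed and indexed for the "look up by comment ID" view (the array here is a two-entry excerpt of the real output, for illustration only):

```python
import json

# Excerpt of a raw model response: a JSON array of per-comment codings.
raw = """[
  {"id": "ytc_UgzozStM6shXRvLAvDN4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw6Cg7hqyQCDJB34VF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]"""

# Index every coding by its comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coding, as the inspector does.
coding = codings["ytc_Ugw6Cg7hqyQCDJB34VF4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself mixed
```

Because the model returns the ID alongside each coding, the lookup needs no positional bookkeeping: any coded comment can be matched back to its source record directly.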