# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Comment

> Highly depends on model. Using basic copilot with gpt is useless of course.
> You program faster with ai, that is a fact. Since you either way often have to rewrite some parts of your code when you get perspective on what the base issues of your plan is, you just get there faster with ai.
youtube · AI Jobs · 2025-09-17T05:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[{"id":"ytc_UgzbPMpKt1vP6rNq6-14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxJO-Z77kYWTG6iYz14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzYrgCjh94rpWm8kVt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugy2VzdFi3iGAXkEqqN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzpcHC88sB5cde69m54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyOa3wCZSE-r8hEL8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwB7JUKojCTmJnODLt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxWJYiC_T2kIBKpgAZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwU1uYzz2clipebgFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugzo_N4FIoWjYtNI66B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
```
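A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a hypothetical validator, not part of the actual pipeline: the names `DIMENSIONS` and `parse_codes` are assumptions, and the allowed values are inferred only from the sample shown here, so they are almost certainly incomplete.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above (assumption -- the real codebook may define more).
DIMENSIONS = {
    "responsibility": {"none", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "outrage"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records."""
    records = json.loads(raw)
    kept = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # every record must carry a comment ID
        # keep the record only if all four dimensions hold known values
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            kept.append(rec)
    return kept
```

Dropping malformed records rather than raising keeps one bad code from discarding an entire batch; a stricter pipeline might instead log the offending record for re-coding.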