Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples
Zane was a computer science major. He customized ChatGPT, with the language, the…
ytc_UgzpdS2Gg…
Robot 1: Fck you
Robot 2: Fck you too
Me: This will be the end of Wakanda…
ytc_UgwXXAITu…
Im crying. As i myself not a good artist, but art is my emotional vent. When im …
ytc_UgwFI1jHs…
ROBOT SEZ "I want to create a singularity tomorrow" . ROBOT IS PROGRAMMED BY…
ytc_Ugzs4Nr4q…
Xi genuinely ruined China. Imagine if they had transitioned to democracy. Chines…
rdc_idezfvo
That first one could pass as not ai which is scary. The second one looked ai fro…
ytc_Ugy4KwSZs…
@spacetraveller9399 Christianity is not the only religion there is. Maybe the A…
ytr_UgzBusDdw…
There will be chaos if Super AI starts inventing and humans lose most of their j…
ytc_UgydTo8n1…
Comment
Idk, I usually go with sonnet 4.6 for easy things, opus for more complicated ones. Sometimes I have things done in less than 3 attempts if I point it to the right samples „what to copy”. Then I prompt „do you think it’s ok that way?” And it goes with 3 suggestions, exactly what I had in mind but somehow it didn’t know that by the time it spit some garbage. Then I create a skill, it will save 15 minutes of work every 3 weeks but it took 6h to create and nobody knows if and how it works. One have to create jobs using AI, not destroy them.
youtube
AI Jobs
2026-04-15T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
  {"id": "ytc_UgxSsUcXcDPhlRuqp9J4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzdMEzQ0ANwWIPI0_N4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxC8llrVQ8uE-wLp7l4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwy4Z78msIJU43SQFV4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgxaEPrTx2MWpT3HiuJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxTym8X3rXxQI7qFnx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwDnuGnU_CH6Cf1vsx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgywVGHB1TdP0tx94dR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwezH1BChlqPupKqph4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxpBzzHuKgMnWRn3Fl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
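A raw response like the one above is a JSON array of coded comments, so looking up a comment by its ID amounts to parsing the array and indexing it. The sketch below is a minimal, hypothetical helper (the function name `index_by_id` and the inline sample data are illustrative, not part of the tool); the field names follow the schema shown in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`).

```python
import json

# Illustrative sample: two records copied from the raw response above.
raw_response = '''[
  {"id": "ytc_UgxSsUcXcDPhlRuqp9J4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzdMEzQ0ANwWIPI0_N4AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) and
    return a mapping from comment ID to its coded dimensions."""
    records = json.loads(raw)
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgxSsUcXcDPhlRuqp9J4AaABAg"]["emotion"])  # indifference
```

Keying the lookup on the comment ID mirrors the "Look up by comment ID" feature: once indexed, each coded dimension for a given comment is available in constant time.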