Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Autonomous weapons are pretty bad as computers and AI are terrible at making dec… (ytc_UgyUn52dd…)
- I lost a lot of respect for Alex in this discussion. Every time a point was made… (ytc_UgyS_zR4O…)
- @Flloralina Cry about it. Because you are a child. P.S. You gotta be really hig… (ytr_Ugyt9us_5…)
- It's easy to be heartless when you're not in the crosshairs of "innovation." A l… (ytc_Ugy_IMa2X…)
- This is civilisation grade technology and the public need a stake in directly go… (ytc_UgxtYf4he…)
- Fan fact AI art can't be copywrited, also AI is kinda in hot water because of al… (ytc_UgwJXMsLq…)
- I'm scared of ai 4 that reson they can read minds were going to be done in 2040… (ytc_UgyExZJDO…)
- When this happens I hope this guy has an AI army cause he will need it.… (ytc_UgziNsWOV…)
Comment
If you have to prompt all the time, you are doing it wrong. Use OpenSpec, make good specifications and start from that. While using ralph! Both are Skills so you can exchange with the LLM and ask it to trigger ralph on its own too.
It seems that everyone forgot the boring part of the job, Agile was a disaster for the tech community, everyone talks about their opinions but no one want to do the boring part of researching, documenting and creating specs. Sorry it's coming back
youtube · AI Jobs · 2026-01-19T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgwTHACpuNH3alMZ-At4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"ytc_UgyBCSaYB73YLKszb_d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgyvruCjW_584YguWzR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_UgxPiP0ljQ6nDspT13h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgwwfLE4DTHG6aKqYM14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgyO2So829NZqsYinYB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_UgxbxoDntkTRGrw-lkp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},{"id":"ytc_UgxvMpPopnv6lGrp3Ah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},{"id":"ytc_UgwdSCD56Lxs9Dtfzwt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},{"id":"ytc_UgzpHMTXZEUlnjrN1Jt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}]