Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
If you read the article it explains how self-driving cars aren't meant to be rel…
rdc_dkep0lm
Harrasment a user only for use AI.
This only show the artists are a bunch of to…
ytc_UgyKT91Ub…
AI should take all the jobs and we shouldn't have to work anymore to survive, bu…
ytc_UgwBcS-Dw…
I think Reddit is perfect example of how to use AI - consensus/community learnin…
ytc_UgwSLzA3u…
I can relate to Alice. A friend replaced my art project with AI as well…
ytc_UgxhRTuiY…
Omg I was iffy about AI and Chat GBT, imagine being proud of teaching your child…
ytc_UgzHaxWGf…
With 99% unemployment that basically means that AI and robots will have to work …
ytc_Ugy0BZekC…
I hope you have a bunker too because who knows what will become of all those sle…
ytr_UgwaFsztO…
Comment
I think I've been really behind the curve because I only started using Copilot last year. Then my company got us to trial Duo and decided that it wasn't worth the cost. It's only in 2026 that they're suggesting that Claude should be part of every workflow. The biggest problem that we have now is that we hired a junior developer who uses Claude for absolutely everything without understanding the output, and it takes so long to review that it's genuinely making us less efficient.
youtube
AI Jobs
2026-03-21T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgwaolCSCrbAmzmXPy14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzE5vuj-DLOECBsDzV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugz_OZ-y_IxebN0dWOl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugw3cdAjIfc7VJckYAJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgwX7qjzo1jevbrvHc14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugz1DARepnEMK3u2lPB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgytrV3NOnPLyOb7RQ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzgFuK7fSuKvYnKJAh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxRrFppSFBbTWXiKO94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxlVaW17VPUUpRsX4V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
```
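The raw response is a JSON array with one record per comment, keyed by comment ID, which is what makes the look-up-by-ID view possible: parse the array once, then index it by `id`. A minimal sketch of that lookup (the helper name `index_by_comment_id` is ours; the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come from the response above):

```python
import json

# Two records copied from the raw response above; the full array has ten.
raw_response = '''[
  {"id": "ytc_UgwaolCSCrbAmzmXPy14AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz1DARepnEMK3u2lPB4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugz1DARepnEMK3u2lPB4AaABAg"]["emotion"])  # -> resignation
```

Indexing by ID rather than scanning the list makes each inspection an O(1) dictionary lookup, and a `KeyError` surfaces immediately when the model dropped or mangled a comment ID.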