Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI is helping me code a little faster. So it an advanced auto-complete. How ever I have to review it very carefully and ask it to change the approach to follow the rest of the project multiple times. So unless the user knows the current architecture and knows how to guide a AI tool, it’s useless. I can relate to AI babysitter. My leadership wanted praises and weren’t even adding cons section in the presentation.
Soon it’s going to backfire if organizations don’t invest cautiously. One of our leadership identified that using AI will give a result quickly but learning is missed and when there a need for debugging, it’s going to get real.
Source: youtube · Topic: AI Jobs · Posted: 2026-02-06T04:5…

Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwxhet0xPKRWVDk_kl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwrryfxLE3QFS3Dw-V4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxN17WChKoads6DG-h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgympM2qCjger9lmtaF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxxqgM0AgjSXvYWMQJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwJK3aOsppA3SPzWIB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwHvxrsI7SWozc3P0N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzIB1KFac6Pm4c8sCZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxrE9BiTJsz906Ufep4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzj_JnasTqNAxv1X9V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
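The raw response above is a JSON array of per-comment codes, one record per comment ID, with four dimensions matching the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and checked before use — the allowed values below are inferred only from the records shown here, so the real codebook may define additional categories:

```python
import json

# Allowed values per coding dimension, inferred from the records above.
# The full codebook may permit more categories; this is an assumption.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "user", "distributed", "developer"},
    "reasoning": {"unclear", "deontological", "virtue", "consequentialist"},
    "policy": {"unclear", "regulate", "liability", "industry_self", "ban"},
    "emotion": {"indifference", "outrage", "mixed", "approval", "fear"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical record, shaped like the response above.
raw = ('[{"id":"ytc_X","responsibility":"user","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"approval"}]')
coded = validate(raw)
print(coded[0]["emotion"])  # approval
```

Validating before lookup means a malformed or hallucinated code fails loudly at ingest time rather than surfacing later as a blank cell in the results table.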