Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "AI will make human intelligence irrelevant" ... and this is what people are cal… (ytc_UgzkpiccQ…)
- Personally I don't believe there's a difference between AI and human consciousne… (ytc_UgwQ9_7V7…)
- So im guessing employers wanna go automated for pretty much 100% profit. OK so h… (ytc_UgyrYcDAK…)
- From another source: [https://arstechnica.com/cars/2019/11/how-terrible-soft… (rdc_f6y669s)
- Everybody in the comments keeps talking about becoming a nurse, it’s clear you a… (ytc_UgypJWbz3…)
- So I made it to the end I am not here to argue with you I just wanted to raise s… (ytc_UgwW1eO6G…)
- When I was a child, I had a problem with my tonsils, which led me to be very wea… (ytc_UgxA7vcHI…)
- So how companies will make money, if there isn't people buying, since AI took th… (ytc_UgwZi2pEk…)
Comment

> I'm a software engineer by trade, chose Comp Science at uni because it had practical applications in the job-space and there was a shortage of programmers in the UK... AI is completely changing what I do, Microsoft Copilot writes better code, refactors existing code and will soon develop entire solutions. I'm already seeing a world where Computer Scientists are no longer needed, or at best, purely needed to stich together libraries written by AI. Scary.

youtube · AI Governance · 2025-06-16T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz3zzyEG5V68b3yGjh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxCRgWpo1KFa49Zaj14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxLmHOY9xh-ckjFXrF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxSUpib1hcRSwdrVQ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxmoRo7KfUvI8YKBQ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwbkmUCxPIA6RSM2OJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwocjrzEwsqLjn51814AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwQ9iocCvn77xmtO3V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwp3htsGjG1Y9fKDph4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy_tRF9MjH7Kx8szIJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
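The "look up by comment ID" step described at the top of this page can be sketched in a few lines: parse the raw LLM response (a JSON array of coding rows) and index it by comment id. A minimal sketch, assuming the field names shown in the response above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); `index_by_id` is an illustrative helper, not this tool's actual API:

```python
import json

# Abridged raw LLM response, copied from two of the rows shown above.
RAW_RESPONSE = """
[
 {"id":"ytc_UgxCRgWpo1KFa49Zaj14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
 {"id":"ytc_UgxLmHOY9xh-ckjFXrF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and map each comment id to its coding row."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codings = index_by_id(RAW_RESPONSE)
row = codings["ytc_UgxCRgWpo1KFa49Zaj14AaABAg"]
print(row["emotion"])  # prints "resignation"
```

The id-keyed dict makes each lookup O(1), which matters once a batch of responses is merged into one index spanning thousands of coded comments.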