Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgzejeMh1…: "lol, when you asked chatGPT for a solution for peace I half expected it to have…"
- ytr_UgxhdV2bS…: "I use Ai to polish my English because I'm not good at it. Is that wrong?…"
- rdc_jcbphqg: "Putin coming up with mindgames and geopolitical strategies only to get blasted b…"
- ytc_UgyWFRjyy…: "On the deepfake topic, I wanted to bring up a place Atrioc has been working with…"
- ytc_UgzkD0s4-…: "This OscarAI guy's actually pretty funny 😂😂😂 / Never seen an ai user fight back s…"
- ytc_Ugxnhd98k…: "We havent made any AI yet so to have them become conscious.. / All we have is powe…"
- ytc_Ugxhl6giI…: "Unfortunately those who appreciate art rather than just consuming it are a small…"
- ytc_UgzD_o_05…: "Some interesting points about contrast, but I feel like you could get much close…"
Comment (source: youtube | AI Jobs | 2026-02-20T16:5…)

> Hiring a non-tech person to a tech company just because is HR/manager is a dumb idea all the way. AI is a very efficient copycat. Its great to read bulk data and replicate it, but its terrible at validating if the data they're copying is good/scalable.
>
> TLDR: Basically, what I'm seeing from the IT perspective is that they wasted billions on non-technical middle management, which hired junior devs based on affinity and bs, then got poor results from it, and instead of going technical as when they started it, they blamed the bs people they hired and bet on AI, which ironically was trained on poor quality data. AI is not intelligence, neither sentient, and barely "smart", but a very efficient copycat.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgyuPk2wAq50-ipbRXR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyIiStYS7fA8HPXDSh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyWBpXLkqiIwbbCYf54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugxh5xd-7ur6cB-0hlt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgwFa3Ln_r9nMxtL6TF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxlKrT8w_4pTmriAtF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxULFdpVxY1PpO5Egd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugz4yOuuRAij8zbickJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzuaVc0lLmlE38xz-t4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_UgznPciN2geRTwpsJF14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]
```
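A raw response in this shape can be looked up by comment ID with a few lines of Python. This is a minimal sketch: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the JSON shown above, but the `lookup` helper itself is hypothetical, not part of the tool.

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per
# comment, with the four coding dimensions shown in the table above.
# (Shortened to two entries from the response displayed on this page.)
raw = '''[
 {"id":"ytc_UgyuPk2wAq50-ipbRXR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyIiStYS7fA8HPXDSh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]'''

def lookup(coded_json: str, comment_id: str):
    """Return the coded record for one comment ID, or None if absent."""
    return next(
        (row for row in json.loads(coded_json) if row["id"] == comment_id),
        None,
    )

row = lookup(raw, "ytc_UgyIiStYS7fA8HPXDSh4AaABAg")
print(row["emotion"])  # approval
```

Because the model returns one object per input comment, the same pattern extends to validating a whole batch: parse once, then check that every expected ID is present and that each dimension takes only an allowed value.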