Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking the comment up by its ID or by browsing the random samples.
Random samples

- ytc_UgxNNo9HE… — Just watch Terminator movies, that’s what the future will be for humans. AI is a…
- ytc_Ugy_kRRZc… — They are just replacing them with cheap labour hand who are the ones the trainin…
- ytr_UgxkUPsWx… — it already matches us, at least chatgpt 4 did, and 5 might be much worse scenari…
- ytc_UgzejMHPk… — So a black woman discovers that her face is not as easily picked up by facial re…
- ytr_UgxMKCjDo… — This is general artificial intelligence. I'm currently working on superior intel…
- ytc_UgwpLK34j… — Here is the secret If you use Ai just a little You have to accept the LOT ~ and…
- ytr_UgxS1saVg… — 👴🏻- for a second there I thought AI was no good but I approve of this 🙏🏻🕊🪕🤜🏻💥👴🏿🩸…
- rdc_jzo2z8d — The prince promised to send them a check for ten times the amount when he gets h…
Comment
Standard UK management: chasing the hype to line their own pockets. 🤡
They’re so desperate for that short-term bonus that they’ll chuck long-term staff under the bus for AI that barely works. The stats prove it: 17% of UK bosses are already planning AI layoffs, yet 77% of workers say it’s actually made their jobs harder and increased burnout.
Even when it fails, they just ‘AI-wash’ the disaster and frame it as a ‘cost-saving success’ to the shareholders. It’s the same old story: sack the experienced, break the workflow, and then act surprised when productivity tanks. They’d rather have a broken bot than a loyal employee if it means hitting a quarterly KPI!
youtube · AI Jobs · 2026-02-19T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyfgAa8Rnq4NGvBUzN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhNQJBrwXADCo8C9N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw8uQW5j2ofKcOwDZZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzGlY3Qqmf7RJ1mEoF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx3SSt_AkZWuStg4yJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxbLknEC7kd9zqmjPp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqAj1436q-UR-f10t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwZ4wcPNn_KJqEehfB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwrp5V9DFzidj-4YjB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0v_MitKIuj-4dhKF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
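Because the raw response arrives as a JSON array of per-comment rows, it is worth validating each row before accepting its codes. The sketch below is a minimal validator, assuming the allowed values are exactly those visible in the responses and the Coding Result table above (the full codebook may define additional values):

```python
import json

# Allowed values per dimension, inferred from the rows shown above.
# ASSUMPTION: the real codebook may permit values not seen in this sample.
SCHEMA = {
    "responsibility": {"company", "developer", "government", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown dimensions or values."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing 'id': {row!r}")
        for dim, value in row.items():
            if dim == "id":
                continue  # the comment ID is free-form, not a coded dimension
            if dim not in SCHEMA:
                raise ValueError(f"{row['id']}: unknown dimension {dim!r}")
            if value not in SCHEMA[dim]:
                raise ValueError(f"{row['id']}: invalid {dim} value {value!r}")
    return rows

# Usage with a hypothetical one-row response:
raw = '[{"id":"ytc_example","responsibility":"company",' \
      '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]'
rows = validate_response(raw)
print(rows[0]["emotion"])  # outrage
```

Rejecting bad rows with an exception (rather than silently dropping them) keeps a coding failure visible so the comment can be re-queued for the model.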