Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Elon is so smart. He never sleeps, never ages, doesn't care if people hate on hi…
ytc_Ugw4TWaSw…
Simple to understand.
One is just words, you know this person didn't write this …
ytr_Ugz2dGMF4…
Sad, back in the 70s to the early 90s ATT was a decent paying Union job. Now th…
ytc_Ugw1ZvDKD…
Just want to point out: when you sit real still and keep your head in one place,…
ytc_UgzLZif-b…
nah but fr, my dream career is being an animator and i swear if ai videos take o…
ytc_UgyBpptnO…
Sadly, not everyone even notices AI images. I've seen enough people swoon over A…
ytc_UgzX913MI…
Beware doctors, surgeons & all underwater brain surgeons: AI WILL TAKE OVER YOUR…
ytc_UgwogeS0n…
I have a 2020 Model 3 with FSD. How does the ride in the Robotaxi compare to t…
ytc_Ugye_QabA…
Comment
I work in data analytics and have worked with people that are utilising AI. You should hear them talk about how AI will do all those boring Accounting and back-office jobs. It's like AI is somehow going to save us from doing all these jobs and AI will save the world. I can't stand it anymore; those accounting and back-office jobs are the backbone of the middle class. I've asked them, what these people will do once their jobs are taken by AI. Their response is always the same "well, they just need to learn a new skill... yeah, like upskill". We have right now, in October 2025, all the AI we need to do just like customer service, accounts payable / receivable, procurement and some of IT. So why spend trillions on inflating the AI bubble? They are so fixated on AGI. Burnie seems like the only voice of reason these days.
youtube
AI Jobs
2025-10-17T11:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxKvLd31kqSU2O5U3R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzeHKLbDWjl_1H-bR54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy-rGefX16HCyonBNR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyyIrkRcWdOmeKrXdp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx0SF_W2tn77fbOhgN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwj4dHcEcY--X-YfO14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwDXhjJMT7zoPAyAzx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-3n9vWwA9LFtJ0gB4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxi5DVfa0Sx2-eWf6l4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwDht1H7ZdsXXWbl5J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
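The raw response above is a JSON array in which each row carries a comment `id` plus four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal Python sketch of parsing and validating one such batch is below; the allowed values are inferred from this sample alone, so the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from this sample batch only
# (assumption: the full codebook may include categories not seen here).
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"outrage", "indifference", "fear", "resignation", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response into {comment_id: codes}, rejecting rows
    with a missing id, a missing dimension, or an out-of-schema value."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded
```

Rejecting the whole row on any out-of-schema value (rather than coercing it) keeps malformed model output visible for re-coding instead of silently polluting the tallies.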