Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "Being polite might well influence the development of an AI but for me it's abou…" (ytc_UgxSjOU1K…)
- "If Ai really gonna using on art every time, real arts will be more expencive…" (ytc_UgzjCEyAm…)
- "I wouldn’t be surprised if that HD quality video of Alex Pretti kicking the tail…" (ytc_Ugw7YVKtc…)
- "I asked Copilot two C# related questions this week. It got both wrong. In both c…" (ytc_UgyG-MYm6…)
- "“blue blood” like we’re a whole different species that was just “born with talen…" (ytc_UgxPOs6_9…)
- "A day will come when resources cannot meet everyone’s use. I wonder if AI handic…" (ytc_UgxwS4g84…)
- "I have zero interest in AI, more of a pen, paper and brain type of person. I kno…" (ytc_UgyijcHO7…)
- "Even if the majors adopt AI to write scripts, they will still need writers to fi…" (ytc_Ugw4Fi7xV…)
Comment
amazon is going to fail in the future… I hated amazon ever since I got targeted to get fired for no reason… I would work there again for the pay but still replacing humans for robots is bs… I don’t get how that saves them money… they gotta pay who know what for each robot + damages and repairs costing more than it would to keep a person on per year… who knows statistically it’ll show in the future! I don’t get why they are trying to create more job for people out here yet we got people like baldy replacing humans with robots… soon enough those robots will turn against amazon just like the people…
youtube · AI Jobs · 2025-07-02T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxzlj177vLh0wYebfV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzIlfnGnIfbT_ni27B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzaeFfcsG3cYevqpaZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwnr_7Ire7CPUQkjwV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy97H41QnuMGkyaycR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxKqiKgRCTMGMv5jNB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx124sUSRAz7fJvyCN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxmnxORR1fMEmpr2W14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxx1vTyRxdS4teVuCh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy3EQ0fooAbASF87wB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
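A raw batch response like the one above has to be parsed and validated before the codes reach the dashboard. The sketch below shows one plausible way to do that in Python: parse the JSON array, keep only rows whose `id` looks like a YouTube comment ID and whose four dimensions fall inside the vocabularies observed in this sample. Note that the `ALLOWED` vocabularies here are inferred from the visible output only; the project's actual codebook may permit additional values.

```python
import json

# Per-dimension vocabularies inferred from the sample batch above
# (assumption: the real codebook may allow more values).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows.

    A row is kept when its id carries the ytc_ prefix and every coded
    dimension uses a value from the inferred vocabulary.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not str(row.get("id", "")).startswith("ytc_"):
            continue  # skip rows without a recognizable comment ID
        if all(row.get(dim) in vocab for dim, vocab in ALLOWED.items()):
            valid.append(row)
    return valid

# Minimal usage: one well-formed row passes, one off-vocabulary row is dropped.
raw = (
    '[{"id":"ytc_Ugy97H41QnuMGkyaycR4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},'
    '{"id":"ytc_x","responsibility":"government",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}]'
)
print(len(parse_batch(raw)))  # → 1
```

Dropping off-vocabulary rows (rather than coercing them to `unclear`) keeps the coded dataset honest: a malformed batch shows up as missing rows that can be re-sent to the model instead of silently polluting the counts.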