Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxFXVNsP…: "Buy one, and in three years, the AI patch takes over, makes your life a living h…"
- ytr_Ugy0a6yrI…: "I get what Rick’s saying, but here’s the thing — I do have a point of view. It's…"
- ytc_Ugz4w05i7…: "So now, with grok 4 Heavy on the scene and Tucker went stupid, we can see AI mor…"
- ytc_Ugx5hhOco…: "It seems your job is safe because AI is not going to replace you nor I want to l…"
- ytc_Ugzah4leX…: "They don't have consciousness, they don't even know they exist, they are just pr…"
- ytc_UgwYoufLY…: "Are we really surprised that the AI would do something humans absolutely already…"
- ytc_Ugz8cyF77…: "How much would Universal Income cost globally? When ASI or even AGI has the abi…"
- ytc_Ugw0LSoRj…: "Two things: everyone says that AI is limited by information that is finite, and …"
Comment
It really seems like a lot of the AI hype stems from its wow factor. The results appear to be human and that is fascinating to people. Of course, the reality is that is all from us humans, just scraped and shoved into a big probability model. I also believe that many tasks tech companies and AI proponents claim we need AI for could be automated with traditional programming methods much cheaper. This would require having a long term vision and addressing technical debt which isn't going to raise the stock price fast enough to support quarterly bonuses though.
youtube · AI Jobs · 2026-02-26T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx09uuVgn2i12SbWI14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0i1-J_M992mamd2x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxAE4a1r2vSCMTW8YB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwxsKasPmsaodW8Cf14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzGOVqQ5fiaHosfn494AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw_bs2RBnYwu4RvtOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxhhqMR-qsCp5u8DNR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxJ3ikTo36tdVl15Sx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzQGvrMNYckuSTq_Id4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzmVAkRR_b9w9oStyV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
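The raw LLM response is a JSON array of per-comment codes, one record per comment ID, with the four coding dimensions shown in the table above. A minimal sketch of parsing, validating, and indexing such a response by comment ID (the `index_codes` helper and the `ALLOWED` label sets are assumptions inferred from the samples on this page; the real codebook may contain more labels):

```python
import json

# Allowed labels per dimension, inferred from the sample records above.
# Assumption: the actual codebook may define additional labels.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"mixed", "deontological", "consequentialist", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "approval", "fear", "mixed", "outrage"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response and index coded records by comment ID.

    Raises ValueError if a record carries a label outside ALLOWED,
    so malformed model output is caught before it reaches the database.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Example: look up the record that matches the coding-result table above.
raw = ('[{"id":"ytc_UgzGOVqQ5fiaHosfn494AaABAg",'
      '"responsibility":"company","reasoning":"consequentialist",'
      '"policy":"none","emotion":"mixed"}]')
codes = index_codes(raw)
print(codes["ytc_UgzGOVqQ5fiaHosfn494AaABAg"]["emotion"])  # mixed
```

Validating against a fixed label set at parse time makes it cheap to spot when the model drifts from the codebook, rather than discovering bad labels later during analysis.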