Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "I think someone should create the leaders of AI, like Zuck, Thiel, Elon and all …" (ytc_Ugz1t1Qak…)
- "i don’t like how the answer to the severity of the impact is/could be to human m…" (ytc_Ugw8w5YdC…)
- "Grok is a hero while Chatgpt the one im using rn is straight EVIL so I'm literal…" (ytc_Ugwy5QSBD…)
- "I think it was always there, but the recent massive investment and instant corpo…" (rdc_m12ruh9)
- "we at work had a barrier so if we went over that barrier the robot stopped and i…" (ytc_Ugzkwm2rd…)
- "I think part of my wondering about AI has been answered in this video but I real…" (ytc_Ugzl28l3u…)
- "I have strong suspicion that's because you knew it was machine generated beforeh…" (rdc_jdkgl98)
- "Hey there! It seems like you might be referring to a popular science fiction ser…" (ytr_UgwlV-BLr…)
Comment
Who will then run the data centers, power and cooling systems for AI? Who will make the chips and software? If at all, AI will create more jobs than it kills. It will also flatten most of the hierarchies, and make many jobs absurd. Like, a pure people manager job will be the first one to get absurd, unless he transforms himself to run the AI/ automation or contribute in some other way to maintain the system. Most of the hierarchies will vanish, and people will directly report to the company stakeholders through an efficient AI system. There will be an automated ranking/ performance evaluation system (which is AI) determining each person's pay and other things. It has already started happening -- like Uber/Lyft where their direct boss is a software app
youtube · Viral AI Reaction · 2025-12-09T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx5ITkafVSGO1Q-a6Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvUaQaYLQXyrEEScV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxGGBAYQAA3q9m0rod4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw2ZdHr2w2qSXv_DeF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgweF0OzWVntQPBRY_x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzML_F8M4foQ7x0QLp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwYDhAYhMO4M2J6PXJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwbVOvxUrneoSMajhR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxPXKvTh9sWlNtNW2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzJqR6CgAwR25JPxQx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"mixed"}
]
```
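Because the raw LLM response is a JSON array of per-comment records, looking up the coding for a single comment ID reduces to parsing the array and indexing by `id`. The sketch below is a minimal, hypothetical illustration (the helper name `lookup_coding` and the inline two-record string are assumptions, with records excerpted from the array above), not the pipeline's actual lookup code:

```python
import json

# Excerpt of a raw LLM response: a JSON array of coding records,
# one per comment, following the schema shown above
# (id / responsibility / reasoning / policy / emotion).
raw_response = """
[
  {"id": "ytc_Ugx5ITkafVSGO1Q-a6Z4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzJqR6CgAwR25JPxQx4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "liability", "emotion": "mixed"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coding record for one comment ID, or None if absent."""
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgzJqR6CgAwR25JPxQx4AaABAg")
print(coding["emotion"])  # mixed
```

Building the `by_id` dict once and reusing it would be preferable when inspecting many comments against the same response.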