Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "We're glad you found the video intriguing! If you want to delve deeper into AI a…" (ytr_UgyXhUR9M…)
- "sorry, but i dont think its okay to pull off these kinds of things, and for the …" (ytc_Ugzrn8BoB…)
- "how do you think YOU learn to make art, same way, your brain takes in the world …" (ytc_UgwcsFzko…)
- "On the Day of Judgment, God asks you to give them your soul. Sovereignty, Sovere…" (ytc_Ugx4nYj7E…)
- "The proof of this being wrong is that AI agreed to generate this video 😂…" (ytc_Ugw_JsH37…)
- "Developing AI might be one of the greatest scientific discoveries yet. It might …" (ytr_UggSkZsWg…)
- "The only way AI would "destroy" us is by our own mistake. Who are we to hinder t…" (ytc_UgwB4Hphi…)
- "I don't really trust Waymo's self-driving, but I trust Tesla's even less. I gue…" (rdc_nsyn7u9)
Comment
Many jobs may not be amenable to automation, but when a large portion of a bunch of jobs in the office can be automated what do you think is going to happen? Through attrition and direct layoffs they will cut the work force. Run a scenario. Let's say there are 50 people in an office to make it realize its mission. If AI automates 50% of their day-to-day, you have 50% of your payroll not producing anything and the added cost of the AI service. Why wouldn't the office lay off 25 people or more and tell the remaining people to pivot and take over the tasks of the laid off people that were not amenable to automation. Very much like the great recession. The motto was do more with less and be thankful you have a job.
youtube · AI Jobs · 2026-02-24T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwBS7UdMtu0yICkqNJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwevxEc64EA9CXhd1Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyyiqhWhsSfCMAJtNp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw-yo_rkJq9euG3jAR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugylpb8auxiwfYGoYH94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwt-0ZzjBoXqq3N2BN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxLRScyDbmWmXkseAx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyOlBcXwkQb0rd7nwh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyTHoM1Twk1x1qqxfF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwETB4fe_wqGfMrY114AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
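The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a payload could be parsed and indexed by comment ID, assuming the value sets observed in this batch (the full codebook may define additional values, and the array here is abridged to two entries for illustration):

```python
import json

# Raw model output: a JSON array of coded comments (abridged to two entries).
raw = '''
[
  {"id":"ytc_UgyTHoM1Twk1x1qqxfF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyOlBcXwkQb0rd7nwh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
'''

# Value sets observed in this batch; an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"company", "none", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def index_by_id(records):
    """Map comment ID -> coded record, rejecting out-of-vocabulary values."""
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

coded = index_by_id(json.loads(raw))
print(coded["ytc_UgyTHoM1Twk1x1qqxfF4AaABAg"]["emotion"])  # fear
```

Indexing by ID mirrors the "look up by comment ID" workflow above: once the batch is parsed, the coding for any single comment is a dictionary lookup rather than a scan of the raw text.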