Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@GeorgeAmaro-j4b do they malfunction more than humans do? Do they cause more d…" (ytr_Ugx7HX_w_…)
- "My biggest fear is that when we get a chernobyl level event and we wont be able …" (ytc_Ugz-7Mw8O…)
- "Pretty much all low/ middle management jobs will disappear. Employees will be re…" (ytc_UgzigGuW-…)
- "Ive been drawing for my whole life i even became heavily disabled recently which…" (ytc_UgykCayT3…)
- "2:01 this art really does have the most sole out of all of all of them and defi…" (ytc_UgxPTdw21…)
- "Luckily if robots do go out of control and get guns and start the first AI war w…" (ytc_UgznghkCC…)
- "AI's response to the religion of Israel is a programmed answer. \"If you cannot d…" (ytc_Ugzuroy0L…)
- "hey Ai overlord sundar pichai ceo thinks ai rights are a downside for humanity. …" (ytc_UgwZv8Jgk…)
Comment
wiping out jobs with robots and ai might be ok assuming we re write our social contracts to take care of people. if we dont, theyll be a lot of poor people with few ways( if any) to make income and that system will collapse anyway.
we can accelerate the issue by ensuring fewer people suffer over a long period of steady job loss by allowing this to happen fast, putting pressure on institutions to make the neccessary changes for this wonderful revolution where people dont have to die in mines or work 80 hours a week to have a life. we can just chill. or at least i hope. maybe its idealistic.
youtube
AI Jobs
2025-10-08T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx9t4mV6cLvkcZ4cDh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgynFDjvLhj_ddJ5XoN4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx0GTzkIftYxjbCB8B4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxeZ_FRl0CV0ldgh014AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxsKpFdEsn9ad8W3hh4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxwM9dneSp0jBS8V5B4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwaBo7SBenViRU0Vmp4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzgSNW-TiRuJN_oNxl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxFEBmDm8TdyvBqzPl4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzUktp78uS9wpgv69N4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
```
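A look-up like the one this page offers (fetch the coded dimensions for a given comment ID out of a raw batch response) can be sketched in Python. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown; the allowed category sets below are inferred from this one sample and are an assumption, not the full codebook.

```python
import json

# Category values observed in the sample batch response above; the real
# codebook may define more -- these sets are an assumption for illustration.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "distributed",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse a raw LLM batch response and index the rows by comment ID,
    rejecting any value outside the (assumed) category sets."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return coded

# One row taken verbatim from the batch response above.
raw = ('[{"id":"ytc_UgwaBo7SBenViRU0Vmp4AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"liability","emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded["ytc_UgwaBo7SBenViRU0Vmp4AaABAg"]["policy"])  # liability
```

Indexing by ID up front makes the "Look up by comment ID" operation a plain dictionary access, and the validation step catches the most common LLM coding failure: a value outside the codebook.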