Raw LLM Responses

Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples:

- "man will destroy man with AI/robots. the divide between the haves and have nots…" (ytc_UgzRg5SrG…)
- "That's hilarious. AI is going to decide whether or not someone is ready to be ad…" (ytc_UgwlKKdM9…)
- "sam altman: omg ai is so dangerous!!! please put in regulations to enforce my mo…" (ytc_UgzETJ8Xm…)
- "There's never been a better time for young people to learn manual trades such as…" (ytc_Ugya1B8Sa…)
- "Well… before everyone without an AI got hacked like CRAZY and let me tell you… N…" (ytc_Ugzb79kPB…)
- "sadly I don't think this will do much considering the huge volume of data ai ima…" (ytc_UgzNe7RiP…)
- "Click bait! AI isn't here to kill us. But it knows whare all the ped0s sleep ill…" (ytc_UgwavOeTh…)
- "All Im saying is, was is the point of self driving cars if you still need a huma…" (ytc_UgyhCoRhk…)
Comment
The economics AI "Taking all our jobs" simply doesn't compute. Should 9/10 of the 'intellectually mundane employees' find themselves out of work, they will also find themselves economically inactive. That is to say, If everyone is left poor, then businesses will go bust because noone can pay for anything. If people can't afford to contribute to society, well neither can society afford to have people not contributing. Business owners will have to be more heavily taxed, and that money will have to go to some form of Universal Basic Income, so that people can spend and consume. If nothing is done, there will be riots!
youtube · AI Governance · 2025-06-19T12:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzf1_R8bmL_sTp_TTR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRGc4lSPXvlgByAWx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz8KbYMO23bk8skcXR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwmaGlHpcx1c5Z3CwB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz6OrKvz9NttD5GAO14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgytM-qLke1EZiBVtrt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzU9XMBaWTzH4FMVmh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy84N6LmFvlaB9_8_h4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwPwRlpjmc2hB-QG6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzDiBIlIk8DOREADtR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
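A raw batch response like the one above can be parsed, checked against the coding scheme, and indexed for lookup by comment ID. The sketch below is illustrative only: the allowed values are inferred from the codes seen in this batch (the full codebook may define more categories), and the two-record `raw` string is a shortened stand-in for the real model output.

```python
import json

# Shortened stand-in for the raw batch output shown above (two records only).
raw = '''
[
{"id":"ytc_Ugzf1_R8bmL_sTp_TTR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRGc4lSPXvlgByAWx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
'''

# Allowed values per dimension, inferred from this batch; an assumption,
# not the project's official codebook.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "developer", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "indifference", "fear", "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of schema violations for one coded record."""
    errors = []
    for dim, allowed in SCHEMA.items():
        if record.get(dim) not in allowed:
            errors.append(f"{record.get('id', '?')}: bad {dim}={record.get(dim)!r}")
    return errors

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # supports "look up by comment ID"

problems = [e for r in records for e in validate(r)]
print(by_id["ytc_Ugzf1_R8bmL_sTp_TTR4AaABAg"]["emotion"])  # → approval
print(problems)  # → []
```

Building the `by_id` dictionary once makes each subsequent lookup O(1), which matters when the same batch file is inspected repeatedly.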