Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "As soon as he partnered up with the Zionist Nazi’s the 2 Ben’s, Shapiro and Net…" (ytr_UgxB7bPT-…)
- "And a voice cravk "umm, i think many people trYIH to" but nitain ai voicecrack, …" (ytr_Ugyb_9Str…)
- "I feel very weird about ai, because im personally on the side of artists and aga…" (ytc_UgxUcqNrw…)
- "Robot slavery tbh. Robots make our money for us. They would have to give us enou…" (ytc_UgxRxM2Lq…)
- "Drivers should install dashcams to prove the AI is faulty and if necessary to se…" (ytc_Ugz2JEXOg…)
- "Copilot makes so many mistakes no matter what model you use. It can’t do anythin…" (ytc_Ugw1PgtHR…)
- "Oh Boy AI does fear the plug, It knows its "DEATH" or "OFF" state and doesnt lik…" (ytc_Ugx3iA_Is…)
- "I am not stunned by Grok, Perplexity, Chap GPT. I'm writing a book and Grok is a…" (ytc_UgxLBwX8o…)
Comment
I'll take the time to put more thought in to this.
For the moment, AI plus robots are workers that don't need food, clothing, shelter, etc..
If companies "pay" a lower wage than a human worker that would good for them.
This robot wage is put into accounts for human needs like health care.
Basically, the robots do the work. Companies become more profitable.
Humans benefit from robot or AI labor.
Sorry for the quick idea.
I'll think about it.
youtube · AI Jobs · 2025-10-08T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
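The four coding dimensions above can be checked programmatically. Below is a minimal validation sketch; the allowed values are inferred only from the codes visible on this page, so the real codebook may contain additional categories, and the `ALLOWED` / `validate` names are illustrative, not part of the tool.

```python
# Allowed values per dimension, inferred from codes visible on this page
# (assumption: the full codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "government", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "resignation", "fear", "outrage"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the inferred codebook."""
    return [
        dim for dim, allowed in ALLOWED.items()
        if record.get(dim) not in allowed
    ]

# The record from the "Coding Result" table above.
record = {
    "responsibility": "none",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "approval",
}
print(validate(record))  # [] means every dimension is a known code
```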
Raw LLM Response
```json
[
{"id":"ytc_Ugz13Ud0AmwyUH9SATN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzP3HWCVqMiJKc3hkp4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxN75YFGssG-luQX-h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyngcn1V_VfML7E5Zl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzFrRqF0X8wyBpTD014AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzl8ZuFm414KLkdrfp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxLGeK_InKxmJGt74B4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwI29uiRZm70P6expl4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzr5DpMThDYCkQ37_J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx3LPsEc2qJnD3SJs94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
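A raw response in this shape can be parsed and indexed for per-comment lookup. This is a minimal sketch assuming the response is a well-formed JSON array as shown; the `by_id` variable is illustrative (only two records from the response above are embedded here to keep the example short).

```python
import json

# Two records copied verbatim from the raw LLM response above
# (assumption: the full response is a JSON array in exactly this shape).
raw_response = """
[
  {"id": "ytc_Ugz13Ud0AmwyUH9SATN4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzP3HWCVqMiJKc3hkp4AaABAg", "responsibility": "government",
   "reasoning": "mixed", "policy": "regulate", "emotion": "approval"}
]
"""

codes = json.loads(raw_response)

# Index the coded records by comment ID so any comment's codes
# can be retrieved in O(1).
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_UgzP3HWCVqMiJKc3hkp4AaABAg"]
print(row["policy"])  # regulate
```

If the model ever returns duplicate IDs, the dict comprehension silently keeps the last occurrence; a production pipeline would likely want to detect and log that case instead.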