Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
It largely depends on how much hype AI companies can keep up. A stock value is b…
rdc_n80wgws
People bring personal problems to work, and are not always dependable. Elements …
ytc_UgxHi5tLb…
1. What you said makes no sense. Most of the supporters of AI images generators …
ytr_Ugw71kG18…
I think I heard of a similar result in which an AI trained to survive Tetris for…
ytr_UgwEZHBNy…
I work at one of these companies you mentioned and get a high salary developing …
ytc_Ugwso5QY9…
How did you find out that it was AI🙁? Or did you just notice later?…
ytr_Ugzd5Uir8…
The Black Mirror episode "White Christmas" is relevant to this topic - it even i…
ytc_UgjvyHpzG…
Thanks! I agree with a lot you have to say(and love Click). I just wish people w…
ytc_Ugw5THHD6…
Comment
The probability is that automation _is_ going to be an increasing part of our future. The trick is to target it to places where it can deliver superior service/work (e.g. not services where there is a strong human contact component to customer satisfaction), and find a way to use automation to enhance job availability (new domestic industries, for example) rather than simply kill jobs. That is doable with some public spending, but we have to ask how this can possibly be accomplished in a system bent on gutting social spending.
youtube
AI Jobs
2016-12-27T08:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgjpVTGIy0ORJHgCoAEC", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugh0oU1tNdPt1ngCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjY_tbsCb1KOngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Uggw9rcQR5Y5hngCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgiT0tRlBs-8xHgCoAEC", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UghHYVJ8-6NV2XgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugg17gv7xnc70XgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggGPXzlky_xcHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjUjeipdDsakXgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UghpFe5Lz1X4OHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
```
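The raw response is a JSON array with one record per comment ID and four coding dimensions. A minimal sketch of how such a batch might be parsed and validated before the codes are stored (the allowed category values below are inferred from the examples on this page, not from the full codebook):

```python
import json

# Allowed values per coding dimension. NOTE: these sets are an assumption
# inferred from the sample records shown here; the real codebook may
# define additional categories.
SCHEMA = {
    "responsibility": {"company", "distributed", "ai_itself", "none"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"outrage", "approval", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record is missing a comment id")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
    return records

# One record from the response above, round-tripped through the validator.
raw = ('[{"id":"ytc_UgiT0tRlBs-8xHgCoAEC","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
records = validate_batch(raw)
print(records[0]["policy"])  # regulate
```

Validating against a closed category set like this catches the common failure mode where the model invents an off-schema label, so bad records can be flagged for re-coding rather than silently stored.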