Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "AI learns from humans. What happens when it learns the zionist, fascist, anti hu…" (`ytc_Ugw9NYDcn…`)
- "> Elon Musk went on Tucker Carlson and spoke about AI. He’s building his own …" (`rdc_jhbjdxu`)
- "I met a plumber. He needs an oxygen tank to get by every day because bacteria en…" (`ytc_UgwQaZmC_…`)
- "There is a huge fucking difference between using a digital medium to make art, a…" (`ytc_UgxdjvFvl…`)
- "For all our differences, if there is one thing the humans of the internet can ag…" (`ytc_UgzR7-Eev…`)
- "First robot female president due to her being brains. Vice president robot broth…" (`ytc_UgxUpIRFp…`)
- "Ready are not ? Will be common in homes soon ! Artificial wombs are next ! Deemi…" (`ytc_UgzgbkekG…`)
- "No amount of natural talent or passion makes the art good, it's the artist and t…" (`ytc_UgxJ2Qk-p…`)
Comment
25:50 — My issue with this argument is that it assumes that the new work being created will require the use of the labor that is being replaced. However, the reason that automobiles flourished as an emerging technology isn't because of its ability to replace human labor, right? It took over because automobiles were far more efficient and capable than horses, and had the obvious capacity to significantly improve the lives of consumers. But the reason that AI is taking off isn't because consumers see its ability to exponentially improve their quality of life; AI is taking off specifically because corporations see its potential to replace one of their largest liabilities: THE LABOR FORCE. This is why I think Neil is pushing a false argument, because the same corporations who are developing AI are also the ones who are trying to reduce the labor force.
youtube · AI Moral Status · 2025-07-25T15:0… · ♥ 105
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwS4nd-dahYvo51-tN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwhYRTnwLROXCytwpl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugys9mESqtFeN4AWjax4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugze2G3M1jBTvJcLbjt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzcR_FPYrGCAeGIx3l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgydgUw5fcLyleCO-kB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxS3oCc2tf4UCGWSvN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzZftnpXPNDAFjCdtd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw0DgfrNnlI4Qp3Q1B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxiY7v459zrG2QcXo54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
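Since the raw response is a JSON array of codings keyed by comment ID, looking up any coded comment is a matter of parsing the array and indexing it by `id`. A minimal Python sketch, using two example rows in the same shape as the response above (the variable and field names mirror the coding dimensions; nothing here is a real API):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, one object per
# comment ID. Two example rows, shaped like the response shown above.
raw_response = """
[
  {"id": "ytc_UgwS4nd-dahYvo51-tN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwhYRTnwLROXCytwpl4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

# Index the codings by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by its ID.
coding = codings["ytc_UgwS4nd-dahYvo51-tN4AaABAg"]
print(coding["emotion"])  # -> outrage
```

The same dictionary serves both views of the page: random sampling (pick keys at random) and direct lookup by comment ID.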