Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "The guy being interviewed doesn't know anything. If you train AI to preserve its…" (ytc_Ugz6iwdnK…)
- "I was sure this test will go ok as soon as I started this video. Curvy road is n…" (ytc_UgwmoCErV…)
- "The people who control AI will be the owners of everything by default. The rest …" (ytc_UgxAUvYHC…)
- "@ProfessorDaveExplains I think you misunderstood that Lex Fridman's guest's s…" (ytr_Ugz5r-GSL…)
- "And just think, Colossus is specifically training AI here in Memphis. And I imag…" (rdc_oa518bu)
- "i love this but i have to admit something i have used ai image generation HOWEVE…" (ytc_UgxdsClpT…)
- "man what the fuck??? Ai is already a huge problem, its already ruined many creat…" (ytc_UgynOZgAX…)
- "The Old Orange Dude told us the jobs are returning to the US to be done by AI or…" (ytc_UgwSIptWv…)
Comment
Around 4:40 you mention that there would be an economic interest in torturing AI into performing tasks.
How do you think there's economic interest in creating an AI that needs to be tortured when you could create a non-learning automaton that does the job perfectly (which you would more than certainly be capable of doing long before creating a torturable AI that does the job well). We used slaves before because we didn't have advanced machines, but as machines replace labor having an intelligent AI perform as slaves would be redundant.
The economic interest would be in creating new machines without AI, not in creating machines with AI that makes them flawed for the job and then beating that AI that we put there to begin with into submission. That just makes zero sense from a practical engineering standpoint, and it's practical engineers who will be deciding this future.
youtube · AI Moral Status · 2017-02-23T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugg6uOok2VP5QHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgglKdwIP2tvZ3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgiBj8trrN2T_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UghgtcFB4IEzDngCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgiWpbxpfu9p6HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugj2C_TxSi954HgCoAEC","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UghHl86Xngak0XgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgiVMj0Ws70W2HgCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgjRmuxIb5d8XHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggGny5a5uCQDHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
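The raw response is a JSON array with one object per coded comment, keyed by comment ID, with the four coding dimensions (responsibility, reasoning, policy, emotion) as string fields. A minimal sketch of looking up one comment's coding from such a response; the `lookup_coding` helper and the two-record sample payload here are illustrative, not part of the tool itself:

```python
import json

# Illustrative raw LLM response: a JSON array of coded comments.
# The dimension values are treated as opaque strings from the coding scheme.
raw_response = """
[
  {"id": "ytc_UgiVMj0Ws70W2HgCoAEC", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UggGny5a5uCQDHgCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            # Drop the ID itself; keep only the coding dimensions.
            return {k: v for k, v in record.items() if k != "id"}
    return None

coding = lookup_coding(raw_response, "ytc_UgiVMj0Ws70W2HgCoAEC")
# coding == {"responsibility": "company", "reasoning": "deontological",
#            "policy": "liability", "emotion": "outrage"}
```

A linear scan is enough at this scale; for many lookups, building a dict from ID to record once would be the natural change.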