Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Why not create a gradient of rights based on the complexity of the AI? Toasters …
ytr_UgyDLUjEe…
Ai will def think they are better. They dont have to die or shit. They will see …
ytc_Ugzk4Z27Q…
yeah lets connect everything to the internet and let the godamn program learn fr…
ytc_Ugyrv1ItH…
Love your channel but you need to stop using A.I. art. You're stealing from actu…
ytc_UgzddpUTw…
Management is the same ones who say "We're buying AI so there will be job cuts."…
ytc_UgwvtQ-Nh…
my dad says its great that ai is replacing coders because it makes things more e…
ytc_Ugxi6Kb6f…
If AI has the potential of rising to the top of the chain, there is no stopping …
ytc_UgwTlQIy_…
I agree, development should be halted until there's been a wider discussion on w…
ytc_UgxbEJUmE…
Comment
What makes me think AI-bros are completely hypocrital is: If their goal is to replace humans with AI in all jobs (see 9:45) and make money off the things (im not planning to call an algorithmical soul-less image art) then who's gonna consume those things if the people will have no money due to the lack of jobs made possible thanks to AI? Then who's gonna give them the money? And if AI is going to replace all jobs in all industries, then that means these people will be replaced too.
Are these guys hypocrital or just really dumb? Also, this lack of empathy towards artists just shows how disconnected to art these AI-bros are.
youtube
Viral AI Reaction
2023-03-05T23:0…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwtuTdt2QawEWY6bq94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugza2X6nQyw3oxiRLyB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzsYYvZrD1SJMoeJBx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxXVSJcAv7gQ43Ecxl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzduJCwKcTrj1BQ_xB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzlzdJYnWu9W5MTXVt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwW_oLB3wzTOLm6T0t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWrQ5KWIdq3JAB1b94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzP0k4iLfENOjIzLUd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOqL2rrF7A_DAkXEd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
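The raw response above is a JSON array of per-comment records, one object per coded comment, with four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of the "look up by comment ID" step could parse that array and validate each record against the value sets that appear in the responses above. The `DIMENSIONS` value sets are inferred only from the examples shown here, not from the full codebook, and the `lookup` helper is hypothetical:

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# sample responses above; the project's actual codebook may define more.
DIMENSIONS = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"unclear", "none", "liability", "regulate"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def lookup(raw_response: str, comment_id: str):
    """Parse a raw LLM response (JSON array of coded records) and return
    the record matching comment_id, validating each dimension's value.
    Returns None if the ID is not present in the response."""
    records = json.loads(raw_response)
    for rec in records:
        if rec.get("id") == comment_id:
            for dim, allowed in DIMENSIONS.items():
                if rec.get(dim) not in allowed:
                    raise ValueError(f"invalid {dim!r} value: {rec.get(dim)!r}")
            return rec
    return None
```

For example, looking up a coded record by its full ID returns the dimension dict, while an unknown ID yields `None`; an out-of-vocabulary value raises, which is one way to catch LLM responses that drift from the codebook.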