# Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking a response up directly by comment ID or by browsing the random samples below.
## Random samples

- "I will never use ai art, but your beginner art is better than anything I could a…" (ytc_UgxW1Hpt2…)
- "Wow now that's messed up!!! Here we go the AI are going to take over, this is th…" (ytc_UgylO6Wor…)
- "I hope people realize that you can’t drink data or AI. Hopefully before it’s too…" (ytc_UgwrLtlZl…)
- "We're not worried about the machines. We are worried about what the machines wil…" (rdc_f8t5v5i)
- "I don't need my job - I just need money. Give me a basic social income - and let…" (ytc_UgxAWfvYH…)
- "Alpha Go is not a good example, since it has been beaten. Article title: "Man be…" (ytc_Ugz6qXKYo…)
- "Never gonna happen lol. Price is way too high even on an international level, le…" (rdc_jrzrh31)
- "😂😂😂 it's about time s*** cuz all your women do is answer the phone with attitude…" (ytc_Ugxze2eQA…)
## Selected comment

> Problem is we have a world built around money,it costs money to make ai...I think if what everyone is saying is true..then essentially technology will price itself out because no one will be able to afford to have it except a very small percentage of people and then they will not be able to afford it and innovation will come to a stand still just like it always does..so at no point can ai take completely over and if it does then it will just make itself obsolete..we won't need them because there will be nothing for us to use them for and so on.. it's kind of like having too much food,you can certainly have it but you can't possibly eat all of it without dieing

youtube · AI Jobs · 2026-01-09T08:2…
## Coding result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM response
```json
[
{"id":"ytc_UgwkwbbEWyvelAdh3FR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzFl1m_4UfmqYSy0it4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7f99_ORJx7MwL0F54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyIHF1Ia7vM0T6zLtN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw5Yp2QeCWP74rMKjB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyOWs9BNmiANPCDx2h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy0oHPHbWU6TPhG6El4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzddcH9Cb-n-2FTRzt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxYeWX38RBg0uolklp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgykxpFfhfFtffi-UOF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
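Before a batch response like the one above is stored, it helps to parse and validate it, since LLM output is not guaranteed to be well-formed. The sketch below is a minimal, hypothetical validator in Python: the `SCHEMA` of allowed values is inferred only from the labels visible in this page's responses, not from the project's actual codebook, and `validate_coding` is an illustrative helper name.

```python
import json

# Hypothetical schema: allowed values per dimension, inferred from the
# labels seen in raw responses on this page (not the official codebook).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "company",
                       "government", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "fear",
                "resignation", "unclear"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only records that have
    an id and a schema-conformant value for every dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Example: the second record uses a label outside the schema and is dropped.
raw = ('[{"id":"ytc_example1","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"},'
       '{"id":"ytc_example2","responsibility":"robots",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
kept = validate_coding(raw)
print([rec["id"] for rec in kept])  # → ['ytc_example1']
```

Dropping (rather than repairing) malformed records keeps the stored codings trustworthy; the rejected IDs can simply be re-queued for another coding pass.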