## Raw LLM Responses

Inspect the exact model output for any coded comment. Look a comment up by its ID, or pick one of the random samples below.

### Random samples (click to inspect)
- `ytc_Ugy7XLD2K…`: "I Personally think that using AI images in general isn't bad, like me and my bud…"
- `ytr_UgzeeEdpT…`: "AI was made using other peoples work though. Its training set is literally compr…"
- `ytc_Ugx1CJJ-B…`: "How funny would it be if they have loud external speakers and then hear one Way…"
- `ytc_UgwpWrmYe…`: "Ai detectors objectively don't work meaningfully well enough to give a fair asse…"
- `ytc_UgzsoXVyg…`: "Self driving cars will make ride hailing cheaper than owning a car, which will l…"
- `ytc_UgwMDFgUA…`: "Of course they're crybabies, they let the ai do the thinking for them. No more l…"
- `ytc_UgwInwJr3…`: "I think everyone forgets the bigger picture, so we gain super intelligence, us a…"
- `ytc_Ugxl_4z-m…`: "I cant see AI being able to take over all jobs especially in countries like amer…"
### Comment

> Most AI channels are shonks. There's too much talk about AI processes, but not much about results. The vast majority of AI users don't make any more money, and the vast majority of companies don't make bigger profits using AI. The whole AI tech industry is based on a fancy shiny new tool which doesn't do as much as they say. It's a con.

youtube · AI Moral Status · 2026-03-30T08:5… · ♥ 1
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
### Raw LLM Response

```json
[
{"id":"ytc_UgxDH4I00pEQiTqNVwl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgysgaxRySe2664aTqt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx0mDHNZpWtCLRtK3J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwc9AETtnp2NcGjvOB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy7JY5EJu6WYxEEOBR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyCp8wX_If0rdf1fHh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxPTjaV6HbctVYoXWt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxM3oVjRU4ofFEXNAB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzv9m5IH9n1ls4EfPV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZEJIulBXsEi35Owt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
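A raw batch response like the one above is only usable if every record parses and every dimension holds an expected category. Below is a minimal validation sketch: the `SCHEMA` sets are assumptions inferred solely from the values visible on this page (the real codebook may define more categories), and `validate_response` is a hypothetical helper, not part of any pipeline shown here.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# sample output on this page; the actual codebook may include more.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "developer",
                       "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check each record.

    Raises ValueError on a malformed ID or an out-of-schema value,
    otherwise returns the parsed list of records.
    """
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment ID: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records
```

Validating at ingest time, before records reach the coding-result table, means a model that drifts into unlisted categories fails loudly instead of silently polluting downstream counts.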