# Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below:

- "1. Your claim is fundamentally false. The so called „AIs” work nothing like anim…" (ytr_Ugy8NVPc6…)
- "you wont stop the robots or A.I just by not buying from amazon. You would have …" (ytc_UgxlamBwX…)
- "I wonder if some of the AI bros are going through the dunning kruger effect when…" (ytc_Ugw9n2GNg…)
- "@junkie2100blah blah blah. Ai simps like you think youre so smart. We dont need…" (ytr_UgwIYEkHQ…)
- "A LOT of this video is based on the claim that AI art will disrupt the human art…" (ytc_UgznUk9iv…)
- "And still they just continued the development of AI. If your car is speeding tow…" (ytc_UgwmgfQha…)
- "ai cannot be inspired. its not sentient lmao. Do some research on the human brai…" (ytr_Ugwx4TpxG…)
- "You fools. Slavery is more likely to return than irobot being real. The AI will …" (ytc_Ugw-8JZ8M…)
## Comment

> lol people are freaking out over AI for no reason... this is one of those technologies that levels off quickly... kinda like the airplane did. If you'd been around in the 30s, and 40s, when airplanes were really starting to catch their stride, you would think that by now we would be rocketing around the earth in minutes. didn't happen. technology got to a certain point and leveled off... everything points toward AI doing the same. we're not going to get AGI, at least not with LLMs. yall freaking out are gunna look back in 20 years and laugh.

youtube · AI Responsibility · 2025-07-28T02:5…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgxB-9kHQwbb7fHG2tx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxUxNG7cTPQbe9QnS14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzVAOcuRgaFEGIiD3R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx7qXS-QdBP6CbnljR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwdqz4enlNC3-_l9NV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"unclear"},
  {"id":"ytc_UgwcDNTd8ARakHq2EqN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyl796yAI3y8VaX1N54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwyMCHd1mFe_BB4YCl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxhpL5Qu7cHfLBURaJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxyrii6Rx67PJQ8dG14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
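Looking up a single comment's codes in a batch response like the one above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: it assumes the raw LLM response parses as a JSON array of objects with an `id` field plus the four coding dimensions, and the lookup ID is one taken from the array above.

```python
import json

# Hypothetical raw response: a one-element excerpt of the array shown above.
raw_response = """[
  {"id": "ytc_UgzVAOcuRgaFEGIiD3R4AaABAg",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "indifference"}
]"""

# Index the coded rows by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the coded dimensions for one comment.
code = codes_by_id["ytc_UgzVAOcuRgaFEGIiD3R4AaABAg"]
print(code["responsibility"], code["emotion"])  # → none indifference
```

A dict keyed by comment ID mirrors the "look up by comment ID" workflow: once the batch response is indexed, each inspected comment's row can be retrieved without rescanning the array.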