Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I love the errors AI makes as someone whos new to learning programming. I don't… (ytc_Ugy2VzdFi…)
- I think this perfectly explains what people mean when they say ai art is "solele… (ytc_Ugzzp2El2…)
- Is a truck driver a save lives every day I cannot see a robot doing what I do...… (ytc_UgxvhEe28…)
- When the AI bubble bursts the economic crash will the catastrophic for you and m… (ytc_Ugz0MeVNT…)
- They are not selling consultation, they are selling the ability to put blame on … (rdc_n7tpqpm)
- we didnt need ai to ruin everything, weve been doing it for centuries it just ma… (ytc_Ugy90Zki2…)
- This is the best example I've seen about what AI assistants are and how one shou… (ytc_UgxtvY-p0…)
- @helpfulbot123 When it happens, we wont even know until its over. Because AI wil… (ytr_UgxeXNExX…)
Comment
If Humans don’t wake up today then humans won’t get to see tomorrow, I personally don’t have any hopes because of how stupid humanity has become to even see what is happening and what those behind it are doing to our future as humans. For all you dumb people that can’t even understand what those behind the so called AI are using the AI to create a 500 million force of human being slaves for just the 1% super wealthy individuals and depopulate the entire human race on this planet. This is their agenda and reach by the end of 2050.
youtube · AI Moral Status · 2025-10-13T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz4pm-KXvX40IvvS3R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyRh_QaJD4ZEL1nicV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxeAXAkZB68t3LRgUJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwvA-78rT4UxD15KtJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxHM9Fqq3t0Hq7EBdh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxGmmYgd_rLvuTPbHx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_Ugzc0FmuqK4UWQntKYF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzuYgT342Kw7UZsajt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyH4WY8KNbYV6Q8MmJ4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxN_Iqa2gVSGQ0pgC54AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"}
]
```
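As a minimal sketch (in Python; this is an illustrative assumption, not the tool's actual implementation), the lookup-by-comment-ID view can be built by parsing a raw response like the one above and indexing its records by the `id` field. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response format shown; the two sample records are copied from it.

```python
import json

# A raw LLM response is a JSON array of coded records, one per comment,
# keyed by the platform comment ID. Two records copied from the response above:
raw_response = """[
  {"id": "ytc_UgxHM9Fqq3t0Hq7EBdh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzc0FmuqK4UWQntKYF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw model response and index the coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxHM9Fqq3t0Hq7EBdh4AaABAg"]["policy"])  # regulate
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when one batch response codes many comments.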