Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples

- ytc_UgwEq3GI4…: I think theres a lot of people a bit insecure about this. AI doesnt stop you cr…
- ytc_UgyFrpR6x…: I am a lifetime painter and I have often wondered if a robot will ever walk unev…
- ytr_UgwlOnX4U…: 😇Great comment! Absolutely right. I too have learned so much over the years from…
- ytc_UgyKwXC8-…: Remember: The only jobs safe from AI are the jobs in which youbdetermine who doe…
- ytc_Ugy4vgvUA…: the thing is, if all artists do somehow quit art, (or rather the digital/online …
- ytr_UgzkC895F…: @disorderandregression9278 Im a professional 3D artist, i often use local AI to …
- ytc_UgwASQodL…: Blue collar workers think AI will not affect them. Wrong. Manual jobs will be th…
- ytc_UgwQE4GjU…: If AI is to be our "partner" that means it will have a say in things. Hmm 🤔???…
Comment
We are already too far down a way of no return of exhausting all our resources and dooming our species to destroy itself, worrying about something like super AI apocalypse because a chatbot is good at tricking people into thinking it's intelligent, when we are 50 or less years from the breakdown of global society, is just stupid. Also, why worry about it, like, now? Nothing has changed, no advancement in AI was made - just more money being thrown at an old concept of LLMs. Why not make this book 20 years ago, when AI was beating chess grandmasters? Why not make it 10 years ago, when AI took over daytrading on stock market?
Source: youtube · Video: AI Moral Status · Posted: 2025-11-01T18:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwqDZPwS0sJhzustSl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwfVIgjc9RUVbtK2Yx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwBkZ0RB2dzvKO0Wc54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyRx8kIRspv6bsRE4J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxDgzcIUZXZuAzgHSR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzYdePcFg5OXhfaaV14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz8uz5IUjT5JCw33wF4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy3sz23nrUfIxdlCFJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwnojViKzl0G8CMj794AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyH4hSorqWq8zxU7AN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```