Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples
| Comment excerpt | Comment ID |
|---|---|
| Glad to see the majority of comments being realistic. AI isn’t going to benefit … | ytc_UgxhZND7l… |
| If yiu think that’s real looking you need to get out more reminds me of those ma… | ytc_UgyPhXiQ1… |
| Shouldn't AI be smart enough to get that not all humans are making the world a h… | ytc_UgxPG3rTY… |
| 45:57 It might be good for are all the lonely people who get catfished and scamm… | ytc_UgzWlJkTm… |
| If people are going to use AI for evil, then so be it. It shouldn't be withheld … | ytc_Ugw9mYyfF… |
| I seen a video we’re one Asian woman was able to open another Asian woman’s phon… | ytc_Ugyzt8pQN… |
| The vast majority of all AI is made using stolen content. Not just AI art or "cr… | ytc_UgwEizGkT… |
| No, a GPT with a DAN promot is guessing the next word repeatedly to generate wha… | rdc_jcl6b59 |
Comment
We gotta be realistic here, Neil can't comprehend the possibility of society turning into a full-scale consumerist society, considering AI will replace his job too! We will consume knowledge, wants, needs, probably beyond our imagination. Is that good or bad? We gotta look at individual cases per person, we will need to regulate ourselves in a world where, let's be real, nobody will be NEEDED for society to grow in technology and efficiency and self-sustainability. We will become the zoo animals around an ever-growing city, free to live among this creation.
Unless people don't learn that capitalism and this constant race to something is going to crash and burn society, AS IT HAPPENED MANY TIMES.
So, I will absolutely be in favor of an AGI led society, as a person who trusts science over my personal emotions.
| Source | Title | Timestamp |
|---|---|---|
| youtube | AI Moral Status | 2025-07-24T01:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
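Each coding assigns one value per dimension. The value sets below are inferred from the table above and the raw responses in this section; treating them as the complete, closed label sets is an assumption. A minimal sketch of a coded record with validation:

```python
from dataclasses import dataclass

# Label sets per dimension, inferred from the codings shown on this page;
# the full sets used by the coder are an assumption.
RESPONSIBILITY = {"none", "company", "developer", "user", "distributed"}
REASONING = {"unclear", "consequentialist", "deontological"}
POLICY = {"none", "regulate", "liability"}
EMOTION = {"indifference", "approval", "fear", "outrage", "resignation"}


@dataclass
class Coding:
    """One coded comment: a comment ID plus one label per dimension."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # True only if every dimension carries a known label.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)


# The coding from the table above, attached to a hypothetical comment ID.
coding = Coding("comment-id", "distributed", "consequentialist",
                "regulate", "resignation")
assert coding.is_valid()
```

Validating against closed label sets catches coder drift (e.g. a model emitting an unlisted emotion) before the coding enters the dataset.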
Raw LLM Response
[
{"id":"ytc_Ugzk90dzqrw2zGBYFbN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwzTai3twqZhvAQvrx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxvSb0iSx_lF2T73s54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy77ADopFHYUlFEWLp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxrfBnHDGAGpxl2WWx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJzUJ1OBKl0SVEYBh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwsxLxO5ks7xWc24QB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwzt812ZWTrATooYLV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyPxtp8ylY5EryfKi54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzhBJ8fzNntdzJebth4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]