Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or inspect one of the random samples below.
- `rdc_l4qfg5a`: "This isn't true. Capitalism does not require products to become cheaper. If it b…"
- `ytc_Ugx65urSs…`: "I wud say we have humans that pose more of a "risk to society", AIs are the leas…"
- `ytc_UgyFND6mx…`: "so much bullshiidoss,iif chatgpt say go die you run for death?or kiill other for…"
- `ytc_Ugx9bhvSL…`: "I'm older. Tech executives think that what programmers do is write code. What …"
- `ytc_UgyTlqMp2…`: "People need to use their discernment and realize AI is man-made. It will never b…"
- `ytc_UgyiXq_e_…`: "I'm flabbergasted. 1. "32-hour workweek with no loss in pay" - that is a great …"
- `rdc_ohzkvtv`: "That's because consumers are not their customers. Their customers are corporatio…"
- `ytc_UgxAUh66z…`: "If a job can be done by a robot, it's not worthy of a human being doing it. 32hr…"
Comment
Neil De Grasse Tyson once said, "What ever effort it takes to go to mars and terraform it, it would take far less effort to fix Earth." Why hasn't he reached the same conclusion about AI? Whatever money, water, and energy it takes to train AI and reach AGI, it would take less resources to fix the education system and develop smarter and more creative people around the world? People have become obsessed with AI. It's tiring. These AI type videos are getting way too much airtime and views.
youtube · AI Moral Status · 2026-03-01T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxMUKhvT2axm6zay-d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwYT7SqIQREwO_nf9J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy55HRusgII_JqtZ9B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwmYA3w_Qs59dN_y654AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyzAkfq4ft_jyJa2t54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOl62x8kwtxGngEU54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx7tiWXl895cx2tfgZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugw9WPHHmrgMbtn9UnV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxmDxH5b-VFniKDwFt4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugx3Q27nZJB4-FBWHrp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
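The raw response is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such a response could be parsed and validated before ingestion, assuming the value sets observed in this sample are the complete codebook (the actual codebook may include other values, and the `validate_codes` helper is hypothetical, not part of the tool shown):

```python
import json

# Assumed allowed values per dimension, inferred from the sample responses
# above; the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "distributed"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological", "contractualist"},
    "policy": {"unclear", "none", "regulate", "ban", "liability"},
    "emotion": {"approval", "fear", "indifference", "outrage", "resignation", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a record without a comment ID cannot be joined back
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Demo with a hypothetical record (ID invented for illustration):
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}]'
print(len(validate_codes(raw)))  # 1
```

Dropping malformed records rather than raising keeps a batch usable when the model occasionally emits an off-codebook value; the rejected IDs could then be queued for re-coding.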