Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below — click any sample to inspect its coding.
Less concerned about the ai, but the percentages who agree with you underneath i…
ytc_UgyG3ViyZ…
Nobody cares! Let’s be real 99% of people rather speak to A.i then any call cent…
ytc_UgymMgZyI…
They were real photos. AI is a copout. We've all been young men, we all know the…
ytc_Ugy4nJx0F…
Yes, Tesla autopilot doesn't reduce speed based on the road conditions most of t…
ytc_Ugy8AeQ4w…
What's crazy is that all those smart people at WAYMO never considered School Bus…
ytc_UgzxwGUmg…
I'm pretty sure coding has an AI issue too where people only use AI to make code…
ytc_UgzfOlM6A…
It all depends on how you asked the question.
I asked the question differently…
ytc_UgwPnIh3u…
I have a tolerance for ai, when it is used correctly and transparently
Want to …
ytc_UgwzdRDsj…
Comment
It's interesting when he said people in silicon valley aren't happy with what they are doing, regarding ai.
That's it's a race to be the winner, without considering the consequences.
Scientists at the Manhattan project understood the consequences, some even cried.
Yet over 2,000 atomic tests have been carried out worldwide ( this doesn't include actual warfair, dirty bombs or nuclear power disasters)
With civilians ask why do so many people have cancer?
Whether it's from gathering scientist in operation paper clip or becoming the global leader in soft drinks, the list of mind boggling stupidity to be "The first/ best/ biggest", shows human behaviour which cannot be fathomed!!
youtube
AI Moral Status
2025-12-21T17:2…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response (JSON array; one object per coded comment in the batch)
```json
[
{"id":"ytc_UgxThS4ajTzdmbPhgd54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzwLfNfzsKT_cI5DrN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwaItbAkUtzbepUd554AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwq5g8rcvOi4hrbOXJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzCNCt-ksMFts7oPRR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugwjc46jO8ndMCXFn9d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwHz2BD-bcTNtTpKr14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyv21qBbsdzbbhpW6d4AaABAg","responsibility":"company","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwNQQSE9i7l7JrZeKp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzAw-O5aJKft83lsAV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
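A minimal sketch of the lookup-by-ID step, assuming the raw response always parses as a JSON array of objects with the fields shown above (the two entries below are copied from the sample response; the variable names are illustrative, not part of the tool):

```python
import json

# Raw model output in the format shown above: a JSON array,
# one object per comment ID with the four coded dimensions.
raw_response = '''[
{"id":"ytc_Ugwq5g8rcvOi4hrbOXJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzCNCt-ksMFts7oPRR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"mixed"}
]'''

# Index the batch by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

entry = codes_by_id["ytc_Ugwq5g8rcvOi4hrbOXJ4AaABAg"]
print(entry["responsibility"], entry["policy"])  # developer liability
```

Note that the Coding Result table above is exactly this lookup rendered for one ID: its row values (developer / consequentialist / liability / fear) match the fourth object in the raw array.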