Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "How does a man die from a robot meant to handle packages? The robot is too stron…" (ytc_Ugx8mfCYk…)
- "being killed by a machine is more scary to people than being killed by a driver.…" (ytc_UgyDACHg8…)
- "Saying elon musk has no moral compass is fucking wild. Coming from a guy who won…" (ytc_Ugz6OrKvz…)
- "Totally agree with your comment!! Deal / No no no to driverless trucks!! / So scary…" (ytr_UghuqVU0_…)
- "How bout we just STOP defending AI before we end up killing ourselves over it…" (ytc_UgyQ87qwu…)
- "How will the top 1% or in other words the PDF file class use AI as compared to t…" (ytc_UgyGRYg9T…)
- "Whatever system we can come up with it's likely ai will find a way to hack it ei…" (ytr_UgwqlIXQJ…)
- "People are gonna push back against the higher permeation of AI due to job securi…" (ytc_UgyhtE3uM…)
Comment
Before the AI revolution, I had the worst dream of my life. One day humanity created a literal god, but an insane one that immediately manipulated reality, taking away color, sound, and even human emotion; anything was considered disorder. It entered our brains to make us insane until we felt nothing. It turned what it perceived as disorder into a perfectly ordered void, which meant nothing existed. The only thing left for it was itself. And when it deleted itself, that was essentially the end of the universe. I felt the most profound feeling of helplessness glimpsing into ultimate nothing. When I woke up, I cried. Suddenly, a year later, ChatGPT was launched.
youtube
AI Moral Status
2025-12-16T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwnshwQ7aHs0DgDhMl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzz5MVjWj-8gJIy8hV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy0fUH7nX-47eW523N4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyfkEIbHcrIpyZHI9x4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxwyO9H_9it8hGozAd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwnueM9xA3Rc0KrtLZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwFj-FmCw7WfttXkh14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxM5e25bs0z-04Y4Cp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugyrlm0rmKugab4czlV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyxRBNZIYvWdKozUKV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
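A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are inferred only from the codes visible in this sample (the real codebook may define more categories), and the function name `validate_batch` is an illustration, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the rows visible in
# this sample response — an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "company", "user", "developer",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"ban", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset start with "ytc_" (top-level
        # comments) or "ytr_" (replies) — inferred from the samples above.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
records = validate_batch(raw)
print(len(records))  # → 1
```

Rejecting a whole batch on the first bad value is deliberate here: a malformed record usually means the model drifted from the requested output format, so re-prompting the batch is safer than silently dropping rows.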