Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "The robot was scared for it's life. The man that was shot was holding a glass of…" (`rdc_f8tg321`)
- "I don’t like how the armour on the robot looks very similar to the armour on the…" (`ytc_UgykDuHQG…`)
- "@johnhuddleston6981 Thank you for your comment! That robot is definitely not to …" (`ytr_UgzYuDmmY…`)
- "Because people want to create art to the highest standard without putting in any…" (`rdc_o5pybid`)
- "AI art can never replicate real art without any sort of mistakes, or using a uni…" (`ytc_UgyzHfjiX…`)
- "In a few years Tesla will be in a league of its own. It will completely out pric…" (`ytc_UgxMteOJn…`)
- "@lasermouthful the mentally ill person believes the piece of wood is talking to …" (`ytr_Ugwqzt58J…`)
- "I was told in an AI workshop at a teacher conference about the possibilities of …" (`ytc_UgyE3a1Kz…`)
Comment
Computer development always worked exponentially (Moore's law, 1965 already). People like Hinton knew that when they started developing AI 50 years ago. Knowing this, it wouldn't have needed a genius to predict not only the emergence of a working AI in a few decades, but also that it will become much smarter than humans at least 40 or 30 years ago, and prepare for that. They didn't, because they didn't care, or because they were not smart enough to realize it, or they were bribed to forget about that. Now we have to face our potential extinction, and old people trying to explain themselves and to beg for forgiveness.
The scary thing is, that even it AI generally turns out to be benevolent to us, we all would end up without anything meaningful to do as work. Humans would degenerate within a few generations to some kind of roaming apes, pampered by robots. Phew ... I'm really happy that I'm closing up to 70 right now.
youtube · AI Governance · 2025-09-08T15:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
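Each dimension in the table takes one value from a small closed codebook. The codebooks below are inferred from the values visible in this batch (they are an assumption, not the tool's actual schema; the real codebooks may contain further categories), but they are enough to sketch how a coded record could be validated before display:

```python
# Assumed codebooks, inferred only from values visible in this batch.
CODEBOOKS = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value is missing or outside its codebook."""
    return [dim for dim, allowed in CODEBOOKS.items()
            if record.get(dim) not in allowed]

# The record shown in the Coding Result table above passes cleanly:
record = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "none", "emotion": "outrage"}
print(validate(record))  # []
```

A record with an out-of-codebook value (e.g. a hallucinated label from the model) would come back as a non-empty list, which a viewer like this one could flag instead of rendering.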
Raw LLM Response
[
{"id":"ytc_UgzirJYpapHpTI3oSvV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx1XWPHwrei-YQRUO94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKg7LlvRFqH_FECZN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzknBUmybOJvk10Rk14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgymWHZChAa3x7RJvcp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxuem5FVz9AQ2T8I7l4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzLBJE-SFH9e6-Rgjx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzYsso-mkKK8bEVJMZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwjAlGlYWTjChStgbd4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxpeXLQAIr9-NMAFdV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
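Because the raw response is a JSON array of records keyed by comment ID, "look up by comment ID" reduces to parsing the array and indexing it. A minimal sketch (variable names are illustrative, not part of the tool; the response is truncated to two of the records shown above):

```python
import json

# Raw model output, abbreviated to two records from the batch above.
raw_response = """
[
  {"id": "ytc_UgzirJYpapHpTI3oSvV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwjAlGlYWTjChStgbd4AaABAg", "responsibility": "government",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index the coded records by comment ID for constant-time lookup.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

coded = records["ytc_UgzirJYpapHpTI3oSvV4AaABAg"]
print(coded["emotion"])  # outrage
```

The same index supports the table view: each looked-up record supplies one dimension/value pair per row.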