Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up directly by comment ID, or via the random samples below (previews truncated):

- `ytc_Ugys_bIjy…` — "That's why AI can't understand that the English language nor can it speak the En…"
- `ytc_UgyBxJhxZ…` — "Yeahhh these are more like sidenotes on why you shouldn't be using ai to replace…"
- `ytc_Ugw4-93bx…` — "I'm not really sure what the inconsistency is here. Seems like ChatGPT actually …"
- `ytc_Ugww_b_bM…` — "If Amazon could they would probably fire all 1.5m people working for them if AI …"
- `ytc_UgzuFS1Uo…` — "The hardest stuff is the easiet to automate. Kind of looks like if the more proc…"
- `ytc_Ugxw5oF50…` — "There is another problem, with no possible solution: Midjourney's Blend command …"
- `ytr_UgxchyFSo…` — "@Selion67r You don't think it would look generally more structured and with less…"
- `ytc_UgzeV2THM…` — "The fly in the AI ointment is that if 99% of the population is unemployed , ther…"
Comment

> why can't we not make robots? like really. Making robots means profit for big companies and unemployment for normal people, i do not find a single thing that a robot would be better than any human. And do not start saying they would carry more weight, they would help elderly people and crap like that. Robot = bad thing eventually. No wonder so many big names like Elon Musk warn against Artificial Intelligence. Yes Robots and AI are different things but they are interconnected.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2016-03-24T20:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UghwyofEIEM1NngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UggpGY1ITooF1ngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghjlPCUYup3S3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ughg7zBCqrbkNngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgjZ2Zqe34cE1ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UghB4UeBi1Zx73gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgjV9fJRwZ8Jo3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiYsYo1ZIau43gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UggqpXJ6nOovQHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgibQKlJl_eU9ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
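The raw response is a JSON array of per-comment codings keyed by `id`. A minimal sketch of how such a response might be parsed into a lookup table and sanity-checked is below; the allowed category values are inferred only from the responses shown on this page, so the real codebook may contain more:

```python
import json

# Allowed values per coding dimension, inferred from the sample
# responses above; assumed incomplete relative to the full codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding dict},
    rejecting any value outside the known categories."""
    codings = {}
    for row in json.loads(raw):
        cid = row.pop("id")
        for dim, value in row.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        codings[cid] = row
    return codings

raw = ('[{"id":"ytc_UghjlPCUYup3S3gCoAEC","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
print(parse_codings(raw)["ytc_UghjlPCUYup3S3gCoAEC"]["policy"])  # -> ban
```

Validating against a fixed set of labels like this catches the common failure mode where the model invents a category outside the codebook, which would otherwise silently corrupt downstream counts.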