Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "It is called artificial intelligence, not comparable human intelligence. A two …" (ytc_UgwXtI5DT…)
- "Racism isn’t real! All these AI creators just make endless slop to target the bl…" (ytr_UgyEISJ3K…)
- "As an aircraft pilot that uses an \"Autopilot\", I still have to be ready to take …" (ytc_UgwA-cuAl…)
- "In 42:20, Pope clings to the idea of the AI being a lot like a lookup table. Thi…" (ytc_UgyM84xey…)
- "@greghelton4668 Waymo and Tesla are not running similar systems - that's the cor…" (ytr_UgxtQY-2R…)
- "Lovely news, explained easy. View of an EU citizen, the future de-nationalised E…" (ytc_UgxqVnRMA…)
- "Really informative podcast. I think he's underestimating the impact of AI though…" (ytc_Ugx_TPmWu…)
- "Its like the Amazon physical store was just a bunch on Indians using webcams wat…" (ytc_Ugy-vOkK4…)
Comment
Didn't see that one coming? ELIZA, written in 1964, "an early natural language processing computer program", contained just 420 lines of code. It worked by repeating what was prompted: "I feel tired." Response "Why do you feel tired?" or "How do you feel about being tired?", etc. The researchers' secretary started asking for alone time with the terminal so she could discuss her personal problems with it. There are ways of fixing the dangers of current and future AI but capitalism will never accept them; just too much money to be made.
Source: youtube · AI Moral Status · 2025-06-04T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxyftFdJiG-Wtb-Uyl4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwhyXYdZmIkyA4n3kR4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_Ugyi7aotmTeW0hGbjFJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwaentiQjN-zkwW6nZ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy8E7LoqMKAlvsv9a94AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxp2O6OE7eg5EOQ5nV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_Ugw54apVsj0EYfyaVXl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzhrihmzEGQ56AbH4d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyl3AIaLNpFZhAgKcl4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzSo0aENwcAMC3AMg14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
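As a minimal sketch of the "look up by comment ID" step, assuming the raw LLM response is a JSON array of rows shaped like the one above (the two sample rows below reuse IDs and values from that array; the function name `index_by_id` is hypothetical, not part of any pipeline):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment.
# These two rows are copied from the sample response above.
raw_response = """
[
  {"id": "ytc_Ugyi7aotmTeW0hGbjFJ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxyftFdJiG-Wtb-Uyl4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw response and index the coded rows by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_by_id(raw_response)
row = codes["ytc_Ugyi7aotmTeW0hGbjFJ4AaABAg"]
print(row["emotion"])  # → indifference
```

With the rows indexed this way, the detail view for any coded comment is a single dictionary lookup on its ID.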