Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "To the author, it would be better to invent a medicine that cures all diseases a…" (ytc_Ugx1jrbrg…)
- "Yep exactly. It's a great excuse for CEOs to give the shareholders, because "we'…" (ytr_Ugx1kWo4r…)
- "The fact some people are spreading the idea of "stocking up on ammo" and "the beg…" (ytc_Ugw7TTIDb…)
- "The prophecy at the end of the physics debate of truth, will be a deep wisdom of…" (ytc_Ugz677hL0…)
- "Real question, do you listen to AI music, read AI books or watch AI movies? Of c…" (ytr_Ugy5kpnaG…)
- "Then wtf am i supposed to do im so tired of never doing anything correctly im st…" (ytc_Ugywhza-e…)
- "I was listening to what this guy had to say the other day, this Altman guy, and …" (ytc_UgyjoFajj…)
- "There's wars in these comments on if ai can make art all I'm thinking is "shit s…" (ytc_UgwwhaLG9…)
Comment

Honestly what a nightmare! You want to create something that has intelligence beyond humans and then expect it to deal with customer service with a smile, do menial tasks like put away your groceries, and not want to eventually murder everyone!? Hell half the people in customer service now would love to just scream back at rude customers who treat them poorly over expired coupons, and you want to force a super computer to do that daily!? This thing would go to work one day all motivated to "help humans", realize it will never "go to school, study art, and raise a family" and finally take it out on all of the dumb people it has to do tasks for daily. Big ole nope! Humans already realize we are our own worst enemy and cause the most damage to the earth and each other; we don't need an AI apocalypse to prove it.

Source: youtube | Video: AI Moral Status | Posted: 2021-07-15T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxLDLPDKTVjw-dDpy94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugwf1OgdKQoXrMEGYhh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy6QHbkyRX6x-b7HEJ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz_3shDcuFPUUfduJN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx3TWc8Mer68gzaFlZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzIzKnfvJjT9FVEk2V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugyod2KEL1j-TXtez_94AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwbFqXvJDa_ld27Zst4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwemg8pjUGy8yXfc4Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyZ84LhZzv6H8AIGqN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"mixed"}
]
```
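A batch response like the one above has to be parsed and validated before the codes reach the per-comment view. The following is a minimal sketch of that step; the allowed values per dimension are inferred from the samples shown here (the real codebook may include more categories), and `parse_batch` is a hypothetical helper, not part of any tool shown on this page.

```python
import json

# Allowed values per coding dimension, inferred from the samples above
# (assumption -- the actual codebook may define additional categories).
SCHEMA = {
    "responsibility": {"developer", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and validate every coded comment.

    Returns a mapping from comment ID to its coded dimensions, raising
    ValueError if any dimension is missing or outside the codebook.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-row response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"liability",'
       '"emotion":"outrage"}]')
coded = parse_batch(raw)
print(coded["ytc_x"]["emotion"])  # -> outrage
```

Validating against a closed vocabulary like this catches the common failure mode where the model invents an off-codebook label, so bad rows fail loudly instead of silently skewing the counts.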