Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I would argue that using AI to create art most closely resembles having a piece …" (ytc_Ugzlkt45_…)
- "Using the best LLMs currently available to create an application of even an aver…" (ytr_UgySr2Xqj…)
- "„Part I: After the first time watching it, involution/evolution came to mind. In…" (ytc_UgwnLFQDQ…)
- "I like Elon Musk and Sophia, But there will NEVER be an ASI.. a computer will ne…" (ytc_Ugyi9maVy…)
- "yea naw. AI had made people's life pretty miserable. less jobs, steals, copies, …" (ytc_UgwAVBLO3…)
- "I work in a VFX studio and people are using AI for storyboards. Which means dire…" (ytc_UgxShiq0P…)
- "How did anyone not know they were AI? They are clearly fake, they don’t look con…" (ytc_UgyhJFvyS…)
- "Ai is good and bad for humanity AI art can be good but it’s mostly bad and steal…" (ytc_UgzYqPhQ7…)
Comment

> The cracks with AI are human issues, humans created AI and humans have faults therefore AI has faults. What do you think about the super computer Google and IBM are looking to build ? All the movies mankind has made do you not think AI will not take note of ? AI, interesting. 🤔

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Moral Status |
| Posted | 2023-06-04T03:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxtE6_0rkMb-ry-0JZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxFuVBZ5nUVg5bxCoF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz_vRP_FIklZEE2Rqh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyg6N4ZRWeOLkCuJRJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugyebxkd2BT4NJKxNAB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugw6jhFLNzCa4e2VNOt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzLjUd7xlWzOaD_s8Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyql6Y84m3dvdEwh094AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgziD5G3yLYj_DZWo0d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyVa_VERO5NUN2drwl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
```
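A raw response like the one above is a JSON array of per-comment records, one per coded dimension. Before trusting a batch, it helps to check that every record parses and that each dimension's value comes from the codebook. The sketch below is a minimal validator; the allowed-value sets are assumptions reconstructed from the values visible in this dump, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from the
# sample output shown here, not from the authoritative codebook.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed", "unclear"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and return {comment_id: [problems]}.

    An empty dict means every record carried a valid value for every dimension.
    """
    records = json.loads(raw)
    problems = {}
    for rec in records:
        cid = rec.get("id", "<missing id>")
        errs = [
            f"{dim}={rec.get(dim)!r} not in codebook"
            for dim, allowed in SCHEMA.items()
            if rec.get(dim) not in allowed
        ]
        if errs:
            problems[cid] = errs
    return problems
```

Running this over a batch pinpoints exactly which comment IDs need re-coding, rather than rejecting the whole response when the model drifts off-schema.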