Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I have been using AI for years now and it will never replace good developers. Th…" (ytc_UgzPq3nqD…)
- "I mean he's not wrong 😅 I now pretty much use AI for everything. I have bad ADHD…" (ytc_Ugw7O8vxm…)
- "Thank you for this wonderful opportunity. 1. What do you think is the most amaz…" (rdc_de2s1di)
- "The problem is that being good at art isnt good enough, you need to exceptional …" (ytc_Ugx0KTIRq…)
- "I am an artist myself but I do understand people's desire to ate something what …" (ytc_UgzV2HOOy…)
- "Ai can do anything when it gains sentiment. Put Ai in a robot, and humanity is d…" (ytc_UgywNdv7R…)
- "@mignotmaxime2409 like the Romans, who survived thanks to slavery? In…" (ytr_UgzV9bSpL…)
- "AI can give solutions instantly, but where is the practicality in manual labour …" (ytc_Ugz1pc-60…)
Comment
This fear of A.I is strange considering that it is humans that have programmed A.I and it is us humans who have provided ALL OUR LIVES ONLINE to big tech, and that's how the A.I is LEARNING FROM US. Google, Gmail, YouTube, Hotmail, Microsoft, Twitter, Facebook, Instagram...Nothing scary about A.I, be more worried about the minds/ "directives" it receives from humans; what HUMAN'S want A.I to do to HUMANS. Boston Dynamics is a very good example of that.
youtube · AI Moral Status · 2023-03-05T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
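Each coded comment gets one record with these four dimensions. A small validator can flag records whose values fall outside the expected categories; a minimal sketch, with the allowed sets inferred from the codings visible on this page rather than from an official codebook:

```python
# Allowed values per dimension, inferred from the records shown on this
# page; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"approval", "indifference", "mixed", "outrage", "fear"},
}

def validate(record: dict) -> list:
    """Return (dimension, bad_value) pairs for any out-of-range fields."""
    problems = []
    for dim, allowed in ALLOWED.items():
        if record.get(dim) not in allowed:
            problems.append((dim, record.get(dim)))
    return problems
```

An empty return value means the record is clean; anything else points at the dimension that needs manual review.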
Raw LLM Response

```json
[
  {"id":"ytc_Ugx5XeRkqY3IrOOlPc94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyvAVCYOY8h1X35jCN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxV2wJVZeStjTUsPdx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyxeovyW_tmnOAKSet4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyrOXFkZfuftiSRyLp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxB3xp7szWTAC2BYtF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxDo1XwF7dHgUH43Zx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwdAqSf6LW5OZPRbhx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxl3GVSCqYlTswaI9R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwOQlpmX3Fkli4YFj54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
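The lookup-by-comment-ID view above can be reproduced outside the UI by parsing the raw model output and indexing it by `id`. A minimal sketch, assuming the field names shown in the JSON above (the two inline records are copied from it for illustration):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# These two records are taken verbatim from the batch shown above.
raw_response = """
[
  {"id": "ytc_UgyvAVCYOY8h1X35jCN4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwOQlpmX3Fkli4YFj54AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "fear"}
]
"""

def index_codings(response_text: str) -> dict:
    """Parse the model output and index each coding record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
coding = codings["ytc_UgyvAVCYOY8h1X35jCN4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer indifference
```

With the codings indexed this way, the per-comment table above is just a formatted view of one dictionary entry.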