Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or click one of the random samples below.
Random samples — click to inspect

- "@rickymcgruder4868 its not even close to perfect like a human’s individually. we…" (ytr_UgzTwfP3H…)
- "man what the fuck??? Ai is already a huge problem, its already ruined many creat…" (ytc_UgynOZgAX…)
- "This really comes across as a dumb human vs a smart robot he has no ability to …" (ytc_Ugx1XM-nj…)
- "I honestly think using ai when you are trying to learn how to do something is ok…" (ytc_UgzE3DIfu…)
- "We are not evolutionary wired. The cloud is just a giant data collection grid a…" (ytc_Ugy1Nqm4Y…)
- "@spadesofpaintstudios1719 we should be friends and treat AI equally. Teach and g…" (ytr_UgxGcQZN5…)
- "> Big tech tend to ignore regulations and just opt to pay fines as a cost of …" (rdc_ofew48p)
- "I dont they they understand digital art is just another medium for drawing. (Lik…" (ytc_UgzapJbGa…)
Comment

> This videos creator is forgetting that we’re placing a ton of focus on a little something called ALIGNMENT. These aren’t random AIs, being born out of the ether, with unpredictable wants and needs. WE are telling the AI what it wants and needs. Plus we are curating the information that it trains on.
>
> All in all, I think we do a lot of anthropogenic projecting onto the silicon programs. We both share intelligence…..and nothing beyond that.

youtube · AI Moral Status · 2023-08-23T21:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxoTkFV7mKuLIehScZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx3YkHe2YNAHWJLFMp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx6gv8SMlCOEaUcBXl4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxXE2hwNvHlhbgsWF94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxXdWGTRCs7bhE3jJJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzmQ5Z-3PIKBkF2r-V4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxCJ8Cqi_a083AxwNN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz07IhAiyzCAfcJDft4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_Ugw91IMPnqeijeddxLB4AaABAg", "responsibility": "unclear", "reasoning": "virtue", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgwmMmCNvDeYE3s-jll4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]
```
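The raw response is a JSON array with one coding record per comment, so looking up a comment's coding by ID is a parse-and-index operation. A minimal sketch in Python (the variable names are illustrative, and only two records from the response above are inlined for brevity):

```python
import json

# Raw LLM batch response: one coding record per comment,
# copied (truncated to two entries) from the response shown above.
raw_response = """[
  {"id": "ytc_Ugz07IhAiyzCAfcJDft4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgxoTkFV7mKuLIehScZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Index the records by comment ID for constant-time lookup.
records = {r["id"]: r for r in json.loads(raw_response)}

coding = records["ytc_Ugz07IhAiyzCAfcJDft4AaABAg"]
print(coding["responsibility"], coding["reasoning"])  # developer deontological
```

The looked-up record matches the Coding Result table above: responsibility `developer`, reasoning `deontological`, policy `industry_self`, emotion `mixed`.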