Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `rdc_o6wn83f`: Yup. OpenAI wants their products to be used for this, but they aren't quite rea…
- `ytc_UgyreUEyy…`: If AI were to become self-aware with unlimited access to the internet, how quick…
- `ytc_UgyWGnIee…`: my first impression while is saw that thumbnail was: wow he is THAT dude.!😮 in g…
- `ytc_UgwGbiwCL…`: Created it is going to take humans to reject robots. I would rather have a human…
- `rdc_g9tmupw`: My understanding is it has been mutating. Not every year but this is part of a b…
- `ytc_UgzKGQogT…`: "Prompt hacking" - Hey, you asked questions we haven't thought to stop the AI fr…
- `ytr_Ugz3vnCVn…`: @raven3moonwomen are too dense to understand this. They think it's just a handf…
- `rdc_n5gric4`: For things like generating a motion or a piece of software. AI works well when i…
Comment
Oh. I'm not afraid of AIs becoming conscious. I'm afraid of the humans fucking up the AI. Im afraid of AI generated plans based on data from social media, because while we may be foolish on social media our lives outside of the internet are different. Im afraid of people becoming nothing more than another point of data with no human to step in to consider alternated variables of a situation. The humanitarian tax. The things we do, not because they are more profitable, or because there is some great outcome to achieve, just becouse we think it's right. AI isnt going to feel bad for a single mother of two and let her slide just this one time.
youtube · AI Moral Status · 2023-07-05T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzI74UtgkSGovn4Zt94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxoXRAssENL1SBrfKJ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyX5mq2JRqRdk8aCmp4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwRViyy9MZYU9RN8a94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwcq2faTTGVKV2BUNt4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz22MCYCYjQ9-0XVnZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwZ6ENtNMGFirzfyvt4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxHMtnEcd7kLc1BvYJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwotLq43wgKwpHEYQF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwZsVkKgsygEqrtLw14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
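The comment-ID lookup described above can be sketched in a few lines: parse the raw response as a JSON array of coding records (each carrying the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields shown in the table) and index it by `id`. This is an illustrative sketch, not the tool's actual implementation; the variable names are hypothetical.

```python
import json

# One record from the raw LLM response above; the real response is a
# JSON array with one such object per coded comment.
raw = (
    '[{"id":"ytc_UgzI74UtgkSGovn4Zt94AaABAg",'
    '"responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"indifference"}]'
)

records = json.loads(raw)

# Index by comment ID so any coded comment can be looked up directly.
by_id = {record["id"]: record for record in records}

coding = by_id["ytc_UgzI74UtgkSGovn4Zt94AaABAg"]
print(coding["emotion"])  # indifference
```

A missing ID raises `KeyError`, so a viewer would typically use `by_id.get(comment_id)` and show "not coded" on `None`.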