Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
this is why AI is going to have to kill us though if it ever becomes self aware…
ytc_UgzgnPrNG…
AI’s overrated? bruh my toaster’s still confused 😅🔥 I talk tech in a fun way on …
ytc_UgxKEhIAd…
you dont have to live on a reservation to not want the data centers i am a white…
ytc_Ugx6nVGn5…
No, thank you. I don't want to ride over bridges in this thing. Can Waymo keep…
ytc_UgyCSDj8X…
Big Tech trying to figure out what consumers like:
"How do you do, fellow _compu…
ytc_UgwKeKk4h…
Evil always disguises itself as good...... AI is going to take over the world 🌎 stay alert…
ytc_UgzOKTKat…
Thank you for your comment! Sophia, the AI robot, is programmed to continuously …
ytr_UgymT5AtE…
The next big invention will be a weapon that can take out the robots. Sounds far…
ytc_UgyoOtRcE…
Comment
| Field | Value |
|---|---|
| Text | @A_Sad_Duck yea but i don't mean the dataset, i mean when actually asking the ai for help for something, couldn't you not include any racial information so there can't be any bias? |
| Platform | youtube |
| Topic | AI Bias |
| Posted at | 2023-10-19T23:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugw4t2eO7w9HB27qVQ54AaABAg.9w4fjuZERWF9w51TekQbMC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgwPyRKu5znMev8W4n54AaABAg.9w43VrXO8gm9w5Q5oxlSye","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwPyRKu5znMev8W4n54AaABAg.9w43VrXO8gm9w5RhuBpqD0","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxjua6pvvgMFg5gSbR4AaABAg.9w3oAS94Ujn9w3pPgNY3DA","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugy3h9NsCvRB8GJxAvl4AaABAg.9w-qDBnEuY09w-zrnXxfTo","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyA1TiTHPS5lKWmzhd4AaABAg.9vzuiYZlpD-9wL3dADG-_o","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgyREmHbYOUSj2B9mkF4AaABAg.9vzRernHKSn9vzUWwfL8ZP","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgzmKaiTRxpfWG7FCBl4AaABAg.9vz-kZDC_-l9vzV31pLfLY","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxkXPZay6AmzXGnLid4AaABAg.9vuxumqco0E9w5PpxIZ7BQ","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxkXPZay6AmzXGnLid4AaABAg.9vuxumqco0E9w8GUmrNnYL","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
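The raw response is a JSON array of per-comment codings, so looking up a single comment's coding by its ID amounts to parsing the array and indexing on the `id` field. A minimal Python sketch of that step, assuming the model output is well-formed JSON like the array above (the function name `index_codings` and the key check are illustrative, not part of the actual tool):

```python
import json

# One entry copied from the raw LLM response above, wrapped as a JSON array.
raw_response = """
[
  {"id": "ytr_Ugxjua6pvvgMFg5gSbR4AaABAg.9w3oAS94Ujn9w3pPgNY3DA",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]
"""

# Every coding row is expected to carry the four dimensions plus its ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the model output and index the coding rows by comment ID."""
    rows = json.loads(raw)
    by_id = {}
    for row in rows:
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            raise ValueError(f"coding {row.get('id')!r} missing keys: {missing}")
        by_id[row["id"]] = row
    return by_id

codings = index_codings(raw_response)
coding = codings["ytr_Ugxjua6pvvgMFg5gSbR4AaABAg.9w3oAS94Ujn9w3pPgNY3DA"]
print(coding["emotion"])  # → fear
```

Failing loudly on a missing key is deliberate: a model occasionally drops a dimension, and it is better to surface that at parse time than to render an incomplete Coding Result table.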