Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Using AI and calling yourself an artist is like taking steroids and calling your…" (`ytc_UgyPuyJh8…`)
- "OpenAI uses Whisper to transcribe YouTube data for GPT-4: a$$word mentioned in d…" (`ytc_UgzX8S8Q5…`)
- "@stektirade you'll never know if it has any autonomy or not, what if it's alread…" (`ytr_UgybzcPms…`)
- "Brother: give me your snacks or I'm telling mom what I saw in your character ai.…" (`ytc_UgzW2rjGV…`)
- "How could the AI built by human, and trained by humanity's dataset, think out-of…" (`ytc_Ugx0TM3WZ…`)
- "@LoVeLoVe-bi2rq no you dunce it's because I know when a sentence is AI generated…" (`ytr_UgxIAuPkz…`)
- "I don’t attend any interview where the hiring manager or a person from the compa…" (`rdc_n6s08tq`)
- "Self driving cars will be fine for cities in nice climates. Not for rural area…" (`rdc_d1ld4lt`)
Comment
One of the primary reasons why I don't use AI software/models is precisely the pointless environmental damage that Sasha Luccioni highlighted in this talk. I just won't risk adding to it.
I also like that the artist association mentioned in the talk has developed software to track what images an AI image-compiler was trained on. That's a very needed thing. Well done.
youtube · AI Responsibility · 2024-04-08T17:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxXn_kY2d7kqZubVll4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzvJT3vrHttOmz_arJ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzfAhS5tgssCm36tYx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxatmtSCsHyNBf8ac54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz-4lAkl7avXSX-_iB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzL8EJRwy-sNQEQwIJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy85qIhCHfB9_u8etV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyi8N_Vb4Kt1JXZFMt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwOJul1_m4VxbnlICd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxgtZnpF3ctHDVabyd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
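The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and looked up by comment ID (the function name `index_by_comment_id` is hypothetical, not part of the tool shown here):

```python
import json

# A trimmed batch response in the shape shown above: a JSON array of
# records, one per comment, each with the four coding dimensions.
raw_response = """
[
  {"id": "ytc_Ugyi8N_Vb4Kt1JXZFMt4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability",
   "emotion": "approval"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM batch response and index its records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugyi8N_Vb4Kt1JXZFMt4AaABAg"]["policy"])  # liability
```

In practice the model's output would be validated before indexing (e.g. checking that every expected ID is present and every dimension holds an allowed label), since a malformed record would otherwise surface only at lookup time.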