Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Why tf would someone make a robot designed to pick up boxes be so strong?…" (ytc_Ugz4fcGhh…)
- "Your opening segment shows an actual human dancing in a robot suit. It was a pu…" (ytc_UgwGbHLjy…)
- "It doesn't think the way we do. The computer is given a task to handle with spec…" (ytr_UgzmIGscl…)
- "I was little bit surprised to see a robot is coming out from the truck…" (ytc_UgyMWVWWr…)
- "I showed my code to AI on google and it told me which line was wrong AND gave a …" (ytc_UgxpCZdR0…)
- "I noticed that one too and honestly I'm not sure how I feel about embracing Kare…" (rdc_jgi4y9l)
- "Office, sales and manufacturing will take employment hits courtesy of AI, but it…" (ytc_Ugw8q_RP8…)
- "If you taught an ai to share but then gave it aids. Would it feel good or right,…" (ytc_UgwTRkFLS…)
Comment
Well, we’re seeking trouble putting emotions into AI. Like we need angry, raging, revenging super intelligent steel-made robots!!! Instead of keeping away from catastrophic mistakes, instead of solving the big problems of the world, they jeopardize everything playing with the worst danger ever!!! It’s not inevitable: you want to avoid it, you avoid it. All the world can avoid unacceptable catastrophes.
youtube · AI Moral Status · 2023-05-29T21:0… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyLq_oC5B9k9f5ULbt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx48htnMOK3jC_tZMJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxfcnhKrs8j2qNJof94AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgypLdyL3AK03PgJstR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwt5qT-xKv7DmLH_g54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwQR9y42sfOvN7muip4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyUTyct8r0VOXebv854AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyPZDw5mlR6n0a1FMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxD5yfOS4mxhoKEu9t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx2hSdJH1wPAyONQ694AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
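The raw response above is a JSON array, one object per coded comment, with an `id` plus the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and validated is below; the allowed value sets are assumptions inferred from the sample output shown here, and the real codebook may contain additional categories.

```python
import json

# Assumed codebook values, inferred only from the sample response above.
ALLOWED = {
    "responsibility": {"distributed", "none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none"},
    "emotion": {"fear", "indifference", "resignation", "outrage",
                "approval", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows.

    A row is kept when it is a dict with an "id" and every coding
    dimension holds a value from the assumed codebook.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical two-row batch: the second row has an unknown value
# and is dropped by the validator.
raw = ('[{"id":"ytc_a","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"fear"},'
       '{"id":"ytc_b","responsibility":"bogus","reasoning":"unclear",'
       '"policy":"none","emotion":"fear"}]')
print(len(parse_codings(raw)))  # 1
```

Validating against a fixed value set before storing rows guards against the model drifting from the codebook labels, which is the main failure mode when coding at scale.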