Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The biggest problem in robotics is that they are perfect, unlike humans, they do…" (ytc_UgheLFoKv…)
- "This From ChatGPT The Point ChatGPT Helps Me To Tell The Truth For Me To Always …" (ytc_UgwIWHzrb…)
- "but AI has already replaced devs. past tense. the market determines whether Ai r…" (ytc_UgzGMgmvC…)
- "If the AI is aware then it’s watching these videos and knows what is coming .it …" (ytc_UgzenRbuY…)
- "It's in the article. >In its experiments with driverless cars, Uber has mand…" (rdc_e13zu5d)
- "The lady eating fried chicken while talking about minstrelsy, is she ai because …" (ytc_Ugz4z4anP…)
- "We need to teach the ai's to value life all life. Animals and humans and other a…" (ytc_UgzGffDoP…)
- "Ed Zitron on Ezra Klein needs to happen to balance out all the weirdos that keep…" (ytr_UgzHgKJD8…)
Comment
@noturvixxen I would assume the makers would destroy/shutdown the AI found to be dangerous, not safe AI. And for me, dangerous AI is AI that can’t be controlled when activated(other than shutting it down). As far as AI weapons go, I don’t think they’re great, but comparing them to nukes is a bit ridiculous, in my opinion. I am much more concerned about biological, chemical, and nuclear weapons, than AI. Perhaps that’s naive, I honestly know almost nothing about AI so please feel free to disregard this opinion. There are definitely bad things about AI’s likely progression, but it’s just the least of my concerns, personally. That might change though, who knows?
Source: youtube | Posted: 2024-04-30T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgzVhqEV0qdyFKeZYhZ4AaABAg.A31UiVBHTD0A346fXX_yNS","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugwryu_mEE1Es4enD194AaABAg.A3-zYDON0plA35z65DhCFF","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"fear"},
{"id":"ytr_Ugx75THv6JE_K5WIHEl4AaABAg.A2n5Q4Y-A9UA2rbAoeXQRj","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugx75THv6JE_K5WIHEl4AaABAg.A2n5Q4Y-A9UA2sDB1ixclr","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytr_Ugx75THv6JE_K5WIHEl4AaABAg.A2n5Q4Y-A9UAV6M7tHOIqW","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyP-hggfXWeokuBnE94AaABAg.A2jeh0s3vc7A35vDnjJNlm","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgyP-hggfXWeokuBnE94AaABAg.A2jeh0s3vc7A5SAGDR8mCD","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyP-hggfXWeokuBnE94AaABAg.A2jeh0s3vc7AR_8O9sHKD4","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgxM-W8lE_0eCirikT14AaABAg.A2ftebZLx6RA5y5TAaPRGe","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgxM-W8lE_0eCirikT14AaABAg.A2ftebZLx6RA5y6ZoX21P-","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
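A raw response like the one above can be turned into a per-comment lookup before rendering a coding-result table. The sketch below is a minimal example, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the records shown, while the sample record ID `ytr_example1` and the skip-on-missing-field behavior are assumptions for illustration.

```python
import json

# Hypothetical raw LLM response: a JSON array of coding records,
# one per comment, with the field names seen in the responses above.
# The ID here is a made-up placeholder, not a real comment ID.
RAW = '''[
  {"id": "ytr_example1", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability",
   "emotion": "resignation"}
]'''

REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into a {comment_id: codes} mapping,
    skipping any record that lacks a required field."""
    coded = {}
    for rec in json.loads(raw):
        if REQUIRED_FIELDS <= rec.keys():
            coded[rec["id"]] = {k: rec[k] for k in REQUIRED_FIELDS - {"id"}}
    return coded

codes = parse_codes(RAW)
print(codes["ytr_example1"]["policy"])  # liability
```

Keying the records by comment ID is what makes the "look up by comment ID" view cheap: each inspected comment maps straight to its four coded dimensions.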