Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytr_Ugx8OMGcd…`: "Hey there! It's fascinating to see Sofia with a physical form now. The advanceme…"
- `ytc_Ugxdqw-ZV…`: "Sorry but AI told that people like you will be fooooked by 2035-2040. All this A…"
- `ytc_UgwcfQchZ…`: "Google, right now, "are tesla self driving cars safer than humans" and top resul…"
- `ytc_Ugx2kRpuh…`: "Don't let these people continue, create universal oversight. The clown can't eve…"
- `ytc_Ugy9KDXTt…`: "I don’t know a lot nor am I into AI art, but seeing my little cousin making his …"
- `ytc_Ugw4n0VVi…`: "A prime example of why there shouldn’t be self driving cars. Another hypothetica…"
- `ytr_Ugw9BwhkQ…`: "+David Webb According to records, by the end of last year self-driving cars trav…"
- `ytc_Ugxvs_k-F…`: "Why are you so MEAN to them. They are just ROBOT'S!!! HAHA or are they????…"
Comment
> I would def say people need to disconnect from technology and connect with nature.. organics is where its at not neuro-link not meta.
> This guy presenting seems rather disconnected from the standard mental & physical requirements for the general wellbeing of humans.. if he is programming these robot based on his own perception and ability to critically comprehend the world and what Humans require in the robots?
> In my opinion he has lost the MO which is to benefit Humanity - This will not benefit Humanity.
> "AI algorithms" affectively influence people into behaving in a way that is unnatural but forced by corporate agenda's at the end of the day who ever is at the top will be creating the programs for algorithms that will be used to influences the masses no body will have free thinking.
> This is dangerous
Source: youtube · Video: AI Moral Status · Posted: 2022-02-14T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwZYUqTOBrKj6JRR714AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy24ID4M7ZHZKZu-6h4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzOYmv_PhVhzfwiH_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxvWcPPz473hBDRAvt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxYXEsMyjL-mMWWQLR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxwae1ZEwFc_pKLphx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyk0Sazr1hTx4FxzCp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwnBtDWtNcn8TgbBPV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwjsZmX_koQTJ_yMq54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxnpucD4hPzvDDWF254AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]
```
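The "look up by comment ID" view above can be reproduced from a raw batch response like this with a few lines of Python. A minimal sketch: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown above, while the variable names and the truncated sample string are illustrative.

```python
import json

# A fragment of a raw batch response: a JSON array with one object per
# coded comment, keyed by the comment's ID (structure as shown above).
raw_response = """[
  {"id": "ytc_Ugy24ID4M7ZHZKZu-6h4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxnpucD4hPzvDDWF254AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "ban", "emotion": "outrage"}
]"""

# Build an ID -> coding index so any coded comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up by comment ID and read off the coded dimensions.
row = codings["ytc_Ugy24ID4M7ZHZKZu-6h4AaABAg"]
print(row["reasoning"], row["emotion"])  # virtue fear
```

The same index is all the inspection view needs: the "Coding Result" table for a comment is just the dictionary for its ID rendered as rows.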