### Raw LLM Responses

Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
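A minimal sketch of the ID lookup described above. The storage layout and function name are assumptions for illustration: coded comments are held as a list of dicts carrying the `id` field shown in the UI, and since the UI truncates long IDs with an ellipsis, matching is done by prefix.

```python
# Hypothetical layout: each coded comment is a dict with the "id" the UI shows.
def lookup_by_comment_id(coded_rows, query):
    """Return coded rows whose id starts with the (possibly truncated) query."""
    prefix = query.rstrip("…")  # the UI truncates long IDs with an ellipsis
    return [row for row in coded_rows if row["id"].startswith(prefix)]

rows = [
    {"id": "ytc_Ugz80QvRl_RSjvr_Nmh4AaABAg", "policy": "ban"},
    {"id": "ytc_Ugyuwp3wTbr0GuIP2Sp4AaABAg", "policy": "regulate"},
]
matches = lookup_by_comment_id(rows, "ytc_Ugz80QvRl…")
```

Prefix matching keeps the truncated IDs displayed in the sample list usable as lookup keys.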
**Random samples**

- I mean we all do it, but you gotta be more sneaky about it than just handing in … (`ytc_UgzAM6fzL…`)
- Ok what if you fixed ai art you googled (instead of generated), and then poisone… (`ytc_UgxOF9wO9…`)
- Look up "Comparison of Waymo Rider-Only Crash Rates by Crash Type to Human Bench… (`ytr_UgzA6t4BC…`)
- Not to mention I know of some traditional artists who absolutely CANNOT draw dig… (`ytr_Ugz7ihBVa…`)
- Stories like these are truly annoying; I've personally seen people try to push A… (`ytc_UgzIlTFVV…`)
- FYI AI Clone Your Voice - Probably talking about Eleven Labs. It's the best one … (`ytc_UgzGaztla…`)
- ChatGPT is different to LLM like LLAMA? I am a normal User (no Coder or using it… (`ytc_UgxKVU9VI…`)
- Hi! Computer Science student and hobbiest here! I love hearing about all of this… (`ytc_UgzJDQ0IQ…`)
### Comment

> Absolutely horrifying.
> These scientists may mean well.
> But they forget they do not own these robots!!
> Almost every major advancement in technology through human history has been weoponised.
> What happens with them once they are complete is not their decision at all!!
> He says it's not unthinkable that in 3-4 years they will be as smart as a human.
> So how long untill it's smarter than any of us?
> Then automatically via the cloud they are all smarter than any of us!!
> It's like they are trying their best to make the matrix/the terminator real life.

Source: youtube · *AI Moral Status* · 2021-09-04T03:1…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
### Raw LLM Response

```json
[
  {"id":"ytc_Ugz80QvRl_RSjvr_Nmh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyuwp3wTbr0GuIP2Sp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxkLr-KvDbcxax0opl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyaNTEpa1SzYByR9Rp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzB53Z_Rte8QgvuqyF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzh-uJtnhKhkbaEI6d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzfWG9hUKE9W9XZO-94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyAP12ESKOFVKL4EG94AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx3opImypx-dkt7IJl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxpAKNF3tJUbWKjUwd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
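The raw response is a JSON array of per-comment codes over the four dimensions shown in the Coding Result table. A minimal sketch of parsing and sanity-checking such a response; note the allowed value sets below are inferred only from the samples on this page, not from an exhaustive codebook:

```python
import json

# Value sets inferred from the samples above; the real codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "unclear"},
}

def parse_codes(raw):
    """Parse a raw LLM response; keep only records whose values are recognized."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

raw = ('[{"id":"ytc_Ugyuwp3wTbr0GuIP2Sp4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
```

Validating against closed value sets catches the common failure mode of LLM coders drifting outside the codebook; rejected records can then be re-queued rather than silently stored.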