Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "The idea that we are 5-10 years away from potential A.I. apocalypse is a bit hyp…" (ytc_Ugywgj9vq…)
- "The guy is wrong! Even the best AI would have fraction of knowledge of a single …" (ytc_Ugwm84TZ-…)
- "Seriously. I've never been to a developing country with less crime, less trash, …" (rdc_dsbodxe)
- "Walmart has been sharing their face recognition capabilities with law enforcemen…" (ytc_UgxW9sTUU…)
- "Will I actually be able to speak to AI when I call government departments, I mig…" (ytc_UgxrY0Bbz…)
- "I honestly think the AI wouldn't be so hated if they didn't try to use it to cut…" (ytc_UgzfqUPof…)
- "Nothing new here Bros, ai is a trained Large Language Model. The key is in the t…" (ytc_UgzTkSzgF…)
- "Unfortunately, pandora's box has already been opened. If companies, organization…" (ytc_UgyvcTAiG…)
Comment

> Destroy them all an those who create such things . We human’s are at the top of the food chain an then your going to make something that will become better then us so we will no longer be at the top hunting we will be running like the prey . Laugh is u want but you are not understanding what they will become an once they fix each other an themselves well they don’t need to drink or eat an they will kill just because. Sometimes you should just be happy with knowing u can create something but be more intelligent to know it can an will be humans downfall. Kill all ai an all who work on it sorry but your to smart to not stop I don’t no why u must but u are not as important as mankind

youtube · AI Moral Status · 2023-09-21T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
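Each dimension in the table takes one value from a small closed set. A minimal validation sketch, using value sets inferred from the samples shown on this page (an assumption, not an authoritative codebook):

```python
# Allowed values per coding dimension, inferred from the coded rows shown
# on this page; the real codebook may define more values.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"outrage", "fear", "resignation", "approval",
                "indifference", "mixed"},
}

def invalid_fields(row: dict) -> list:
    """Return (field, value) pairs whose value is outside ALLOWED."""
    return [(k, row.get(k)) for k in ALLOWED if row.get(k) not in ALLOWED[k]]

# The coded row from the table above.
row = {"responsibility": "developer", "reasoning": "consequentialist",
       "policy": "ban", "emotion": "outrage"}
print(invalid_fields(row))  # → []
```

A non-empty result flags a row where the model drifted outside the expected label set.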
Raw LLM Response

```json
[
{"id":"ytc_UgzMP4sAv3GgR84qx954AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz4Px6SMximibA42et4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxzkLjZAh7r7JCAuH94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxRiqH0Q25rCmiK3mx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxD_N90g3XFJdDNpz54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzZDAk2k8YUHHqHTFh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz6yPX_9_-9xQGvGuJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUm2tpWSWQ415EwUt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzC-BmICYBwGZp1rIl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyyM9X15hO1eNtDX5B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
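A raw response like the array above can be parsed into per-comment records keyed by ID; a minimal sketch, assuming the model returned a valid JSON array (the ID and `policy` value below are taken from the response shown):

```python
import json

# Sketch: parse a raw LLM coding response (a JSON array of per-comment
# codes) into a dict keyed by comment ID. A one-row excerpt of the
# response above stands in for the full array.
raw = '''[
  {"id": "ytc_UgzUm2tpWSWQ415EwUt4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "outrage"}
]'''

codes = {row["id"]: row for row in json.loads(raw)}
print(codes["ytc_UgzUm2tpWSWQ415EwUt4AaABAg"]["policy"])  # → ban
```

Keying on the comment ID is what lets the viewer map each JSON row back to the comment it codes.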