Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
ai artists should always state they are using ai and copyrights shouldn't be pro…
ytc_Ugz2yVqyS…
Computers cannot be aware like us humans. No physics can explain consciousness. …
ytc_UgyFJY3ig…
Kinda dim-witted to challenge a robot...how can you hurt or knock it out.Youre g…
ytc_UgwdfqT7D…
LMFAO. I knew ChatGPT was a failure. I'm retired programmer and data analyst. Ch…
ytc_UgxA3IxbP…
Still waiting for AI to tell a basic history story without f i ng it up in the f…
ytc_Ugwa7kCZR…
This video really bugs me because if something happened in front of the self-dri…
ytc_UgiesN3Zk…
Yeah no shit dude. I work at a university and some of the students literally don…
ytc_UgwYyHsBk…
Paul, the Apple paper about the illusion of thinking has been debunked by Dan He…
ytc_UgyDNPUDG…
Comment
The laws
The Three Laws, presented to be from the fictional "Handbook of Robotics, 56th Edition, 2058 A.D.", are:[1]
First Law
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law
A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
Third Law
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Put that into every AI as the first and most important laws.
The Three Laws of Robotics (Asimov's Laws) are a set of rules devised by science fiction author Isaac Asimov, which were to be followed by robots in several of his stories. The rules were introduced in his 1942 short story "Runaround" and collected in "I, Robot".
AI is not a puppet nor a toy. This technology is dangerous.
youtube
AI Governance
2023-07-08T14:2…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
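Each coded record can be checked against the dimension values that appear on this page. The allowed sets below are inferred from the displayed samples (e.g. `developer`, `ai_itself`, `deontological`, `regulate`); the project's actual codebook may define additional values.

```python
# Sketch: validate one coded record against the dimension values visible on
# this page. Allowed sets are inferred from the samples shown; the real
# codebook may be larger.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "mixed", "unclear"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimension names whose value is missing or not allowed."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

# The coding result from the table above passes cleanly.
record = {
    "responsibility": "developer",
    "reasoning": "deontological",
    "policy": "regulate",
    "emotion": "unclear",
}
print(invalid_fields(record))  # → []
```

A non-empty return value flags which dimensions need manual review before the code is accepted.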
Raw LLM Response
```json
[
{"id":"ytc_Ugw0QSZYt-JgbZuab8B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugymkdh7E_gns9FGO914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw6m68ixkbMvKVZeDJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw19jjxcJfTrRF37bh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxosLXJBbXADa6EkmR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwJD-z4GLRMMEcZslh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxTc506MdwK2KbA5hV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"mixed"},
{"id":"ytc_UgxD1912uQ0PodsiVs94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzL_Eol4j3onkRYMdF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz0bYYTiBUC_6Svk6Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"unclear"}
]
```
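The raw response is a JSON array of per-comment code objects. The look-up-by-ID feature mentioned at the top of the page can be sketched by indexing that array on the `id` field; the two records below are copied from the last two entries of the array above (the standard-library `json` module is assumed).

```python
import json

# Raw LLM response: a JSON array of per-comment code objects, using the
# last two records from the response shown above.
raw = """[
 {"id": "ytc_UgzL_Eol4j3onkRYMdF4AaABAg", "responsibility": "user",
  "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
 {"id": "ytc_Ugz0bYYTiBUC_6Svk6Z4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "regulate", "emotion": "unclear"}
]"""

# Index the codes by comment ID so a single lookup retrieves the full record.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw)}

rec = codes_by_id["ytc_Ugz0bYYTiBUC_6Svk6Z4AaABAg"]
print(rec["policy"])  # → regulate
```

Note that the truncated IDs shown in the sample list (e.g. `ytc_Ugz2yVqyS…`) cannot be used as keys; the full IDs from the response array are required.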