Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a record by its comment ID, or browse the random samples below.
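Under the hood, a lookup like this can be as simple as scanning a store of coded records for a matching ID. Below is a minimal sketch in Python, assuming the records live in a JSONL file with one object per line; the filename and field layout are illustrative assumptions, not this tool's actual backend.

```python
import json

def lookup_comment(comment_id: str, path: str = "coded_comments.jsonl") -> dict | None:
    """Return the coded record whose "id" field matches comment_id, or None.

    NOTE: the path and schema are hypothetical, chosen only to
    illustrate the lookup; the real store may differ.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: look up one of the IDs visible in the raw response further down.
result = lookup_comment("ytc_UgihQiqZZ6JUtngCoAEC")
```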
Random samples — click to inspect

- We need more people like Stuart Russel! This is is the best discussion of the ri… (ytc_UgznoDk0N…)
- .. WHY does Google fire anyone who brings up concerns around (AI) ethics? Becaus… (ytc_UgypDFY_E…)
- The danger is NOT how advanced A.I. will become, the danger is how much trust pe… (ytc_UgzAteiTt…)
- He made the ai do it / He said 9+10 / She said 19 / He said no it's 21 / Then he repeate… (ytr_UgwsXaMmE…)
- Many people say clip 1 is ai.But clip2 is ai.Because clip 2 is not natural… (ytc_UgyIKJBcc…)
- I was catfishing gigachad asking if he had “sum girls to hook up with, bro?” 😎😎 … (ytc_UgzcvJOxH…)
- The people feeding your art to a Lora bc you said you don't like ai remind me of… (ytc_UgyMyJC59…)
- Ai: can i tell you something / Me: sure / Ai: its personal....can i still tell you / M… (ytc_UgzqFj2YT…)
Comment
Robot life differs a lot in human life. They can be repaired and upgraded easily and thus can pose serious threat. I seriously doubt will allow for development of advanced AI with strong synthetic muscular or piston driven robots or system. Bigger chances are that in the near future people will be genetically and externally modified with various special suits.
There's also another thing that we never put into consideration, robots just like us need source of power. They most likely will need much more energy than we would do and that just might create a war for resources. Bottom line is, don't create something you can't control and if you must create it make sure you also make a weapon that would easily destroy it.
Source: youtube
Video: AI Moral Status
Posted: 2017-03-08T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
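Each coded comment carries four categorical dimensions plus a timestamp. A sketch of a record type for one coding result follows, with the allowed value sets inferred only from the labels visible on this page; the project's actual codebook may define more categories.

```python
from dataclasses import dataclass

# Value sets inferred from labels seen on this page; these are
# assumptions, not the authoritative codebook.
RESPONSIBILITY = {"none", "developer", "ai_itself"}
REASONING = {"consequentialist", "deontological", "virtue", "mixed"}
POLICY = {"none", "regulate", "ban", "liability", "industry_self"}
EMOTION = {"fear", "outrage", "approval", "indifference"}

@dataclass
class CodingResult:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str  # ISO 8601 timestamp, e.g. "2026-04-27T06:26:44.938723"

    def is_valid(self) -> bool:
        """Check every dimension against its allowed value set."""
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)
```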
Raw LLM Response
[
{"id":"ytc_UgihQiqZZ6JUtngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UggCdskvXvNx-HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgjSLngEyU8yhngCoAEC","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugj4vS6AR6pp2HgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugh_lFikQJi-dHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UghiE6mj80ENY3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjLYJhHPMsUEHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg9uEuu-2tWY3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UghDOVqB_cYCqXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UggEj0A2BFEXxXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
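The raw response above is one JSON array covering a batch of comments, so a consumer has to parse the array and index it by comment ID before results can be joined back to individual comments. A minimal sketch, assuming the model returned well-formed JSON (a production pipeline should not take that for granted and should catch json.JSONDecodeError):

```python
import json

def index_batch(raw_response: str) -> dict[str, dict]:
    """Parse a raw batch response and index the records by comment ID.

    Assumes a well-formed JSON array of objects with an "id" key;
    real pipelines should also handle malformed output and missing IDs.
    """
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}
```

For reference, the record for ytc_UggEj0A2BFEXxXgCoAEC in this batch carries exactly the values shown in the Coding Result table above (responsibility none, consequentialist reasoning, regulate, fear), which is how the raw response ties back to the displayed comment.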