Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgxmZ4_rQ…` — "I keep saying this, but I think one thing I really hate about this trend is that…"
- `ytc_UgzsFN-t4…` — "The gun 3d asset could have been a little more detailed lol, but overall, great …"
- `ytc_UgxuKoiLS…` — "A lot of people don't realize ChatGPT doesn't actually understand what it's talk…"
- `ytc_UgyExy0M4…` — "Makes me very sad when I'm trying to find reference images to draw. Then all tha…"
- `rdc_grlcc7s` — "You'd think they'd make effective break and off day schedules, but I guess it's …"
- `ytc_UgynyAsp8…` — "In 5 years, when AI has taken over the human world, we're going to look at this …"
- `ytr_UgiWsDgzf…` — "SilenceAngelic Correction, it would be foolishely to connect a sentient AI that …"
- `ytr_Ugwtc7-iq…` — "Thanks for your comment! Sophia's design is intended to be relatable and approac…"
Comment
Every country on Earth could agree with a set of laws regarding LAWS. That solves nothing in reality. Throughout all of human history there has never been a society with laws that had no law breakers. It is better to explore LAWS so that we can have a basis for exploring the far more important anti-LAWS technology that will keep us all at least moderately safe. This is the hoary old anti-gun legislation argument dressed in different clothing. What's worse is that it takes more effort to create a gun than a killer robot with today's technology. It is better that we face reality. Then we should develop and deploy anti-killer robot technology. No mere laws will stop the development of killer robots. People Break Laws. It is in our nature.
{o.o} · youtube · 2020-01-21T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugx9cb65B9oojBKLTOB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzlaZE3OguKqrIZm8R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwpeNlfoo5SxLW2V2V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGpNdDMurRR-6OLH54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwZIQCIMpNeR3KAW0R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwpVTOpudVsGr_1srh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzELMs50ID3XIbViwR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxuuZhyWITOckHJI5F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx8FH3md_J2QNUt79Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgyMd2NqcBFjhZjqwHp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
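The coding result shown in the table above corresponds to one entry in this raw JSON array, matched by comment ID. A minimal sketch of that lookup, in Python, might look like the following. The `DIMENSIONS` tuple is inferred from the visible output, not from an authoritative codebook, and the abbreviated `RAW` array here contains only one of the entries above.

```python
import json

# Abbreviated raw LLM response: one entry copied from the array above.
RAW = """[
  {"id": "ytc_Ugx8FH3md_J2QNUt79Z4AaABAg",
   "responsibility": "distributed", "reasoning": "mixed",
   "policy": "industry_self", "emotion": "resignation"}
]"""

# Dimension names as they appear in both the table and the JSON.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the raw JSON array and key each coding by its comment ID."""
    rows = json.loads(raw)
    return {row["id"]: {d: row[d] for d in DIMENSIONS} for row in rows}

codings = index_by_id(RAW)
print(codings["ytc_Ugx8FH3md_J2QNUt79Z4AaABAg"]["emotion"])  # resignation
```

This matches the table above: the entry with responsibility `distributed`, reasoning `mixed`, policy `industry_self`, and emotion `resignation` is the coding displayed for the selected comment.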