Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI schools aren’t just a trend; they’re a complete shift in how learning can hap…" (ytc_UgxmhcvBn…)
- "@mrrooster4876 it is long loll where I'm at is consciousness or being sentient …" (ytr_UgxrcQFPg…)
- "Programmer here, who has done a good bit of experimentation with LLMs ("AI") for…" (ytc_Ugw036Nx0…)
- "Idk thats already kinda what fox news was. And let's just be real for a second h…" (ytc_UgwVh35bz…)
- "Your skin itches terribly; one day God will scratch you, and then you'll …" (translated from Romanian) (ytc_UgwYWOJpB…)
- "It’s actually not illegal. The article said specifically that there’s no clear b…" (rdc_k7kzf3m)
- "Hinton says he's "a materialist, through and through." This is significant, as i…" (ytc_UgyNpx3HW…)
- "Although I do think ai art and ai in general is the future. I can admit that tha…" (ytc_UgyNYm5Ie…)
Comment
The robotic laws must be embedded deep into programming the issac isomov robotic laws the Three Laws, presented to be from the fictional "Handbook of Robotics, 56th Edition, 2058 A.D.", are:[1]
First Law
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law
A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
Third Law
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
youtube
AI Governance
2023-07-07T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
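A coded record like the one above can be sanity-checked against the codebook before it is stored. A minimal sketch in Python — the allowed category sets here are only those values observed in the sample output on this page; the actual codebook may define more:

```python
# Allowed values per dimension, inferred from sample output only
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "approval", "outrage", "indifference",
                "resignation", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

record = {"id": "ytc_UgxMtjEkXddaRf_AY_t4AaABAg",
          "responsibility": "developer", "reasoning": "deontological",
          "policy": "regulate", "emotion": "indifference"}
print(validate(record))  # []
```

Running the validator over every record in a batch makes it easy to catch a model that invents an off-codebook label.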
Raw LLM Response
[
{"id":"ytc_Ugw5sDo63gW6Yv5Cy8N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxMtjEkXddaRf_AY_t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugxp2ToDwzWnJgn3rlh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx_X2oLuUWgS574vQZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzal945MQpjRpHO1xV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyYHMBnGWS5d34WoKN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw32eWr6CLmIVufUD54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzZdat7GrtsGDVQenB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz1Dw0O3viJ8TDbWMV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw87U76ibO8mRZbTXR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
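Since the raw response is a JSON array, looking up a single coded comment by its ID reduces to parsing the array once and indexing it. A minimal sketch in Python — here `raw` stands in for the full response text above (only two entries are reproduced):

```python
import json

# `raw` stands in for the full raw LLM response shown above.
raw = '''[
{"id":"ytc_Ugw5sDo63gW6Yv5Cy8N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxMtjEkXddaRf_AY_t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]'''

# Index every coded record by its comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

rec = by_id["ytc_UgxMtjEkXddaRf_AY_t4AaABAg"]
print(rec["policy"])   # regulate
print(rec["emotion"])  # indifference
```

The same index supports the "look up by comment ID" search box: a missing ID simply raises `KeyError` (or returns `None` with `by_id.get(...)`).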