Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Isaac Asimov, 1942
that fucker had it figured out back then and were struggling to get it right
Source: youtube, 2015-07-30T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugi0NsTNEzCVi3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UghKW3U90dEZzngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UggPxLAiNL91S3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UggukRYKoZHNNngCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugj16MbwYPEvoXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggenJTFfEzdeHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgigfobkZ5wmnHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugi_4nSGJeFjvngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugg1ZPolIExGAXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgizJ7I8moWOHXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
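A raw response like the one above is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal Python sketch of parsing it and indexing by comment ID for lookup (the variable names `raw_response` and `codings` are illustrative, not part of any tool here; the two entries are copied from the sample above):

```python
import json

# Two entries copied from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_Ugi_4nSGJeFjvngCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UggenJTFfEzdeHgCoAEC", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# Parse once, then build a dict keyed by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugi_4nSGJeFjvngCoAEC"]
print(coding["responsibility"])  # -> developer
print(coding["emotion"])         # -> approval
```

Indexing by `id` is what makes "look up by comment ID" cheap: each coded comment maps directly to the model's dimension values without rescanning the array.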