Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I'd love to have a robot that would think that it's life is be meaningful.
But at the same time, all robots should follow a set of rules that would keep us humans safe.
Isaac Asimov's "Three Laws of Robotics"
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Source: youtube · Video: AI Moral Status · Posted: 2017-02-23T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgjpHbD1cb_bGHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgiwpEgnkVIjz3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugj3v0gqenbpS3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgiMAV2WUQbo3HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgjPO1aWk3kRM3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgiaWn-BMIFxdHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugjesjn2d2Is3XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjGnJ_vguQsu3gCoAEC","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"unclear"},
{"id":"ytc_Uggz-8DSC64i2XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UghST1ICt0Ozk3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}]
```
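A response like the one above can be consumed downstream with a small validation step. The sketch below is a minimal, hypothetical example (not part of the tool itself): it assumes each record must carry the comment `id` plus the four coding dimensions shown in the table, and it treats the stray trailing `)` the model sometimes emits as something to repair before parsing. Field names come from the raw response; everything else is illustrative.

```python
import json

# A two-record sample shaped like the raw batch response above, ending
# with the malformed ")" that the model emitted instead of "]".
raw = (
    '[{"id":"ytc_UgjpHbD1cb_bGHgCoAEC","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"indifference"},'
    '{"id":"ytc_UghST1ICt0Ozk3gCoAEC","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"resignation"})'
)

# The comment ID plus the four coding dimensions every record must carry.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codes(text: str) -> list[dict]:
    """Repair a trailing ')' if present, parse, and drop incomplete records."""
    text = text.strip()
    if text.startswith("[") and text.endswith(")"):
        text = text[:-1] + "]"  # common malformation: ")" instead of "]"
    records = json.loads(text)
    # Keep only records that have all required fields.
    return [r for r in records if REQUIRED <= r.keys()]


codes = parse_codes(raw)
print(len(codes))              # 2
print(codes[1]["emotion"])     # resignation
```

Dropping incomplete records (rather than raising) keeps one malformed item from discarding the whole batch; a stricter pipeline might log or re-queue them instead.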