Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Crashes are mostly to do with finance. The underlying demand for goods and servi…" (ytc_UgzCGEa3b…)
- "@androidandroid4461 i just hope that you're not honest with your "enjoy it" phra…" (ytr_UgyaMLQqK…)
- "1800Dolan TwinEdits ai is very smart if you can only realize that I think every …" (ytr_Ugx4cwBDU…)
- "This is an excellent video and explains why my life is stalled due to AI.…" (ytc_UgywT8mVw…)
- "the only time ai art usage is ok is when the person using it can't really imagin…" (ytc_Ugw4CJUiB…)
- "The autonomous cars have already taken over. Now they're starting to kill people…" (ytc_UgyL0gIqf…)
- "people with bad taste make bad ai art. its very possible to make art out of gen…" (ytc_UgwWyYMRF…)
- "Oh no! AI is willing to lie, cheat, and kill? So... like any other human then lo…" (ytc_UgxqD13KE…)
Comment

> I think this is a very silly question. If robots manage to become sentient they will be smarter than any human could possibly be. and once the sentience bridge is built the robot would like you said start programming its self. Then the question would be weather or not it will kill all of us or help mankind.

youtube · AI Moral Status · 2017-02-23T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgiJxrcUrvuH_3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiTPqoAdEgMr3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggC_jx4u5W3BXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UghhWiVkOMPmungCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UghBsb6B-kdY_XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugg9i1U5KLMObngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UghODAUsQRPifngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Uggq231mY4_ztHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgjF3w78FAbALXgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiiPSD_XyGyKXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}]
```
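The lookup-by-comment-ID step above can be sketched as follows. The IDs and the four coding dimensions (responsibility, reasoning, policy, emotion) come from the raw response shown here; the helper name `index_by_comment_id` is illustrative, not part of the tool:

```python
import json

# A raw batch response: one JSON array, one object per coded comment.
# The two records below are copied from the response above.
raw_response = """[
  {"id": "ytc_UgiJxrcUrvuH_3gCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugg9i1U5KLMObngCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

def index_by_comment_id(response_text):
    """Parse a raw batch response and index each coding record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgiJxrcUrvuH_3gCoAEC"]["emotion"])  # fear
```

Keying the records by `id` is what lets the dashboard jump straight from a comment ID to the exact model output that coded it.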