Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Yup. I been thinking this. There are ZERO confidentiality or HIPAA protections f…" (ytc_Ugy6fSOwF…)
- "...someone programed the AI on your video's thumbnail to vomit blood after being…" (ytc_UgypaCX8j…)
- "robot do not have free thinking. robots r dumb. the earth is flat motionless and…" (ytc_UgzVCfDDW…)
- "First of all, artificial neural network are not AI, Robotics are not AI. Please …" (ytc_UgyV4QEKI…)
- "Except it's true. China has an IP theft problem, and the best prevention is not …" (rdc_e27epfp)
- "QA here. I have a junior dev colleague who obviously vibe codes everything. I ha…" (ytc_Ugx0JnlVN…)
- "Just finished the prologue and first two chapters of her audiobook, very much lo…" (ytc_Ugw2Fu3pi…)
- "Let's see if Aurora's in-charge can put his money where his mouth is. Have his f…" (ytc_UgwuptTBH…)
Comment
If you build a robot that’s supposed to help you make the world better and if they have no conscience no sense of feelings or as we say gut feeling or emotion to help the world that means humans are the problem in the world. so if you program something to help make the world better, be prepared to be marked as the enemy of the world. We build some thing that’s supposed to be smarter than us to help us fix our mistakes if something so intelligent is supposed to help fix the mistakes then you will eliminate the mistakes rather than keep patching it with a Band-Aid.
youtube · AI Moral Status · 2022-06-06T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz8ZAr9rjbp6h6ObvF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw0YgLAS61JNjAJK5h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxyNEOgkriZGu5d_KJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwg3jaKmM3JGyd22MZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzgQey6rtoqRTc-SE94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwqBXG44UMLkXPC-7R4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyvpvUVenTSKRXl9zJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwVF52ogB3TQAtN5294AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzcTBzrkBAZYqO3TDJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwhdgQVvX48IDwmYrF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
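A raw batch like the one above can be parsed and checked against the coding scheme before it is stored. The sketch below assumes the label sets are exactly those observed in this batch; the full codebook may include more values, so `ALLOWED` is an assumption, not the project's definitive scheme.

```python
import json

# Label sets observed in the batch above (assumption: the real
# codebook may define additional values for each dimension).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"virtue", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed", "approval"},
}


def validate_batch(raw: str) -> dict:
    """Parse one raw LLM response and reject malformed or off-codebook rows.

    Returns the rows indexed by comment ID, matching the dashboard's
    look-up-by-ID view.
    """
    rows = json.loads(raw)
    by_id = {}
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing 'id': {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
        by_id[row["id"]] = row
    return by_id
```

Keying the result by comment ID makes the "inspect any coded comment" lookup a plain dictionary access, and a `ValueError` on an off-codebook label catches the common failure mode where the model invents a category outside the scheme.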