Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "1:03 nice to hear that the guy from Stanley Parable has had his voice cloned. No…" (ytc_UgxEPal8N…)
- "it makes you/me feel how much we fail, but to honestly say, that you or me is ab…" (ytr_UgzG7dbDU…)
- "If artists are \"blue bloods\", born with a talent for creating any type of art, t…" (ytc_Ugzezp5pE…)
- "AI model to do all the illegal shit you want to do. But it’s legal now since it’…" (ytc_UgwrsKU_I…)
- "The majority or almost absolute majority of people in this simulated world are A…" (ytc_UgyC611-S…)
- "This is me in a specific discord server. Half the mfs there have AI pfps.…" (ytc_UgxJKpn-d…)
- "@SamuelHuckaby-q1k exactly, these people in power don't even understand social m…" (ytr_Ugyhxlrrx…)
- "Yeah? Than, where are the JOBS at? Nothing like 389 LinkedIn views, 400 applicat…" (ytc_UgzIUv4lO…)
Comment
> I only dislike the concept of developing AI with emotional centres. "True AI" something capable of free thought would most commonly be associated with creativity and human creativity is most often based upon our emotions. The very idea of an engineer sitting in a lab running tests to ensure the AI is capable of experiencing fear, sadness, or pain is a horrible though and one that most often leads to another Skynet.

Source: youtube · Posted: 2013-08-19T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwyzTTBUQx7ff4FHnh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyye3ZVRQqGb1lrS_J4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzejwUsqHpf4IDuYTN4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzZfDiafXY4ZzXbrbR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwyY_iItOIceQI3FHN4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwx5y8v4Rpqqztli_14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugy7adHAmjtC1LDPiAF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugws8rVfX9jYUT7B50p4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxWKjhUU9icdBMvYzN4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwIoiNayH1xLNbULCp4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
```
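The raw response is a JSON array with one coding object per comment ID, which makes the "Look up by comment ID" step straightforward: parse the array and index it by `id`. A minimal sketch (the variable names are illustrative, not part of the tool; the `raw` string is a one-element excerpt of the response above):

```python
import json

# Excerpt of the raw batch response above: one coding object per comment ID.
raw = (
    '[{"id":"ytc_UgwIoiNayH1xLNbULCp4AaABAg","responsibility":"developer",'
    '"reasoning":"deontological","policy":"none","emotion":"outrage"}]'
)

records = json.loads(raw)               # parse the JSON array of codings
by_id = {r["id"]: r for r in records}   # index by comment ID for O(1) lookup

coding = by_id["ytc_UgwIoiNayH1xLNbULCp4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer outrage
```

The dimensions in each object (`responsibility`, `reasoning`, `policy`, `emotion`) correspond one-to-one with the rows of the Coding Result table.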