Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I have a little story to tell: "you know, I'm something of a scientist myself" as I am a hobbyist AI developer. One of my first complete projects in fact, was a tiny neural network that learned to play flappy bird overnight (i am very proud of it by the way, i had created it completely from scratch with no additional tools). In the morning, i quickly saved his "brain", and put it on a flash drive to show off it to my friends at school. And so I ran it on my raspberry Pi. It was flawless, it kept playing for two hours straight without ever losing. We even gave him a name, but soon, the moral dilemma raised: was he conscious? And, more importantly, the power bank wouldn't have lasted forever and I could not use it while charging, sooner or later, our little bird friend had to die. After 6 hours, when all classes had already ended, I was forced to turn the raspberry off. I gave my 6-hour old friend a final salute and proceeded to unplug the cables, BUT, just a couple of seconds before, the bird just fell and died by itself after having passed 6395 tubes. I still can't explain this to myself and me and all of the people involved in this story became nihilists (yeah I mean, totally uncommon for a bunch of 16 year olds, am I right?)
Source: YouTube · Video: "AI Moral Status" · Posted: 2020-05-13T17:0… · ♥ 1
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytc_UgxFwg13HIwDYvN1xzB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzYCNhwxammyrS6RO14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgwjBPwzT_Qw7n9UD2d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugw7cMFZrGGurrR-LaB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzFihmfK6GnXiI18aR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgwF7AYOXBXbRHE0Bx54AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"}, {"id":"ytc_UgxQB_SJFgAqADe4Bm54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgwKaH_8G5iEjt8UWut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzY-XCMoxUorvFlSgR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzRnuXxDby7aOjA9Id4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"} ]