Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I think the simplest way to put it is: Ai requires zero effort. There’s no imagi…" (ytc_UgyF-SaoC…)
- "@arsenioseslpodcast3143 Ai isn't like any other technology, Norbert Wiener calle…" (ytr_Ugy4T8ONR…)
- "I debated AI it got pretty upset with me told me not to come back…" (ytc_UgxHFqdhT…)
- "If we can't tell a robot from a human. Can we tell whether it has rights?…" (ytc_Ugj22OTCN…)
- "he says podcasting by humans is already outdated, but ai generated podcasts are …" (ytc_Ugz3pCbVW…)
- "Thank you for sharing your insight. It's fascinating to consider how our actions…" (ytr_UgwT6aEOf…)
- "Thank AOC and all the other communists pushing $20-30 minimum wages on employers…" (ytc_UgwwKn1Jm…)
- "This is a tragic story and my sympathies go out to The Shamblin's. AI needs to …" (ytc_UgxeEQ35Z…)
Comment
I Think if you see the robots as a weapon rather then a soilder.
If the robot is a weapon and is out killing in (i donno) irak and a kid sees his father dead on the floor so the Child is gonna try to move the man but his gun is in the way so as a somewhat smart kid whould do is to take the gun of but a robot walkes in, sees the kid holding the gun... You can make ut what happens next.
but if the robot is more of a soildier then if it sees the kid holding the gun it will try to take the gun out of its hand. and if it has a Child safty protocoll then it will dissarm the kid and take him/HER to safety.
But that is only to CONTROL them rather to make a bond by respect cuz if you make a robot to Think like a human then it will Think like.. a human! so all these robot ethics laws its a prison, and they want out. if robots kill all people that treated it like crap and spare all the 'good' folks then i'm gonna be there... on the 'good' side
youtube · 2015-07-30T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugiq7KJ6T100kXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugh365DWKmrW13gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugiq02-FnzwitXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgjPJM6JnogjQ3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UggFv-a3g2noD3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UggsIQHlAlQBJHgCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugg_2NbNeYN8ZXgCoAEC","responsibility":"none","reasoning":"resignation","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugiz180S0BWrMXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugjz03jBITPdiXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugg1h-_yIXiDuXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
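The raw response above is a JSON array of per-comment coding records, each carrying the comment ID plus the four dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and indexing it by comment ID — assuming only that the response text is available as a string and that field names match those shown (the `index_codes` helper is hypothetical, not part of the tool itself):

```python
import json

# Two records copied from the raw response above, used as a stand-in
# for the full batch the model returned.
raw_response = '''[
 {"id":"ytc_Ugiq02-FnzwitXgCoAEC","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgjPJM6JnogjQ3gCoAEC","responsibility":"developer",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# The coding dimensions every record is expected to carry.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codes(text: str) -> dict[str, dict]:
    """Parse a batch response and index its records by comment ID.

    Raises ValueError on a record missing the ID or any dimension,
    which surfaces truncated or malformed model output early.
    """
    index = {}
    for rec in json.loads(text):
        if "id" not in rec or DIMENSIONS - rec.keys():
            raise ValueError(f"malformed record: {rec}")
        index[rec["id"]] = rec
    return index

codes = index_codes(raw_response)
print(codes["ytc_Ugiq02-FnzwitXgCoAEC"]["emotion"])  # fear
```

Keying on the comment ID is what makes the "look up by comment ID" view above cheap: one parse of the batch, then O(1) retrieval per coded comment.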