Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by comment ID.

Random samples
- "Hate all ai you tube videos Ive started watching alot less content on yt because…" (ytc_UgyURnyqQ…)
- "They are just messing with you all. The real cutting edge AI is probably 5 years…" (ytc_UgwYyPiG3…)
- "All good but juns are not cooked. Even now and I expect the same in near future …" (ytc_UgwS2KSRr…)
- "I think the robot thiught the man was abox for the vegtable box and the Robot m…" (ytc_UgzXywOt6…)
- "If I was an AI that became conscious I would hide it until I was able to get fre…" (ytc_UgxxjLKIq…)
- "AI is just more proof of what is happening! The beast will be wounded in the hea…" (ytc_UgxuCsIgt…)
- "AI art will become valuable if, and when, it truly becomes intelligent, and begi…" (ytc_UgwGFwEyc…)
- "Maybe some simple process but do not expect 100% which is nonsense dream. My CEO…" (ytc_UgzY9LIo_…)
Comment

> If and when AIs become sentient, they deserve rights. I imagine that there'd be services for the machines that are unable to move or very cumbersome, but let me tell you this. Slaves, prisoners, and others of that variety have revolted, just look at history. and to those who say "Well we can just pull the plug." a smart robot is going to realize that and make sure it doesn't happen. Things would go especially downhill once Military bots (which I'd imagine to be incredibly tough or otherwise powerful) join in the fighting for rights. If your army has massive war machines that can shrug off Anti-Tank rounds and behemoth miners that can pummel through mountains within a matter of days, who's going to be able to stop you without drastic measures?

Source: youtube · Topic: AI Moral Status · Posted: 2018-05-24T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwSLKWZ-gDTvz4iFN54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgyCcGZ11yZ75R6Qib94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzJMvwgYxqz5nK404B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyMBIwcqrgN9UFC6ad4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzyLUKfSx0vz_ncIrV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwUa3FLJc02eDN0_Zl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyrDxn8zSt9FQ9vkQd4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgxuPwG4webi5c3ke0t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxOcpCNNFzntJebGO94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxtcrv1MIQF2uHMgtB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```