Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> As selfish human, let's not give robots desire cause the next thing we know we might be extinct because robot is obviously far superior then human and the only reason why humanity have gotten this far is because we were once selfish. We don't need to be selfish now but if there's a chance of us being extinct for the sake of selflessness, then no thanks.

youtube · AI Moral Status · 2022-02-20T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwi3Lj8S8NDEL4Z0WR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzV4IkFkjRHTsBS4S94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxbT4Mcj6VrV0HvNsR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw6K1Thz3PBPt_aO6t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyrx7x1cruKnm2ILmd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugz00Izqs36gzDC-9AR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwv47Osh7qpPWJzgll4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgymiWI2YyFYUSqJFLJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxTwJR_48mJ9TVbcvx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxCiMscU9nhbXpYbAB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
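The raw response is a JSON array with one object per coded comment, each carrying an `"id"` plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of the lookup-by-ID step, assuming only this array structure (the `raw` string below inlines two rows from the response above; variable names are illustrative, not from any particular codebase):

```python
import json

# Two rows copied from the raw LLM response above; in practice the full
# array would be read from the stored response.
raw = """[
  {"id": "ytc_Ugw6K1Thz3PBPt_aO6t4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzV4IkFkjRHTsBS4S94AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"}
]"""

# Index the batch by comment ID so any coded comment can be pulled up
# in O(1), which is all the "inspect the exact model output" view needs.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

row = codes_by_id["ytc_Ugw6K1Thz3PBPt_aO6t4AaABAg"]
print(row["policy"], row["emotion"])  # -> ban fear
```

Note the coding result table for the selected comment is exactly one of these rows rendered as a table; nothing beyond the `id` key is needed to join the response back to the comment record.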