Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing the random samples below.
- "I mean if its really Autopilot and not FSD then it is a completely normal and co…" (ytc_UgxSzFMiD…)
- "Not watching movies from big production companies anymore, if they're gonna use …" (ytc_UgyY-mGFz…)
- "Ok , I believe I’m going to get labeled antisemitic here I’m not however . That’…" (ytc_UgynrCVWb…)
- "why is there an ai generated video summary and the sponsored ad is "fiverr, find…" (ytc_UgzL28Ca0…)
- "A UBI will replace thousands of individual existing federal, state and local soc…" (ytc_Ugx-efLkE…)
- "I'm gonna guess that the AI was trained with Squidward and Squilliam's voices, t…" (ytr_UgzZOTSGr…)
- "One of the main reasons AI can't replace software engineers is that English is n…" (ytc_UgzuMcGI7…)
- "I had a left side ablation done. My doctor used AI to guide the wire to the left…" (ytc_UgzJNFXti…)
Comment

> Funny thing is yes, if they are thinking feeling beings they deserve rights. On the other hand if you have to treat them equal to humans it pretty much removes the robot's purpose of cheap, consistent, reliable labor. If a robot gets breaks, wages, has the ability to do sub par work or slack off like a normal human, why spend large amounts of money to build one when making humans is so much easier? There becomes a point where if a robot becomes too human, there is no real point in being a robot any more.

youtube · AI Moral Status · 2017-02-24T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugh0c4l23P6EYHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgizmdfK6BHeengCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgiS9-lmbu6FW3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Uggg7_XeDnLEkXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgjlRCoviv8l7XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgiAi7l2Sx79l3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgiM-TwLKWJZ13gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UghE_QrjN0MWgHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugi_n0NFADJiGngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Uggf753UlzgQ93gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
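The raw response is a JSON array with one object per comment ID, so recovering the coding for a single comment is a matter of parsing the batch and indexing by `id`. A minimal sketch of that lookup, using two rows copied from the response above (the variable names are illustrative, not from the original pipeline):

```python
import json

# Raw batch response as returned by the model: a JSON array of coded
# comments, one object per comment ID (structure as in the example above).
raw_response = """
[
  {"id": "ytc_Ugh0c4l23P6EYHgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgizmdfK6BHeengCoAEC", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"}
]
"""

# Index the parsed rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UgizmdfK6BHeengCoAEC"]
print(row["responsibility"], row["policy"])  # company liability
```

This is also where a validation pass would fit: any `id` in the response that is not a known comment ID, or any dimension value outside the codebook, can be flagged before the codes are stored.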