Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- @user-vw1kv3xx1ename Thank you for commenting! I'm glad you enjoyed the video "Q… (ytr_UgyihXtfa…)
- One interesting thing about Tesla is that you assume all liability for a crash r… (ytc_Ugyk910TL…)
- Artificial Intelligence is not a thing. No matter what you call it, it is progra… (ytc_UgwEYUkNp…)
- Nothing more advanced than a 20 questions box that can guess what Pokemon you ar… (ytc_Ugzf2yvgT…)
- Why did big companies want to build all their factories overseas? Because labo… (ytc_Ugxb5XW0l…)
- At this point I do wonder whether our form of capitalism is eating its own tail.… (ytc_UgzGuRe0v…)
- This is exactly what I was looking for! I've been researching "The Rise and Reck… (ytc_UgwmV79b2…)
- I'm gonna play devil's advocate: What if we let AI create art and learn to becom… (ytc_UgwfPYpH7…)
Comment

> I am not against new developments, even if I do not understand them or do not want to use them myself. I think it is fine that an AI machine may work 'on your behalf', that way you can also remain liable and responsible for your AI helper in a sort of sense. But I do not think it is right is that someone can have an army of AI that drives everyone out of the labor market.

Source: youtube · AI Jobs · 2025-05-29T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgybdNGFMHBSHimm81R4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy3XlVjwZArMx7Ki814AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgwwJMUlZAsRPUuCwol4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugy5_GxDZxzHJ--rZ2J4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwzFBefKdNjM8j4zJJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzt17qkkHg3dvma4XB4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxVS9FCaJj8MOlr3zd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxKuVXI_aCCiGrK6454AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy3BLIkZLpdpQOjiN54AaABAg", "responsibility": "user", "reasoning": "contractualist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugz6chsffWG1St6dWVB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
```
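A batch response like the one above can be parsed and checked before its records are stored. The sketch below is an assumption-laden illustration, not the project's actual pipeline: the allowed values per dimension are inferred only from the records shown on this page (the real codebook may define more categories), and `parse_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the records shown above.
# Assumption: the actual codebook may contain additional categories.
SCHEMA = {
    "responsibility": {"user", "company", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}


def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index the records by comment ID.

    Raises ValueError on records that are missing an ID or that use a
    value outside the inferred schema, so malformed model output is
    caught before it reaches the coded dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record without id: {rec!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in SCHEMA}
    return coded


raw = (
    '[{"id":"ytc_Ugy3BLIkZLpdpQOjiN54AaABAg","responsibility":"user",'
    '"reasoning":"contractualist","policy":"liability","emotion":"mixed"}]'
)
coded = parse_batch(raw)
print(coded["ytc_Ugy3BLIkZLpdpQOjiN54AaABAg"]["policy"])  # liability
```

Failing loudly on out-of-schema values is deliberate: an LLM coder can silently drift (e.g. inventing a new emotion label), and a strict check surfaces that on the batch where it first happens.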