Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
"The irony and karma is so poetic: Labor jobs got made obsolete by machine automa…" (ytc_UgwiBwP-N…)
"@Creating-AmericaYeah but there is a difference from not hard to getting a robot…" (ytr_UgzBMwBS9…)
"On the other hand, Star Trek Holo-deck were AI art in-world and we never even di…" (ytr_Ugzitk-Pq…)
"5:42 I agree that certain sw engineering positions will be around for a while de…" (ytc_UgzKWiXLc…)
"Also, because the same tech that Wayback uses to archive sites, is used to train…" (rdc_ohaqwmb)
"17:12 one reason is humans aren’t gonna build it for you…..fuck Ai……and social m…" (ytc_UgxzBtDOQ…)
"LLM development seems logarithmic to me, not exponential, and I think we've alre…" (rdc_n7qgpep)
"It falls apart when you start calling everything/anything "evil" EDIT: this is …" (ytr_Ugw5PnVG-…)
Comment
It's fine to get mad at the 1st controversy but you can't do anything about it. Technology advances and people lose opportunity. That's how society works.
The 2nd controversy is the interesting one imo. I do empathize with artists who work hard to create a style to have it used to train AI. It's just difficult to find a solution. Some people don't care about ethics so "the honor system" won't help artists. You could make laws banning it. It will be difficult to enforce those laws. Very tricky topic
youtube | AI Responsibility | 2023-03-27T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyOBlHak1ILz8zDWzJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxgw3wRb6Ww8BsFdxl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6-JBzyY-Ia6qw9Cx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgytMrtGOk-z86AVjZ54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugzy6MqjrAh3mqwMsjZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwnZz8l7OGaBzh1ig14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz8HTz3KDYsbkB7l0x4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgyQfBDcLjH9eR5iirt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyKCg1hQrjnXksOo4p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFrrKzUg76jjoO8Wd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
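The raw response above is a plain JSON array with one object per coded comment, each carrying the same four dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and matched back to a comment ID; the `lookup` helper is illustrative, not part of the tool, and `raw` holds two entries copied from the response above:

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_Ugz8HTz3KDYsbkB7l0x4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgyOBlHak1ILz8zDWzJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw_json: str, comment_id: str) -> dict:
    """Return the coding for one comment ID, checking all four dimensions are present."""
    for entry in json.loads(raw_json):
        if entry.get("id") == comment_id:
            missing = [d for d in DIMENSIONS if d not in entry]
            if missing:
                raise ValueError(f"coding for {comment_id} is missing: {missing}")
            return {d: entry[d] for d in DIMENSIONS}
    raise KeyError(comment_id)

coding = lookup(raw, "ytc_Ugz8HTz3KDYsbkB7l0x4AaABAg")
print(coding["policy"])  # prints industry_self, matching the Coding Result table
```

The ID used in the usage line is the one whose coding (distributed / mixed / industry_self / resignation) appears in the Coding Result table above, so the lookup can be checked against the rendered table directly.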