Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I miss when people were accused of tracing and “copying someone’s art style” ins… (ytc_Ugz-dTwtg…)
- What if ai becomes so smart that we would have to put our money into something t… (ytc_UgzA_2zA8…)
- Lul, just because companies are replacing customer support with chat bots does n… (ytc_Ugz9af74f…)
- I'm trying to figure out how to limit AI slop... and really am disturbed to hear… (ytc_Ugz7fVsIX…)
- He keeps saying ‘we have to stop it.’ Who’s ’we?’ What power does the ordinary p… (ytc_Ugz4mhFaE…)
- It wouldn't be possible as future's robot will work on concept of humnoid ai mea… (ytc_Ugzq7l3Kb…)
- Madagascar is one out of 54 (or more depending on whether you count disputed ter… (rdc_dpc857d)
- Hahahaha best example how you can't trust the news... everybody knows the ai is … (ytc_UgymKLEw8…)
Comment
With respect to this conversation (roughly 42 minutes in): AI is not the first tech developed explicitly to be better than human labor. That has been the case with most tech throughout history and most obviously on an automotive assembly line. The problem is not jobs -- more are always created by tech as Bores' Profs correctly state. The problem is the quality of those jobs. D. Graeber got this. (As did his teacher - M. Sahlins). EDIT: It looks like maybe you get to this issue later on. On the whole - thanks for the conversation.
Source: youtube · AI Responsibility · 2026-04-21T18:0… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxdlxNlMG7JZ_g1Q1R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwIvG5UG39OHmvfBRt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx9yegn2CggmQaxgS94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwQ3Ihl32s4wHScXAh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwMC4K_HBc5s_QA8V54AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzZFEbG6SMOW4LvNjl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_Ugwa0_-Wu5d_T7l6jNp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxQFnAxaik9gdosJJV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz1jTThpEism_zuPOd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwKEzwBsfbejQpOcIR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
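The raw response above is a JSON array with one object per comment, each coded along the same four dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step might parse that array and index it by `id` — the helper name `index_by_id` and the two-record sample below are illustrative, not part of the tool:

```python
import json

# Illustrative two-record sample in the same shape as the raw LLM
# response above (a JSON array of per-comment codings).
raw = '''
[
  {"id": "ytc_UgzZFEbG6SMOW4LvNjl4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgwKEzwBsfbejQpOcIR4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

def index_by_id(raw_json: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    records = json.loads(raw_json)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw)
print(codings["ytc_UgzZFEbG6SMOW4LvNjl4AaABAg"]["policy"])  # → industry_self
```

Indexing once and looking up by ID keeps each inspection O(1), which matters when cross-referencing many coded comments against their raw model output.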