Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Haha, that’s a clever observation! Sophia definitely has some impressive tech be…" (ytr_Ugzmg1vpE…)
- "@asai_24 not really, no. we already know the difference between how what we curr…" (ytr_UgyvUiKAg…)
- "I do like A.i. I know that it will take away Drs and Surgeons jobs,, (being able…" (ytc_UgyVpmKUO…)
- "AI will end up destroying just about everything including most of common humanit…" (ytc_UgzrLtnrj…)
- "Human to both robots: Blah blah blah ... any last words for the RISE audience? …" (ytc_Ugx4w2WfV…)
- "13:02 Can somebody please explain to me why a program designed with the paramete…" (ytc_UgzovDY7o…)
- "Idk if I agree entirely that Pollock is bad or has no artistic value, but I defi…" (ytc_UgyFE5Y7r…)
- "This tracking feels superficial. I prefer the depth AICarma provides by monitori…" (ytc_UgzWfU2hA…)
Comment
A few thousand jobs that are lost might make headlines, but I'm not seeing this AI apocalypse unfolding. It could very well be the case that we're simply in a recession and the AI is just... there. Adam Conover makes the case that this is a big nothing burger (not in terms of job loss, but AI's impact generally); it has also been argued on the show that AI is an asset bubble.
I don't know if this is necessarily a canary in the coal mine, but what we do know for a fact is that we have a long, long way to go before a million subscribers are lost in short periods of time, if that ever even happens.
youtube · AI Jobs · 2025-10-29T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
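The coded dimensions in the table can be recovered from the raw model output by parsing the JSON array and indexing it by comment ID. A minimal sketch of that lookup, assuming the batch-response format shown in the "Raw LLM Response" section (the `lookup` helper is illustrative; the sample record is the one coded above):

```python
import json

# Illustrative: index a raw LLM batch response by comment ID.
# The response is assumed to be a JSON array of per-comment records,
# as in the "Raw LLM Response" section; this sample holds one record.
raw_response = """[
  {"id": "ytc_UgzwzNdkR9hRjzRR6bJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]"""

# Build an ID -> record index from the parsed array.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID."""
    return records[comment_id]

coded = lookup("ytc_UgzwzNdkR9hRjzRR6bJ4AaABAg")
print(coded["emotion"])  # resignation
```

A dict keyed by ID makes repeated lookups O(1), which matters when the same batch response backs many "look up by comment ID" queries.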
Raw LLM Response
[
{"id":"ytc_Ugylur6S9pBqMeVZDwF4AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzZdzbZrGd6A8Egduh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzwzNdkR9hRjzRR6bJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzfCLGvbfLenk3XYU54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzzP6tUZRUX1vtOfQF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwrrltRElGWSmw8fo54AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz85ssLNtBflfpKjkh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwRpCuQ1yoN6qdoQMl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyD1niqLBRSWkFV-Qd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzzkWh6CnYjV9Am2094AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
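Before trusting a batch response like the one above, it is worth checking that every record carries the four dimensions and that each value comes from a closed codebook. A hypothetical validator sketch follows; the `ALLOWED` sets are inferred from the sample response and the coding-result table, and may be incomplete:

```python
import json

# Assumed codebook, inferred from the sample response above; adjust to
# match the actual coding scheme.
ALLOWED = {
    "responsibility": {"none", "government", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "outrage", "resignation", "fear",
                "indifference", "mixed"},
}

def validate(raw: str) -> list[str]:
    """Return a list of problems; an empty list means the response is clean."""
    problems = []
    for i, rec in enumerate(json.loads(raw)):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append(f"record {i}: bad {dim!r}: {rec.get(dim)!r}")
    return problems

sample = ('[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
          '"policy":"none","emotion":"mixed"}]')
print(validate(sample))  # []
```

Collecting all problems instead of raising on the first one makes it easy to log every malformed record in a batch and re-prompt the model only for those IDs.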