Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "I like that for many people the concept of "work" is reduced to working on a com…" (`ytc_UgxW5Regb…`)
- "I have an infa-red camera that can see like daylight .why do they not use them. …" (`ytc_UgwxBsy9i…`)
- "If they wanna get facial recognition right they should ask China....... they hav…" (`ytc_UgysNHOtd…`)
- "He lost me when he said we are in a simulation! I agree with the risks of AI, bu…" (`ytc_Ugwx49GKJ…`)
- "I'm sure he knows, no one will slow down. I just hope the US's counter AI securi…" (`ytc_Ugw_HBqgM…`)
- "i tried to chat with AI using the same rules and qns...now I'm freaking out…" (`ytc_UgyiXpfSG…`)
- "Doing a report on self driving cars struggling to find content then I watched th…" (`ytc_UgzuLZK7g…`)
- "Ha, yes, that will become a problem. He's been voicing our videos for years (bef…" (`ytr_Ugy01BtmX…`)
Comment

> There’s so much missing from this interview. Obviously they only have so much time but the blasé response of “when automation or ai takes your job, find one that involves a creative element that ai can’t replicate” is absolutely bullshit. So many of the entry level jobs and jobs that are accessible to those most in need are vulnerable to replacement and there is no other simple alternative for those kinds of jobs. People need them for money and to exist, not because we want to exercise our creativity and find purpose in work. Tyson is also totally blind to the value of philosophy in a way that infuriates me (as a person getting their PhD in philosophy).

youtube · AI Moral Status · 2025-07-23T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugz9d6FyGRZ40uldQbB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugw1-lhBPCSDAc3hloF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzFq1DyfE22Z7fNsUt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugym36X_SkmSHa1zArV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgzlwebDTaRCoosYVgV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgyQFDLpT8XOmXVuy-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwAPxa0IOyujo2DNl14AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgyYsKh6_bbOXh54Fcp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgxWeBhhKEUXeF0MLmN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgwlIJ56sXAXOifqag94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]
```
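The raw response above is a JSON array of per-comment records, each carrying the four coded dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and sanity-checked follows; the allowed value sets here are inferred only from the values visible in this sample, and the real codebook may include others.

```python
import json

# Two records copied verbatim from the raw response above, as a worked sample.
raw = '''[
  {"id": "ytc_Ugz9d6FyGRZ40uldQbB4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyYsKh6_bbOXh54Fcp4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

# Allowed values inferred from this sample only (assumption, not the full codebook).
DIMENSIONS = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "liability", "ban"},
    "emotion": {"fear", "indifference", "mixed", "outrage", "approval"},
}

def validate(records):
    """Split records into (valid, errors); errors pair an id with its bad fields."""
    valid, errors = [], []
    for rec in records:
        bad = {dim: rec.get(dim) for dim, allowed in DIMENSIONS.items()
               if rec.get(dim) not in allowed}
        if bad:
            errors.append((rec.get("id"), bad))
        else:
            valid.append(rec)
    return valid, errors

valid, errors = validate(json.loads(raw))
print(len(valid), len(errors))  # → 2 0
```

Validating against a fixed value set like this catches the common failure mode of LLM coders drifting off-schema (e.g. inventing a new emotion label), so malformed records can be flagged for re-coding rather than silently merged into the results.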