Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I use AI in my job. It is helpful but it isnt great. The hype situation and outc…" (rdc_nns8c1z)
- "@JeffBilkins As opposed to all the inhuman drivers licking their lizard eyes? If…" (ytr_Ugzy90w21…)
- "No rights for robots! They are meant to be used! it won't be much of a deal, we …" (ytc_Uggixgp2E…)
- "When an artist displays his art publicly he consents to human analysis, praise, …" (ytc_UgwSswaQ7…)
- "How many jobs involve, making, distributing or fixing junk that was designed to …" (ytc_UgwKlNSku…)
- "i hate ai prompters, what ur doing is NOT ART what ur doing is stealing art and …" (ytc_UgwSkNPWU…)
- "Why would you even fight a robot like that. Even if they did not program it to …" (ytc_UgwA-WL2f…)
- "Hey Charlie, sorry to have a nuanced opinion: "it takes talent to use camera" i…" (ytc_UgyNTl3SF…)
Comment
I may not get it, but the occupations that are in danger to be transferred to AI are mainly the intellectual ones. I hardly imagine how a program can replace those basic occupations, i.e a cook, a baker, a gardener, a farmer, a baby-sitter/kindergarten educator, yeah a plumber or a janitor/a cleaning person or a nurse or a beautician, you know what I mean. The list of first hand occupations is long. Then I don't understand what can I do as a simple listener to a podcast, to modify this deadly trend. Why is he talking to us, lay people and not to the developpers of these AI programs?
Source: youtube
Topic: AI Governance
Posted: 2025-09-06T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugyfi_vIfD5HJDCfyCB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyqV0OvTNM3wGNr2694AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzUp2CR1zwnfKjOY294AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgziwU0Yd7Bw39vMPyx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyjfjwTdPr3e7CFFTx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw-Sf2cvRzTvIFcdsp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_QCgfiFOIYiRhvWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyeCxNZn3-HRX0mNBt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxxylKtLDXXwrh8TFB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugyg_81S1Gt34F4KoUt4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
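Downstream tooling needs to parse and sanity-check this raw model output before the codings are stored. The sketch below is a minimal, illustrative example (not the tool's actual pipeline): it parses a two-record excerpt of the JSON shown above, validates that each record carries an `id` plus the four coding dimensions from the table (responsibility, reasoning, policy, emotion), and builds the ID-indexed lookup the inspector relies on.

```python
import json

# Excerpt of the raw LLM response shown above (two records for brevity).
raw = """[
 {"id":"ytc_Ugyfi_vIfD5HJDCfyCB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgyqV0OvTNM3wGNr2694AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""

# The four coding dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def validate(records):
    """Raise if any coded record lacks an id or one of the four dimensions."""
    for rec in records:
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"bad record {rec.get('id')!r}: missing {missing}")
    return records

records = validate(json.loads(raw))

# Index by comment ID, mirroring the "look up by comment ID" view.
by_id = {rec["id"]: rec for rec in records}
print(by_id["ytc_UgyqV0OvTNM3wGNr2694AaABAg"]["emotion"])  # → approval
```

A real pipeline would also restrict each dimension to its allowed label set (e.g. `emotion` in {fear, approval, outrage, …}); the allowed vocabularies are only partially visible in this excerpt, so that check is omitted here.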