Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or browse the random samples below.
- "I love that the intro to the interview was made using AI voices 😂 oh the irony 🙈…" (ytc_Ugx0JLRij…)
- "I don't agree, I have no programming experience but needed to code for a persona…" (ytc_Ugz_8SZ2B…)
- "I work in API security. I secure banking assets. I want to clear up the assumpti…" (ytc_UgxbTggKK…)
- "Paul Virilio said when we invent a technology we invent the related disaster - …" (ytc_Ugzqugwni…)
- "I ask a genuine question to the AI ‘art’ defenders: What is the point of art? Do…" (ytc_UgwNfjdFL…)
- "I think I own enough index funds to not worry. But probably AI should buy more a…" (ytc_UgxXVhMih…)
- "I have a suspicion that we exaggerate the sophistication of human intelligence a…" (ytc_UgxNpRRFA…)
- "Elon is a scammer so finding loopholes to make more money is what he's going to …" (ytc_UgzyRG49C…)
Comment
I mean if we get a super intelligence that makes people obsolete, and the people in charge are people like capitalist countries like we have today where they by law must go for profits and maximize profit. Then like in basic economics people will become numbers and a cost and they will always try to cut down costs to increase profits. So what's the point of for instance feeding and housing billions of people who's only existence is a cost to society since like mostly 90% or maybe even 99% of the people will not contribute. It's really scary to think what will happen at that point. Though of course if AI will always be under control and the people in charge never become dictators it won't happen but what are honestly the odds of that?
youtube · AI Governance · 2025-06-19T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
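A coding like the one in the table above can be sanity-checked against the category values that appear in this page's samples. A minimal validation sketch follows; note the allowed-value sets are inferred only from the responses shown here, not from the full codebook, so they may be incomplete:

```python
# Allowed values per coding dimension, inferred from the sample responses
# on this page; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed", "indifference"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return the dimension names whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding result from the table above.
coding = {"responsibility": "company", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "outrage"}
print(validate_coding(coding))  # -> [] (all values valid)
```

A non-empty return value flags a dimension the model coded with an out-of-schema label, which is worth inspecting in the raw response below the table.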
Raw LLM Response
```json
[
  {"id":"ytc_Ugx1VycVHCGFi8bzAbZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugykh-a_TtmyX4KKr3t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw6M8lSbiH3hnrVIuB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx0HkYLdltjTrCeoi14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx-IY1h8e9xOKmImSJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxYt01ZwYF13Vij5LZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxFV5avURTD3JZ7VmZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwqDaMXWrPFCIP3XLR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyKMV3gBeRpEmOpqeZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxxH1SiHBQFdpOQ34Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"fear"}
]
```
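Since the model returns one JSON array per batch, keyed by comment ID, looking up a single comment's coding is a parse-and-index step. A minimal sketch follows; the two-item literal below is a hypothetical excerpt that mirrors the structure of the batch response above:

```python
import json

# Hypothetical two-item excerpt of a batch response; the field names
# match the raw LLM response shown above.
raw_response = '''
[
  {"id": "ytc_Ugx0HkYLdltjTrCeoi14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugx-IY1h8e9xOKmImSJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
'''

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a batch response and index each coding object by its comment ID."""
    return {item["id"]: item for item in json.loads(raw)}

codings = index_codings(raw_response)
print(codings["ytc_Ugx-IY1h8e9xOKmImSJ4AaABAg"]["policy"])  # -> regulate
```

Indexing by ID up front makes the "look up by comment ID" view a single dictionary access rather than a scan over every batch response.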