Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Autonomous vehicles are a life saving technology, and whilst I usually like your…
ytc_UgwP3yFf4…
At 13:52, I totally get what they mean about AI art! I sometimes have these wild…
ytc_UgzR185Jj…
@Unpopular-World-Records just looked it up and it seems like a smart way to do …
ytr_UgwwtY-BZ…
9:30 Jeff don't have a $50bn but Jeff probably has his life on the line against …
ytc_Ugwtf-V1c…
The biggest risk of AI is the people in charge of it.... the tech oligarchs.…
ytc_UgzodcGKw…
Very bias spin here. Why do we let robots learn on the same roads as people? I…
ytc_UgzRm5_Hc…
Im not particularly impressed with her. I’m half way thru the interview and I’m …
ytr_Ugx0HtKUL…
AI can only do what it's programed to do. The AI wasn't creating problems it was…
ytc_Ugx4diO-6…
Comment
The question is, who actually wants this future? People see the glamourous side of futuristic AI from movies where they assist in all ways and do the boring jobs while the humans go to work and so on and it looks great. But in reality, it will be the reverse. The bots will take the humans jobs, and the boring jobs become the jobs for us humans. Leading to a vicious revolution of job loss and poverty. Affecting those who have lost their jobs 1st, but then affecting the millionaires of today that rely on those people to purchase their products for their business to thrive. Alarming. We had a great balance, where IT was there as a tool created by us to help and entertain us, not replace us.
youtube
AI Governance
2025-10-08T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzbcXCuHnjr2u7JPFR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx1OeyMIHrBRkFpJfl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyymEPoHjznUjS7nQl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzUkDFg4EAeRWiK9fp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxL6f6ezyQGRdN4lKx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwJrOjz91_1GnEUOtl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxHaUoBzlPN-YFDVz54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzRp2PkNfa7FFqpcnh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxGv9E33z5PqYXASKV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxTtEkRTddB2gc--e14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
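A raw response like the one above can be turned into a lookup table keyed by comment ID, which is what the "Look up by comment ID" workflow needs. The sketch below is illustrative, not the tool's actual implementation: the function name `parse_coding_response` and the strict key check are assumptions; only the five field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response format shown above.

```python
import json

# Two rows copied from the raw response above, shortened for the example.
raw = '''[
{"id":"ytc_UgzbcXCuHnjr2u7JPFR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTtEkRTddB2gc--e14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# Field names taken from the response format above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(text):
    """Parse a raw coding response and index rows by comment ID.

    Hypothetical helper: raises ValueError if any row is missing
    one of the expected coding dimensions.
    """
    rows = json.loads(text)
    coded = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing keys: {missing}")
        # Keep the four coding dimensions, keyed by the comment ID.
        coded[row["id"]] = {k: row[k] for k in REQUIRED_KEYS - {"id"}}
    return coded

coded = parse_coding_response(raw)
print(coded["ytc_UgxTtEkRTddB2gc--e14AaABAg"]["emotion"])  # fear
```

Indexing by ID also makes it easy to join a model's codes back onto the original comments, since the `ytc_…`/`ytr_…` IDs appear on both sides.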