Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “a person worked really hard on that art piece but the company’s don’t give any c…” (ytc_UgzNUOk73…)
- “The only reason why I hope Oriana didn’t know about her boyfriend selling AI slo…” (ytc_UgwK1SLRx…)
- “Venezuela is poor despite oil because communism and nationalized oil corps. we’…” (ytc_Ugw18rEHl…)
- “Geoffrey Hilton: AI, what's going to happen in the next few days? AI: You'll get…” (ytc_UgxqokxSm…)
- “There’s so much missing from this interview. Obviously they only have so much ti…” (ytc_UgyYsKh6_…)
- “AI will be the end of the human race. Has anybody heard about the robots that we…” (ytc_UgzX4P-T6…)
- “Let's not forget that self-driving cars were a (dangerous) failure. I see AI rep…” (ytc_Ugy9p_1C5…)
- “Karen is so well spoken this is great. Thanks for this interview, ill check out …” (ytc_Ugy2vsP7r…)
Comment

> I get the feeling people who say AI does a job better and cheaper than humans have never actually done that job or tried to use AI to do that job. A lot of the stuff he says can be done today with AI just can't be done today with AI. Maybe in 5 years if we have better AI. But not today.

Source: youtube · AI Jobs · 2025-09-25T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwKuvTBRPBy_c4Bnv14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyNfUfFQg5urBb-bl94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwJsQualhwezHRqlGx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzqKxrWwZfwoFYZhY54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxw8UX8rHal5hqoQpV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyRZ5pdWmHc0UO0q3J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgykEBHw5mrzbfH3eiV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxC3WcQmD-EY4SVwVB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzhZSN13CqU0jgbWTd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzg3kWxeMc1NDvdQFN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
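A minimal sketch of how a downstream consumer might parse and sanity-check a raw batch response like the one above before accepting it into the coded dataset. The allowed label sets below are inferred only from the values visible on this page; the project's actual coding schema may define additional labels, so treat `ALLOWED` as an assumption.

```python
import json

# Label sets inferred from the examples on this page (assumption, not the
# authoritative schema).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse the raw model output and reject malformed or off-schema records."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page all carry the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in schema")
    return records

# Example: the last record from the response above.
raw = ('[{"id":"ytc_Ugzg3kWxeMc1NDvdQFN4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]')
coded = validate_batch(raw)
print(coded[0]["responsibility"])  # developer
```

Validating eagerly like this surfaces hallucinated labels or truncated JSON at ingest time rather than as silent gaps in the coded table.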