Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Where we need to be worried about AI is the unintended consequences of well-inte…" (ytc_Ugwm23aCe…)
- "Slender Mane They have, but once AI is developed either by a couple of intellect…" (ytr_UgiOhCIYT…)
- "If we could see education go from what it is now creating cloned worker bees to …" (ytc_UgyvFtxXS…)
- "There won't be any real "AI". Just like there was no real "Vaccine". However, wo…" (ytc_UgyZbGK5O…)
- "I dont know why there has to be a versus? Just because Elon Musk says so? This w…" (ytc_UgxsneXLy…)
- "Humans : Background check, red flag law, training for months. Robot : Here ya go…" (ytc_UgxoXxH7s…)
- "The whole point is to lower costs so products cost less to make. But in reality …" (ytc_UgzRxljvn…)
- "How about instead of steering the public attention to AI, resolve the situation …" (ytr_UgwgXlBQO…)
Comment

> Basically no one ever got the timing of technological breakthroughs right, and Hinton was no exception. In the long run, when AI become insanely superhuman at radiology for instance, the human radiologist will be a bottleneck and a liability. Superhuman medical AI will be the easiest sale ever

youtube · AI Jobs · 2025-10-27T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_UgwupOpXypLLJwXghiF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgznOf_nQnpdAnyMyb94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwlBZ7rWkhz-QZIDnh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgySmuUKZjFr5odqWNh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyb-d_-38Apr_wiaY54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy9ZXdEfmsJivucKL94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwr0NyYezkSmv5qW1l4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx0bMvZDSMLbX4x_wd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxa6PslC8k9H6b1-014AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFqHKL48Rwa0AaUot4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
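The lookup-by-comment-ID view above can be sketched as a small parse-and-index step over the raw response. This is a minimal illustration, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown, while the `by_id` index and the two-row sample payload are assumptions for the example.

```python
import json

# A two-row excerpt of the raw LLM response shown above (same schema).
raw = """[
{"id":"ytc_UgwupOpXypLLJwXghiF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzFqHKL48Rwa0AaUot4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]"""

codes = json.loads(raw)

# Index by comment ID so any coded comment can be inspected directly.
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_UgwupOpXypLLJwXghiF4AaABAg"]
print(row["emotion"])  # approval
```

Keeping the raw array alongside the per-comment index preserves the exact model output while still allowing the "look up by comment ID" access pattern.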