Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Is it legal to text in a self driving car? PS. Not by common sense but by law.…" — ytc_Ugzqjs0Ld…
- "Its just a load of nuts and bolts. With a computer behind it. Watch out however …" — ytc_Ugy2pZSu2…
- "Wow Sal, this is an amazing take on AI and how it can be integrated into educati…" — ytc_UgwbZIqAE…
- "we are all just reacting on our environment on a logical way, just like technolo…" — ytc_UgyUZ8FkY…
- "When I was still studying Physics in the university decades ago, i found certain…" — ytc_UgwL465IJ…
- "I’m not totally against ai but I’m not also with them. For me, ai could be used …" — ytc_UgzfcScri…
- "Well, I didn’t see any AI kill anybody so is this quick bait? Because there was …" — ytc_Ugyk32AFQ…
- "We have got to stop Silicon Valley and Big tEch. Keep in mind A LOT of Big Tec…" — ytc_UgzPBWVhr…
Comment
Here is what I don’t understand.. Demis is suggesting all these things for young future graduates to consider such as immersing themselves in the tools… all the AI tools … to understand how they are built, understand how to modify them, fine tuning and system prompting etc. Yet he earlier in this interview said that AGI ( coming in the next two to five years) will be self improving, self coding and can future version themselves. Doesn’t sound like humans will be needed. Isn’t he contradicting himself? AGI is coming and it will displace a lot of jobs and when that happens it will have a domino effect on the economy because who will want to go to a theme park, hair salon, barber shop, get lawn care services… etc when we don’t have jobs to pay for anything. UBI you say? Well who would spend their money on wants? The economy will hit rock bottom before we realize the trade off of AGI.
Source: youtube · 2025-06-09T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzp2LL46qW8BaSKG2R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSSkszKafESEd1kHJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy5wc1JyBhBgjZZbvZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxDcYL0UDartkZi-fR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugz-MqAtBhxHmwTYxiF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxyTizTg7Ue0dlSDMR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyEJk-gYt8JYewiunl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyvONBHtF84BS_kpQh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzkckSJ__55vekEt2x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyawWFwQtaGXnl9e_d4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
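The "look up by comment ID" step above can be sketched in Python: the raw LLM response is a JSON array of per-comment codes, so indexing it by the `id` field gives constant-time lookup of any comment's coding. This is a minimal sketch of that lookup, not the tool's actual implementation; the two rows embedded here are copied from the response shown above.

```python
import json

# Raw LLM response: a JSON array of per-comment codes, as shown above.
# Only two rows from the batch are embedded here for brevity.
raw = """[
  {"id": "ytc_UgzSSkszKafESEd1kHJ4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyawWFwQtaGXnl9e_d4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Index the batch by comment ID so one comment's codes can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

# Look up the comment shown in the Coding Result table above.
result = codes_by_id["ytc_UgzSSkszKafESEd1kHJ4AaABAg"]
print(result["responsibility"], result["emotion"])  # developer mixed
```

If the model ever returns duplicate IDs or drops a comment from the batch, a dict comprehension silently keeps only the last duplicate, so a validation pass comparing the returned IDs against the submitted batch is worth adding in practice.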