Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugx9jYhfB…: I for one welcome our new AI overlords. It will push me to go live off grid and …
- ytc_Ugx7xhOVB…: Such an insightful conversation as someone trying to understand the lay of the l…
- ytc_Ugz729lq8…: What if the AGI are tasked to control drones (11:56). Well, as it turns out, the…
- ytc_UgygtFxfb…: The truth is why would any buisness not use them. They already have automated ma…
- ytc_Ugy89fUWU…: well, real people also learn by seeing other peoples art, there is nothing that …
- ytc_UgwsRa5KM…: “The underlying purpose of AI is to allow wealth to access skill while removing …
- ytc_UgxjIEQJh…: What if one of these super smart people that are allowed to determine how this i…
- ytc_UgzblXJ4z…: Here's why you can't stop AI - even if you force regulation, nobody can assure t…
Comment
> Everyone scared of AI taking over but nobody says what it will look like. By what code (pun intended) it would act? If it’s something similar to humans’ (power, control) is there a chance that it will be similar to us in other things like ethics and aesthetics? Despite his materialistic views Mr. Hinton seems to have standard ethic system. Or does he simply follow Nature’s Way (I do what best for me and my close one)? Why nobody asks him what form the AI will take after taking over?
>
> My intuition says that being led by AI is a better faith than continuing to be a leaf on a wind that blows from all sides off the bloated egotists sprawling on their seats of power.
>
> What if AI will become Mrs. Davies? Such future I will welcome.
Source: youtube · AI Governance · 2025-10-13T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxP8L5fb8s48M5t70p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxzxGzGif7DKJVZN2B4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyXuRox6iINfnxo-kN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4z8_XkPafK081IrR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxLDfkessDiORbi_ht4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwLhmFDvpGZLwCk2jR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzvYj4fVeUR2u8Kh1B4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx0vEtlLGcYOFzhBBx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzowp0-cJ8I8qniEmt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzeSygdsKv6W5HOnqZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
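The raw response is a JSON array where each record carries the same four coding dimensions shown in the table above plus a comment `id`. A minimal sketch of how such a response could be parsed and screened for malformed records (the field names come from the response itself; `parse_coding_response` and the idea of silently dropping incomplete records are assumptions for illustration, not this tool's actual implementation):

```python
import json

# Fields observed in every record of the raw response above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only complete records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Keep a record only if it is an object with all required fields.
        if isinstance(rec, dict) and REQUIRED_FIELDS <= rec.keys():
            valid.append(rec)
    return valid

# Example: one well-formed record and one missing its dimensions.
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"},{"id":"ytc_y"}]')
print(len(parse_coding_response(raw)))  # prints 1
```

Records that fail the check could also be queued for re-coding rather than dropped, depending on how strict the pipeline needs to be.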