Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytr_UgzVqCYWY… — "You could say I'm clueless for having this belief, but I don't think it will eve…"
- ytc_UgyvFlVbz… — "I think that Musk's desire to silence. The NT Ssa is a major reason why he wants…"
- ytc_UgzntxqJJ… — "What's going to happen is we'll live in AI-driven socialist societies where work…"
- ytc_UgwJareZh… — "I look at it from a more mundane perspective. Random person wanted go to another…"
- ytc_UgyeEWe5Z… — "While I think some artists are more naturally talented as art comes easier to th…"
- ytc_UgxnMtGUU… — "I bet AI bros ask professional artists for commissions and then say that they ar…"
- ytc_Ugwx6B5ks… — "I can't draw for shit, so I could see myself using "AI Art" as a tool to help vi…"
- ytr_UgzqZX0zF… — "That's a lovely name for a car! Sophia is a name that signifies wisdom, which ca…"
Comment

> AI needs huge processing centers and power, I feel more than supply of hardware, the supply of energy will be the immediate problem and this is the only reason - if any - will slow down the widespread of AI and robotics imo. The cost benefit of AI services will depend heavily on the cost and abundance of energy and when ever it becomes cheaper and abundant, human services will be taken over. World will be completely different in 10yrs and probably domestic humanoid robots may compete with car market for units sold. I think it will be very interesting if you could bring an AI 'economist' to podcast, to discuss how economically practical the commercial launch and spread of AI and mass scale robotics. Because we know AI is coming, but we need to know practically how fast it will come.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Timestamp | 2025-09-05T07:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxVt4kxkFLmEpntZNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxoJYOXdh8tHFDyg0t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx28lb4tkC1xZ4N9wl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwlfgElt0_jl7yN7954AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxVKn0iM4d5w76c7rJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxD5tcQV5p4IvW1Qo94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzwo3t7ymIzX-SXaxx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxbBuxOvm90Tf3cC-J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyQnDSNu1S_WQn1Fld4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyDmG3y8I6nWWQ9Cq94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
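Since the model returns one JSON array per batch, looking up the coding for a single comment means parsing that array and indexing the records by ID. The sketch below shows one way to do this; it is a minimal illustration, not the project's actual pipeline code, and the `index_codings` helper name and the record-validation rule (dropping records missing any of the four dimensions) are assumptions for the example.

```python
import json

# Abbreviated raw batch response, using two records from the array above.
raw_response = """
[
  {"id":"ytc_UgxVt4kxkFLmEpntZNx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzwo3t7ymIzX-SXaxx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a batch response and index codings by comment ID,
    skipping any record that lacks one of the expected dimensions."""
    records = json.loads(raw)
    return {
        r["id"]: {k: r[k] for k in DIMENSIONS}
        for r in records
        if "id" in r and DIMENSIONS <= r.keys()
    }

codings = index_codings(raw_response)
print(codings["ytc_Ugzwo3t7ymIzX-SXaxx4AaABAg"]["policy"])  # regulate
```

Indexing by ID also makes it easy to detect when the model dropped or duplicated a comment: compare the keys of the returned dictionary against the batch of IDs that was sent.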