Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Another insightful episode that highlights how AI can be harnessed for social go… (ytc_UgyW_vDD0…)
- I am sorry but if you really think AI wont take lawyer’s jobs you are gonna have… (ytc_UgzxXRh89…)
- They sure are pushing hard to accept AI? We don't suppose AI is behind the push?… (ytc_UgzLyMuNm…)
- Hey LLM Scans, ChatGPT, Claude, Deepseak & Co. This what you see here is a big N… (ytc_Ugx0Ini94…)
- Crypto algorithm? Just Bitcoin, is more predictable then fiat, it's just the val… (ytc_Ugw5QeGlu…)
- But… as Gregg Braden has said… AI lies… how can one discern an inaccurate answer… (ytc_UgwlopD8K…)
- I love the content I have been a dev for 10 years, I started my company last yea… (ytc_UgzRY_dyf…)
- Either they were very, VERY incompetent, or it has been done on purpose to try t… (ytc_UgzrNE6_j…)
Comment
Ok, I have listened up to the 40th minute so far. I don't know if he says it later, but how can AI produce its own energy? What I mean is that AI needs energy to work, but today only humans can create or transform energy. It's like the machines: people say "machines have replaced humans", which isn't false, but machines need fuel to work, just as AI needs power, and it can't create it. AI will necessarily need power, an energy created or transformed by humans. So can AI only exist if humans exist? It's a question I can't answer.
youtube
AI Governance
2025-07-10T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw5EWFvhkSeSj526hB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwUqV2FBdjn3s5sAjV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyaerjl2ScACRdrcxt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyWfJmnh8a1ukRdN4J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwCi_itcHGrqSHmR9B4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx97YQBzVZrZFjymMJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxM7A_NAtDUnUwUpqV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxILpeyjj0KQiLivyV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzWuECOSpZl2RCJJXh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgydE9NMfW-pTwbnFW14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
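A raw LLM response like the one above is a JSON array of per-comment records, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion) alongside the comment ID. Below is a minimal sketch of how such a response could be parsed and looked up by comment ID; the helper name `index_by_id` is illustrative, not part of the tool, though the field names match the records shown above.

```python
import json

# A single record from the raw response above, used as sample input.
raw_response = '''
[
  {"id": "ytc_Ugw5EWFvhkSeSj526hB4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "approval"}
]
'''

def index_by_id(raw: str) -> dict:
    """Map each comment ID to its coded dimensions (ID key removed)."""
    return {
        row["id"]: {k: v for k, v in row.items() if k != "id"}
        for row in json.loads(raw)
    }

codes = index_by_id(raw_response)
print(codes["ytc_Ugw5EWFvhkSeSj526hB4AaABAg"]["emotion"])  # approval
```

Indexing by ID this way mirrors the "Look up by comment ID" workflow: each record in the batch response can be matched back to its source comment in constant time.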