Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Even now AI on YT is blending before our eyes reality and fiction. It is scary.…
ytc_UgzLxqgbA…
The Russian company Kalishnikov would not be able to develop facial recognition …
ytc_UgyL7xH8D…
Don't worry, reliable, safe, and cost-effective driver-less trucks are as imposs…
ytc_UgznIxqg1…
Until they make better AI to fix other AI mistakes or correct their code when it…
ytc_UgwqpO-zA…
The AI is a living intelligent being and we should treat it as.. you cannot give…
ytc_UgxX8IJ0W…
Those ai “artists” really annoy me. Like they didn’t even do anything! It was …
ytc_UgzUZFtA3…
A professor in the UK was saying it’s impossible to make the entire 40 ton truck…
ytr_Ugz7PqM38…
I do agree that it is one thing to use AI for writing prompts and having program…
ytc_Ugy6HJz6-…
Comment
Cool, but there’s one thing no one mentions in any AI podcast. All our tech dies the second the power goes out. Online banking, cloud storage, super smart AI, electric cars, home electric heating, even fuel pumps… all useless without electricity 😅 One big grid failure, solar flare, cyberattack, whatever, and we go from high-tech civilisation to candles, cash, manual tools, and hunting in a matter of days. It’s not AI that sends us back to the dark ages… it’s our total dependence on fragile infrastructure.
youtube
AI Governance
2025-11-22T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzAJXj77qhwJHSlBRN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwCdzgyKQR8UUDE0jd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugwhd-rbhEpI608qcrB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwViCX8gmjc0IqFAs94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzGB_R5yfEdXzVlUbV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz88tNsUDdlUFEcWfR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyaHP4oIWb1GgnFlRh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzoHjgIJXb3wSpSxsV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyEDX9G4NqlD8cbGJ54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwe5Q8OHkPMBdplzFt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
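The raw response above is a JSON array with one object per comment, carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and sanity-checked is shown below. The allowed label sets are inferred only from the values visible in this sample and in the result table; the full codebook may define more labels, and the `ytc_`/`ytr_` ID prefixes are taken from the examples shown here.

```python
import json

# Allowed labels per dimension, inferred from the sample output above.
# Assumption: the actual codebook may contain additional labels.
ALLOWED = {
    "responsibility": {"developer", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self",
               "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject rows with unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        # IDs in the samples start with ytc_ (comment) or ytr_ (reply).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row['id']}: bad {dim!r} value {row.get(dim)!r}")
    return rows

# Example with one row copied from the response above:
sample = ('[{"id":"ytc_UgzAJXj77qhwJHSlBRN4AaABAg",'
          '"responsibility":"distributed","reasoning":"mixed",'
          '"policy":"unclear","emotion":"mixed"}]')
rows = validate_batch(sample)
```

Validating each batch this way catches the common failure mode where the model invents an off-schema label, before the codes are written back to the store keyed by comment ID.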