Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Three talented persons from different companies embrace this and started a compa…" (ytc_UgzI54TkD…)
- "What's the point of this video ? It honestly feels like it's just trying to tick…" (ytc_Ugz4qgZb0…)
- "Plumber? People need toilets, showers, food prep areas, etc .. what do AI's and …" (ytc_UgzzQeZ0G…)
- "Yeah, I predicted this would happen years ago when chatgpt first came out, it's …" (ytc_Ugxauyz2W…)
- "20:14 Sorry to tell you, but since March of this year AI develops AI. with decre…" (ytc_Ugy1Sq9Bf…)
- "It's just one ai art in a whole sea of them, why this specific one…" (ytc_UgwU9W6Ya…)
- "You all really need to see what's really going down from a detailed historic per…" (ytc_Ugxmo0bmh…)
- "So, the AI learns from data sets, but yet you cherry picked a few stories to poi…" (ytc_UgzTzzQY1…)
Comment
13:56 What I hate about this topic is that, my answer to that question are artists, you need humans to make art, no matter what, yet it's the first field they want to get rid of, even if it's the only thing a human could truly do and benefit from in this scenario.
I feel like AI is being used on the worst places possible. Maybe proffessors only do lectures, but it's social interaction that is crucial. If AI can teach us, then we don't need school, and then we don't need people either because an AI can be our friend, right? AI might be smart, but it can't replace all humans.
youtube · AI Governance · 2025-11-17T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
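Each coded record assigns one value per dimension from a closed vocabulary. A minimal validation sketch, assuming only the values visible in the samples on this page (the real codebook may define more categories):

```python
# Allowed values per dimension, inferred from the samples on this page;
# the actual codebook may be larger. This validator is a hypothetical helper.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems

# The record shown in the table above passes cleanly.
print(validate({"responsibility": "company", "reasoning": "deontological",
                "policy": "regulate", "emotion": "outrage"}))  # → []
```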
Raw LLM Response
```json
[
{"id":"ytc_Ugw7MUUT0JXEFAQT8gJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxxmr4pFpBt2RKaQqB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzz7jC1LRYHsC0tJLV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnEpOpEOaEkQWDDd54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0NwBbht_qIudIB8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzt1tsyBvKbcYWdEDB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzKFIh41Wvl1TA1d5B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgytrFDpxpFTlon6RPV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzSIv1oeeZG1U1ZRWh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxNt5xiBphu9KzXuIl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```