Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (previews and IDs are truncated in the source):

- "Its AI generated 😅 providing sources is not part of the pipeline (I don’t disagr…" (`ytr_UgzRKRRL0…`)
- "Well, that's how we get to the future, by automating things that we don't need h…" (`ytc_UgxmC1j8C…`)
- "Our current approach for AI will never be appropriate for engineering. LLMs are …" (`rdc_m27rpuu`)
- "@SurprisedCells-ju4ig Thanks for your comment! You've got a point there – fighti…" (`ytr_Ugw4zsj1I…`)
- "This reminds me of Russian war with Georgia in 08. Where their "justification" f…" (`rdc_cfky5v3`)
- "Let’s replace every menial job or task with ai workers so we never have to work …" (`ytc_UgwiAS4ML…`)
- "If YT wants to use AI for performance increases, such as storing videos at lower…" (`ytc_Ugw_UKG3F…`)
- "Let's think of that more in a probability thing. AI is based on probability. The…" (`ytc_UgwoFCxPs…`)
Comment

> He’s delusional. ChatGPT doesn’t remember what one tells him if the situation you are building isn’t something he has often seen before. He has lots of troubles with social spontaneity. If you rely on his judgements, you are basically building your own social abyss

youtube · AI Governance · 2025-12-31T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugym9ohMefdr3NBkIq94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx3xTJgDfXCx269mmN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxslI1nvO3Q7ZU4evV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz9GZi5gcKUUY7kvuZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxmo4ZqXsxDZL-vn414AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugykimhl874RMcy6aOp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZBia6ojYYKc9_Tmt4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxztAjw1G5UxnteRlV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNLh7ASyRH7ywQtXR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxxai-ozeJcpe4dLMh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
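Before a raw batch response like the one above is stored as a coding result, it can be parsed and sanity-checked. A minimal sketch in Python — note that the allowed value sets below are inferred from this one sample response and the table above, so the real codebook may define additional categories, and the function name `parse_coding_batch` is illustrative, not part of the pipeline:

```python
import json

# Value sets inferred from the sample output above (assumption: the
# actual codebook may allow more values per dimension).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "government",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"fear", "indifference", "outrage", "approval",
                "mixed", "resignation"},
}

def parse_coding_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments)
    and attach a _warnings list naming any dimension whose value is
    missing or outside the allowed set."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                row.setdefault("_warnings", []).append(dim)
    return rows

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
for row in parse_coding_batch(raw):
    print(row["id"], row.get("_warnings", "ok"))  # ytc_example ok
```

Rows that come back with warnings can then be routed to a retry or to manual review instead of being written through to the results table.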