Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `ytc_UgzeyKDf8…` — "The way I describe it is, art is a conversation. The artist wants to tell you so…"
- `ytc_Ugya_3KXZ…` — "This regulation is not a threat to free creation. Indeed it's the freedom of the…"
- `ytc_UgxOj53RT…` — "Just look at the game “Detroit: become human” it might soon become reality if we…"
- `ytc_Ugyq6_yck…` — "I believe that it depends what you do with the art. AI generating takes time, a…"
- `ytr_Ugy7S6ZT9…` — "8:16 4th line. Methodology for normalising safety statistics for partially autom…"
- `ytc_UgykweBgy…` — "Imagine racking up huge amount of debt just to go to college then getting replac…"
- `rdc_e2vslov` — "i wish i could upvote you more. that's a really comprehensive break down of just…"
- `ytc_Ugx_tfBMS…` — "7:33, you can do it with any AI, ask him to create a fictiv alter as part of a R…"
Comment
AI technology needs a lot of energy and resources.
Billions are invested in the hope of profiting from the development in the near future.
The huge hype is also an opportunity for tech companies to raise their stock values.
In my opinion, everything invented by humans has an inherent flaw: humans are no gods, but are always trying to become gods. This is the core problem.
youtube · AI Governance · 2025-12-14T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxqDpMqy3XdbvMNR0l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugz9Xl3XjbVYiI-H0294AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwzRhG6GM45eaxmfu54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwzAplp7_XLfoOwjtd4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgyCB5SWGd2xAPD_OQx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgzLg-yThXvGNTaEbpR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugy37gD3NTo5kU5fbER4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugz63w5DFgfpig8Dwoh4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
 {"id":"ytc_UgyWWbArK49Sqs4TNvR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgxItpRzmxdU1DC3C-R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]
```
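The coding result shown above is recovered from the raw response by parsing the model's JSON array and indexing it by comment ID. A minimal sketch of that lookup, assuming the response is valid JSON (the function name `index_codings` is hypothetical; the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` come from the response itself):

```python
import json

# Two entries copied from the raw LLM response above (abridged for brevity).
RAW_RESPONSE = '''[
  {"id": "ytc_Ugz63w5DFgfpig8Dwoh4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxqDpMqy3XdbvMNR0l4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index each coding by comment ID."""
    entries = json.loads(raw)
    return {entry["id"]: entry for entry in entries}

codings = index_codings(RAW_RESPONSE)
coding = codings["ytc_Ugz63w5DFgfpig8Dwoh4AaABAg"]
print(coding["responsibility"], coding["reasoning"])  # distributed virtue
```

In a real pipeline the raw string may contain extra text around the array, so the parse step would typically be wrapped in error handling before the per-comment codings are stored.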