Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “We’ve already seen the sci-fi movies from the matrix I robot ghost in the shell,…” (`ytc_UgwUM4b2Z…`)
- “AI art generate random. Traditional art still required for specific shapes. Comb…” (`ytc_Ugz8FMmz8…`)
- “Personally I believe calling your self a AI artist is like commissioning a piece…” (`ytc_UgyqKc_Di…`)
- “@sgtdrake Using it as a tool implies you're using AI as a helper. When in fact …” (`ytr_UgzNYUd1V…`)
- “We appreciate your concern. If you'd like to learn more about AI and its advance…” (`ytr_UgwYj4BCW…`)
- “Here is the question: Is there a 'Lucy" effect. Humans currently use less than …” (`ytc_UgwnqwK9O…`)
- “is it possible that corps are using AI as excuse to lay off ppl that didnt have …” (`ytc_Ugzfc52Bl…`)
- “@Christian-w8q dude llms by itself is just one technology that's a part of gener…” (`ytr_Ugz_tglHd…`)
Comment
Many good points were made, but some parts of AI were left out. He worked at Google, a company that profits from user data, so he likely knew the risks. Living in a highly capitalistic country, it’s hard to avoid the pressure of big business. Still, he made an important point. AI can be dangerous in the wrong hands, but if we train it to help people, we can reduce the risks. Thanks for the video.
youtube · AI Governance · 2025-06-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyufjiCXOAq61peCFN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxUtEL8rK925dv0kKt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzQRmTy62MN9QpI3Cx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyYSJvUhdgg_5Nnv5p4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzbGEhlkSFf7TjN-qp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwqLWi-mGspTNRSzmh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgywplESWDm1hTekgxt4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz7zRr76_8pZ3z2fdd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz7rlydfJ6iqdgsBpZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugxncn8Hb0xce6oeer94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
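A minimal sketch of how a raw batch response like the one above can be parsed and indexed for per-comment lookup. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the sample response; the `index_by_id` helper and the validation rule (skip records missing any expected field) are illustrative assumptions, not the tool's actual implementation.

```python
import json

# Two records copied from the raw batch response above; a real response
# would carry the full batch.
raw = """
[
  {"id":"ytc_UgyufjiCXOAq61peCFN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz7rlydfJ6iqdgsBpZ4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
"""

# The four coding dimensions shown in the Coding Result table, plus the ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(payload: str) -> dict:
    """Parse a batch response and index records by comment ID,
    dropping any record that lacks an expected field."""
    records = json.loads(payload)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

codes = index_by_id(raw)
print(codes["ytc_UgyufjiCXOAq61peCFN4AaABAg"]["emotion"])  # indifference
```

Indexing by ID mirrors the page's lookup-by-comment-ID behavior: once parsed, each coded comment is retrievable in constant time by its `ytc_`/`ytr_` identifier.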