Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "This will be a regular occurrence for anyone who uses AI to just breeze through …" (ytc_UgyFX6gJJ…)
- "My work is future predictive algorithmic data modeling of economics. My models …" (ytc_Ugw7rIu5F…)
- "Those whom create, maintain, lobby, and implement, must be taxed and regulated a…" (ytc_Ugy_cXgSo…)
- "Altman trying to drop banana peels to the other racers in the AI Mario Kart. *…" (rdc_jkh6ggu)
- "why must robots look realistic? they should should concenrate on building them: …" (ytc_UgzGOsPxd…)
- "On a slightly different note, but the same. Nobody can fake what you see in anot…" (ytc_Ugz7vNNDW…)
- "this isn’t some future threat. AI tools are being used by militaries to kill civ…" (ytc_UgyA0IxXa…)
- "@douglaserb1 It is. Judging from how things are developing in China, it will be …" (ytr_UgwEvVbag…)
Comment
AI Act contains various important points that must be known by individuals, in addition to AI technology producers. These days Personal Data Protection is on the top talk. Cybersecurity and Compliance professionals need to perform effective and relevant AI risk assessments. AI Act is about safety including data safety therefore regulations compliance and risk assessments are now the needs of the institutions. There are different risks like UNACCEPTABLE RISKS, HIGH RISKS, and LIMITED RISKS. Deeply understanding these "Risk Categories" in AI Act may help in reducing the risks of reputational and financial losses, that may be caused by the misuse of AI technology.
AI Act should be read for more details and understand the roles and expectations of AI technology producers and users.
youtube · AI Governance · 2023-09-18T16:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzraShlI86E_-lCZ0F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz91RouMvvzXJtF_gF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxBva5G-p5XIxJklNd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugx1EnMGyFfWKPfbx814AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzKdciRpcelhP2UdOx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxh7kabQnABryH-bLx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw3r3Txg45vREnXNhh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwmh70SaxUzb2RBpK94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzOODS3dPQ4EZFL1Bl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgySIu3ruV5qFlK5oFB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
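The raw model output is a JSON array with one record per comment, each carrying an `id` plus the four coding dimensions shown in the result table. A minimal sketch of parsing and validating such a batch — the field names come from the response above, but the allowed value sets are an assumption inferred only from the codes visible here, not an exhaustive codebook:

```python
import json

# Allowed values per dimension -- an assumption inferred from the codes
# visible in this response, not the project's full codebook.
ALLOWED = {
    "responsibility": {"company", "government", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"approval", "outrage", "mixed"},
}


def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records.

    A record is kept if it has an "id" and every coding dimension holds
    a value from the ALLOWED sets; anything else is silently dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid


raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"approval"}]')
print(len(parse_batch(raw)))  # 1
```

Dropping malformed records rather than raising keeps one bad model output from discarding the whole batch; a stricter pipeline might instead log the rejected IDs for re-coding.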