Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- that is absolutely terrifying! next they will have ai change the text of article… (ytc_UgzY0rEAd…)
- "We stole the entire internet to train our AI, how *dare* you let a Chinese co… (rdc_m9gid3o)
- Short term memory AI which is been used constantly is in the development stage o… (ytc_Ugxrl-WgG…)
- We don't use algorithms because they're better. We use them because we're lazy a… (ytc_Ugw_BHceC…)
- We know the mechanics of how we built it. That is not the same as knowing why it… (rdc_j8vy9ea)
- I will worry about AI when it wants something. Like when it wants to fuck or eat… (ytc_UgxXr87X9…)
- SOLUTION: As people in America, and eventually around the world, lose their jobs… (ytc_UgyWeDw2V…)
- Do a video about the problems of AI and a potential robot apocalypse, and how we… (ytc_Ugj4vFwy4…)
Comment
I wouldn't say good or bad for europe, I would say it's necessary for europe and the world. In my eyes, this is only a rough framework and still needs a lot of work for us to be safe from negative consequences. We need to not only regulate the implementation, but also set limits on the creation of AI. With AI getting more sophisticated, we have to take ethical considerations. I don't want to see a future with a handful of corporations dominating markets by use of proprietary AI. Transparency is key.
Source: youtube | AI Responsibility | 2024-09-23T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw1-kDZfUOgwn9Xmf14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZD2nmS4Njfg_0HKd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyOKjtBE92ElsntcFl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy4Oj8yRRp0Rb3hJnp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwWzMwtOexLgKkZzg54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy4b6M9EZKJ9fuey4p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxGqgg7BN5t7zT-MAl4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwfOvGoqQe5Pj4RndZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyBxwf47HDfUH4QJoB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwW9m4xKh9Yvjn9RrV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
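The "Look up by comment ID" feature above can be sketched as follows. This is a minimal sketch, assuming the raw response is a JSON array of coding rows like the one shown; the function name `lookup_by_comment_id` and the `raw_response` variable are illustrative, not the tool's actual API.

```python
import json

# Raw batch response for a coding run, copied verbatim from the dump above
# (truncated to two rows for brevity).
raw_response = """[
{"id":"ytc_Ugw1-kDZfUOgwn9Xmf14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZD2nmS4Njfg_0HKd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"indifference"}
]"""

def lookup_by_comment_id(raw: str, comment_id: str):
    """Parse a raw LLM batch response and return the coding row for one
    comment, or None when the ID is absent from the batch."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup_by_comment_id(raw_response, "ytc_Ugw1-kDZfUOgwn9Xmf14AaABAg")
print(coding["emotion"])  # fear
```

In practice a tool like this would build the ID-to-row index once per batch rather than scanning the list on every lookup, but the one-pass scan keeps the sketch short.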