Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `rdc_o791eqc`: "we need to force some version of the 3 laws into so called AI, and bar it from u…"
- `ytc_Ugymp3_cw…`: "For me, this is a bubble, a huge one. People are reliying on AI as if its someth…"
- `ytc_Ugzn9NtEn…`: "My second take-away is: what model of car do you drive!? I've been looking for …"
- `ytc_UgzEXf-Bk…`: "Humans biggest enemy is a human. only AI will destroy people lifes brains and we…"
- `ytc_UgwpoL8RT…`: "Honestly, AI isn't the devil. it's just a tool. can unethical things be done wit…"
- `ytc_UgyS9uE4E…`: "My mate the other day said what's AI for, trying to explain, well what i can und…"
- `ytc_UgxDfG3jk…`: "Not only chatgpt, perhaps in most of ai products in the market if they sell fact…"
- `ytc_Ugy17BdUh…`: "10:39 this is the first person I've ever heard say that "AI is going to magicall…"
Comment
Usually, I’d be worried about stifling innovation, but honestly, the AI development industry desperately needs to be stifled, at least until humanity figures out what we’re doing.
In the end, I doubt this will have much of an impact on the global AI arms race, but it might just slow down competition just enough to allow AI companies abroad some breathing room, in order to implement better safety systems.
The EU AI safety board should also act as an advisory body for foreign businesses, before their own country’s government gets their shit together, as this is a truly global issue.
youtube · AI Responsibility · 2024-09-23T09:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw1-kDZfUOgwn9Xmf14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxZD2nmS4Njfg_0HKd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyOKjtBE92ElsntcFl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy4Oj8yRRp0Rb3hJnp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgwWzMwtOexLgKkZzg54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy4b6M9EZKJ9fuey4p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxGqgg7BN5t7zT-MAl4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwfOvGoqQe5Pj4RndZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyBxwf47HDfUH4QJoB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwW9m4xKh9Yvjn9RrV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
```
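The "look up by comment ID" step above amounts to parsing the model's JSON array and indexing it by `id`. A minimal sketch (the two rows are copied from the response above; the variable names are illustrative, not from the tool itself):

```python
import json

# Raw batch response from the coding model (truncated to two rows here;
# IDs and labels are taken verbatim from the response above).
raw = '''
[
  {"id":"ytc_UgwWzMwtOexLgKkZzg54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwfOvGoqQe5Pj4RndZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
'''

# Index the coded rows by comment ID for constant-time lookup.
by_id = {row["id"]: row for row in json.loads(raw)}

coding = by_id["ytc_UgwWzMwtOexLgKkZzg54AaABAg"]
print(coding["responsibility"], coding["emotion"])  # prints: company fear
```

The same index supports validating that every coded `id` actually matches a comment in the batch before the result is stored.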