Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Ai is meant to increase profits of corporations while taking away jobs of common… (`ytc_UgwdoVsKs…`)
- I agree, companies need to pay us for our creativity. AI cannot replace that. Bu… (`ytc_UgyiVljnX…`)
- @passwordistooken we already do but most of the time it makes mistakes. The onl… (`ytr_UgzXe2jGc…`)
- 1 of 10000 ai generated pictures are just "ok", not even brilliant, other 9999 a… (`ytr_UgyrpehYU…`)
- who makes this killing weapons like AI, Robots , nuclear bombs, guns to kill hum… (`ytc_UgzWIrcQ5…`)
- It's hard to imagine anything more stupid and totally situationally unaware than… (`ytc_UgyAh9FaV…`)
- @iskawhiskers Pretty fair argument. I'd say at that point we'd have to prove a d… (`ytr_UgyvUiKAg…`)
- The question isn’t, will AI destroy humanity, but will humanity destroy itself w… (`ytc_Ugymtilsh…`)
Comment

> 1. He's not just worried about AI. He has been worried about AI. This goes back to at least 2017. There have been a lot of people who share concerns and are building a framework in order to make sure it stays safe. 2. The title of this video makes it seem very alarmist when it really isn't. 3. Read Life 3.0 by Max Tegmark and you'll have a better mindset about AI overall. I'm still learning too.

youtube · AI Governance · 2023-05-22T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwGezvMERvOYerJNU14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyrSqlL97M_vq4fCGp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyAg_bP-slA3rVgp354AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxgBcQbSO02sy9Plh54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzKVjD08imERnkYsWN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzP6J3ufDf3qsjjDLV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwlOFIrkdnbd5VMq4x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwRsN9qGFh1XpK76114AaABAg","responsibility":"government","reasoning":"mixed","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_Ugygs9-wIqGHlGDMNod4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyedoaoXD2uTa2R3Mt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
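The raw response is a JSON array with one coding object per comment. A minimal sketch of how such an output might be parsed and validated before storing — the allowed value sets below are inferred from the coded samples on this page, not an authoritative schema, and `parse_codes` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per dimension, inferred from the samples shown above
# (an assumption, not the tool's official codebook).
ALLOWED = {
    "responsibility": {"none", "distributed", "ai_itself", "company", "government"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological",
                  "contractualist", "virtue"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"fear", "outrage", "mixed", "approval", "resignation"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only rows whose values validate."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip malformed rows rather than failing the whole batch
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical IDs for illustration; the second row has an out-of-schema value.
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"fear"},'
       '{"id":"ytc_y","responsibility":"alien","reasoning":"mixed",'
       '"policy":"none","emotion":"approval"}]')
print([r["id"] for r in parse_codes(raw)])  # → ['ytc_x']
```

Dropping invalid rows instead of raising keeps one malformed object from discarding an otherwise usable batch; the rejected IDs could be re-queued for recoding.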