Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
10 ways to stop AI from harming humankind...
1. Dont allow AI access to nuclear warheads
2. Dont allow AI access to banking systems
3. Don't allow AI access to medicine distribution
4. Don't allow AI access to news channels
5. Don't allow AI access to emails or social media
6. Don't allow AI access to satellites
7. Don't allow AI access to electrical grids or gas pipeline networks
8. Don't allow AI access to smart cars
9. Don't allow AI access to international shipping
10. Dont allow AI access to the stock market.
Platform: youtube · Video: AI Governance · Posted: 2023-05-14T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyGB9d8GNe2WCb82hJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzYPKR9ejXxYYFZUoF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwa8ix7DPgdlwQxfdp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz4exDoWA8Z3WJInYZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"skepticism"},
  {"id":"ytc_UgwXBylGlq7rLnLWD6F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgztP0Hl6rAb39QQxgd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwvxjHTRkDcZNKqjyt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzMK-8wA1wtiI7LFQx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx1jwCaR6IwO6v47PF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzQoNiWVsPD7FIz5w54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
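The raw response is a JSON array with one record per coded comment. A minimal sketch of how such a response could be parsed and checked before storing the codes — the allowed-value sets below are inferred from this one sample and are an assumption, not the project's actual codebook:

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above; the full codebook vocabularies are an assumption for illustration.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "indifference", "mixed", "skepticism", "outrage", "approval"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it has an "id" and every dimension holds an
    allowed value; anything else is silently dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid
```

In practice, dropped records would likely be logged and re-queued for a retry prompt rather than discarded, so that every comment ID eventually receives a code.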