## Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up directly by comment ID or by browsing the random samples below (a lookup sketch follows the sample list).

Random samples:
- `ytc_UgzPR4-fg…`: "I feel like it’s a push to worst motions, in serious use though I doubt an ai wo…"
- `ytc_UgziYNcbj…`: "Man I wanted to hear about how AI can kill us not fucking 2001 a space odyssey s…"
- `ytc_UgwgrTmyQ…`: "The reason why the AI thinks that black people need to be a lot sicker to receiv…"
- `ytc_UgwxlJ7_2…`: "this is what i think they should do for ai safety for agi and asi they should ma…"
- `ytc_Ugwoq9Oxq…`: "They FIRE YOU because You are DISPOSABLE and because too many QUESTIONS get in t…"
- `ytr_UgxBvYMIk…`: "It sounds like you're feeling a bit overwhelmed! The conversation with Sophia hi…"
- `ytc_UgzbEIJtV…`: "I've been out of work for a year now. I was replaced by software. If AI were to …"
- `ytc_UgyGvJivA…`: "5 minutes? For how much buzz about ai slop I assumed it was like 30 seconds but …"
### Comment
> Just because you're scared of a technology doesn't justify turning your brain off. Like gun bans, an AI ban will effectively guarantee that the only people with them have enough malicious intent to circumnavigate the rules. However, unlike guns, that kind of asymmetry could be a collectively fatal mistake with AI. Do not reflexively call for bans; think about the implications of your actions first.
>
> In this case, I think that AI should be open source. This will not guarantee safety, but it does guarantee a huge number of people would be complicit in our collective demise. Not a few programmers working long hours in the Pentagon or in a bunker in North Korea. It also gives us the venue to test AIs via video games and to know their behavior in test conditions before adopting them properly.
Source: youtube · Posted: 2018-04-12T11:4…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
### Raw LLM Response
```json
[
  {"id":"ytc_UgwIm_dRZr0_Uimi2kl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwhXV0iHf7A42bhkEB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzvlvnvgpcvgB4AXSx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyY9zLNl7E20JzEDnN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzg8jTERfTec0xqbuV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyDGW-tG_p8s-_gQQd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzXvz05HiU8QWSFG9N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy26yf4GyBFElYfbsJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgymSMdpAgNXxQ1TLP14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1-aTeed8ThuJ-sJZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
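The raw response is a plain JSON array with one record per comment in the batch, which makes it cheap to sanity-check before merging. A minimal validation sketch; the allowed labels below are only the ones observed in this sample, so a real codebook would likely extend them:

```python
import json

# Category labels observed in this sample; a stand-in for the full codebook.
ALLOWED = {
    "responsibility": {"none", "user", "government", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "industry_self"},
    "emotion": {"resignation", "indifference", "fear", "outrage", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM batch response and list any problems found."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"response is not valid JSON: {e}"]
    if not isinstance(records, list):
        return ["response is not a JSON array"]
    problems = []
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append(f"record {i}: {dim}={rec.get(dim)!r} not in codebook")
    return problems
```

An empty return value means the batch is safe to merge; any entries point at records that need manual re-coding.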