Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- so essentially exporting fracking to the world. damn. and we talk so much shit… (rdc_d0fhcn2)
- "but can they ever outpace human imagination?" uh by they you mean "humans"? t… (ytc_UgyuMjJyf…)
- It's almost like the people who set the guidelines for the LLM's are importing t… (ytc_Ugx3bE2H0…)
- welcome to dystopia y'all. This will only get worse as AI gets better. better ju… (ytc_Ugx3vbR72…)
- Don't worry, they will, give it 10 years max and you will have robot police know… (ytr_UgziwQpxN…)
- The fact that his brother is Jazza makes this entire video hit exponentially har… (ytc_UgyTUgBe8…)
- Statistically, driverless vehicles are FAR safer than human drivers. Look up the… (ytr_UgwmESAAa…)
- This is a based and true take. LLMs are a probability machine trained on some go… (ytc_Ugzcxr2o7…)
Comment
I'm starting to believe that whatever possible benefit a truly thinking, feeling, and experiencing AI could possibly provide to humanity is far outweighed by our almost inevitable demise from it. In the interest of our survival as a species, or at the very least as a cohesive civilization (as pointed out, it wouldn't even take a conscious AI to completely fuck up our entire social structure), AI development really should be halted until we know what the fuck we are doing.
Platform: youtube · Topic: AI Moral Status · Posted: 2023-08-21T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
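
Each coded comment reduces to one categorical value per dimension, plus a timestamp for the coding run. A minimal sketch of that record in Python, assuming the value sets observed in the raw response below are representative; the actual codebook may define more values than this one batch happens to exercise:

```python
from dataclasses import dataclass

# Value sets observed in the sample batch below. These are an assumption
# drawn from one response, not the authoritative codebook.
RESPONSIBILITY = {"developer", "ai_itself", "distributed", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "mixed", "unclear"}
POLICY = {"ban", "regulate", "none", "unclear"}
EMOTION = {"fear", "approval", "mixed", "indifference", "resignation"}

@dataclass
class CodingResult:
    comment_id: str      # e.g. "ytc_UgyWgr1V5d5bs3tppft4AaABAg"
    responsibility: str  # who the comment holds responsible
    reasoning: str       # style of moral reasoning
    policy: str          # policy stance expressed
    emotion: str         # dominant emotion
    coded_at: str        # ISO-8601 timestamp of the coding run

    def validate(self) -> None:
        """Raise if any dimension falls outside the observed value sets."""
        checks = [
            (self.responsibility, RESPONSIBILITY),
            (self.reasoning, REASONING),
            (self.policy, POLICY),
            (self.emotion, EMOTION),
        ]
        for value, allowed in checks:
            if value not in allowed:
                raise ValueError(f"unexpected code: {value!r}")
```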
Raw LLM Response
[
{"id":"ytc_UgySZ6aLxO7ZpreByjx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhpURSR2IJEDSpv494AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyWgr1V5d5bs3tppft4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwhkKosGzX3vt7JSYR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6l9iUT3XAEriWuNF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzK6rJckH_Tb0w0wqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzQ2GXzis34278cFMZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzPTds8zGVirYTk1hx4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxotep83lhNTvUs1cF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKJBMcpWtO68-y1qV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
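
The raw response is a JSON array with one object per comment in the batch, each keyed by its comment ID. A minimal sketch of how the look-up-by-ID view could parse and index such a response; `index_by_comment_id` is a hypothetical helper for illustration, not part of the pipeline:

```python
import json

def index_by_comment_id(raw_response: str) -> dict[str, dict]:
    """Parse one raw batch response and index each record by comment ID."""
    records = json.loads(raw_response)
    return {record["id"]: record for record in records}

# Usage against the array above (stored as a string in `raw`):
#   codings = index_by_comment_id(raw)
#   codings["ytc_UgyWgr1V5d5bs3tppft4AaABAg"]["emotion"]  # -> "fear"
```

Note that the record for ytc_UgyWgr1V5d5bs3tppft4AaABAg carries exactly the values rendered in the Coding Result table above, which is presumably how the per-comment view is populated.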