Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI is probably safer than what the humans did/do. I'm all for the AI restructuri…" (`ytc_Ugzi4FCIm…`)
- "The Departmetn of Education... Since what OpenAi is and the use of it in Educati…" (`ytc_UgyBfdMwi…`)
- "No, I would say it when you combine both of them together and get law-enforcemen…" (`ytr_UgxASHupX…`)
- "Omfg I got a google ad right as they revealed Magellan was owned by them. Once…" (`ytc_UgzAe_HIB…`)
- "Darling, you and all the artists online please keep poisoning AI. You're absolut…" (`ytc_UgzcQnhgW…`)
- "Yeah, if you think LLMs and chatbots have consciousness, you don't have any idea…" (`ytc_UgzqiAT0Y…`)
- "When i heard of AI i knew it would be everything he said, us being obsolote howe…" (`ytc_UgwgF3q6y…`)
- "@jessicawilliams2726 Hey there Jessica, thanks for commenting! Well, if you give…" (`ytr_UgwpdvvCf…`)
Comment
@ryshask Probably not.
Just listened to an interview with longtime AI safety expert Stuart Russell and he says the CEOs of all these companies know how dangerous their products are, placing the risk of human extinction from them at between 10% and 25%. And they are still forced by US law to go ahead with development.
And that is a conservative estimate of the risk, some AI safety researchers place the risk much higher.
In not long at all there may be no people left to remember the folly of training AI to human level intelligence.
Source: youtube · Video: AI Moral Status · Posted: 2025-08-08T00:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgymKUxTQvRsRPBLKOx4AaABAg.AItS8wjfmhOAJYVaapToHc","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzUXHPpLN4hGUiQ-uZ4AaABAg.AIq67DJoPDVALYDZNhBgsq","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzUXHPpLN4hGUiQ-uZ4AaABAg.AIq67DJoPDVALYFoM5mOw_","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzUXHPpLN4hGUiQ-uZ4AaABAg.AIq67DJoPDVALtLFtxjDtM","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzUXHPpLN4hGUiQ-uZ4AaABAg.AIq67DJoPDVALuUKifpNF1","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxdXusJM5JguuH_hJB4AaABAg.AIoF6zbCq-9AKADjplvz4R","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxdXusJM5JguuH_hJB4AaABAg.AIoF6zbCq-9AN-4ir-fbAr","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxDGLuInGOzKP0w6lR4AaABAg.AImVtyMULi0AItNUGzJCkP","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxDGLuInGOzKP0w6lR4AaABAg.AImVtyMULi0AIuATGXccAE","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxDGLuInGOzKP0w6lR4AaABAg.AImVtyMULi0AIuuueyLQJg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
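The lookup-by-comment-ID feature above can be sketched in a few lines: parse the raw response and index the records by `id`, validating each dimension against the values seen in the data. This is a minimal sketch; the two embedded records are copied from the response above, and the per-dimension value sets are inferred from the visible sample, not the full codebook.

```python
import json

# Two records copied verbatim from the raw LLM response above.
RAW_RESPONSE = """[
  {"id": "ytr_UgzUXHPpLN4hGUiQ-uZ4AaABAg.AIq67DJoPDVALYFoM5mOw_",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzUXHPpLN4hGUiQ-uZ4AaABAg.AIq67DJoPDVALtLFtxjDtM",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "mixed"}
]"""

# Allowed values per dimension, inferred from the sample output
# (an assumption; the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "fear", "mixed"},
}

def index_by_id(raw: str) -> dict:
    """Parse the raw response and index validated records by comment ID."""
    index = {}
    for rec in json.loads(raw):
        for dim, values in ALLOWED.items():
            if rec.get(dim) not in values:
                raise ValueError(f"unexpected {dim!r} value in {rec['id']}")
        index[rec["id"]] = rec
    return index

coded = index_by_id(RAW_RESPONSE)
rec = coded["ytr_UgzUXHPpLN4hGUiQ-uZ4AaABAg.AIq67DJoPDVALYFoM5mOw_"]
print(rec["responsibility"], rec["emotion"])  # company fear
```

Raising on an unexpected value is a design choice: because the coding comes from a model, silently accepting off-schema labels would corrupt downstream tallies, so it is safer to fail loudly at ingestion.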