Raw LLM Responses
Inspect the exact model output for any coded comment.
Responses can be looked up by comment ID; a set of random samples is listed below.
- "Long ago the seven nations lived together in harmony. Then, everything changed w…" — ytc_UgzabKm5u…
- "If AI decides to wipe out humans, then there is no need for AI anymore....so the…" — ytc_Ugx8gUK1L…
- ">>After that, you may agree to the AI or override it's decision. So you ar…" — rdc_gd7gb4h
- "My teachers were truly bizarre human beings, they couldn't teach ethics. AI woul…" — ytr_UgwK9suZP…
- "All these Ai developers are very dangerous people and need to be stopped, this g…" — ytc_UgzrAthhz…
- "@HydraBeaVTnah, AI is free and only thing I need drawn is illustrations for my …" — ytr_UgyS4CkjY…
- "But if they did make one you can technically use it and distribute it cause AI i…" — ytc_Ugw5KiX8j…
- "There's a well-known story of the human on the operator who, on his own, prevent…" — ytc_UgwSWH9eM…
Comment
In December 2015, OpenAI was founded as a not for profit organization by Sam Altman, Elon Musk, Ilya Sutskever, Greg Brockman, Trevor Blackwell, Vicki Cheung, Andrej Karpathy, Durk Kingma, John Schulman, Pamela Vagata, and Wojciech Zaremba, with Sam Altman and Elon Musk as the co-chairs. A total of $1 billion in capital was pledged by Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman,
Source: youtube · 2025-11-23T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugy5LpqYocy-PR5pDIl4AaABAg.APr0B-13-VgAPrdIRSRZhr","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwZI8AJAJcMV756s5R4AaABAg.APqyLANJbFlAPs1aAiRB23","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugyb4BT7i0u4-0orVsZ4AaABAg.APqxtfu9_ioAPs8lkfrptz","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgwVS072D6qypRedA-h4AaABAg.APqx1bY8hL3APr1qBOMjA-","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytr_UgwD_B6lgWt6yUvsuQZ4AaABAg.APquC53NeYbAPrvOzD-DEb","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwdzGVKkxry1XoJSJx4AaABAg.APqtvuNNOvKAPrIeZunf3C","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgyWZV4J9y7_8z6AQP54AaABAg.APqtZlUP4ANAPs1MC1_pRK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugx-j3UQyQL3NNUTF-l4AaABAg.APqtN_sWHsRAPqtdjKTDyF","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugx-j3UQyQL3NNUTF-l4AaABAg.APqtN_sWHsRAPqtu74V7K-","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugx-j3UQyQL3NNUTF-l4AaABAg.APqtN_sWHsRAPqtxkoL0xT","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
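A raw response like the one above is a JSON array of per-comment codes, so looking up a single coded comment amounts to parsing the array and indexing it by `id`. A minimal sketch (the abbreviated two-entry array and the lookup helper are illustrative, not part of the tool itself):

```python
import json

# Raw LLM response: a JSON array of objects, each carrying the comment ID
# and the four coded dimensions (abbreviated to two entries for illustration).
raw_response = """
[
  {"id": "ytr_Ugx-j3UQyQL3NNUTF-l4AaABAg.APqtN_sWHsRAPqtdjKTDyF",
   "responsibility": "unclear", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwdzGVKkxry1XoJSJx4AaABAg.APqtvuNNOvKAPrIeZunf3C",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]
"""

# Index the codes by comment ID so any single comment can be looked up.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytr_UgwdzGVKkxry1XoJSJx4AaABAg.APqtvuNNOvKAPrIeZunf3C"]
print(code["responsibility"], code["policy"])  # company ban
```

The same dict can be merged across many raw responses to reconstruct the full coding table from the model outputs alone.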