Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any coded comment by its ID, or inspect one of the random samples below.
- `ytc_UgyV06-C2…`: "This feels like an argument you'd get into with an ex. Chatgpt is such a gasligh…"
- `ytc_Ugy5G2Qo9…`: "Not to mention that AI art can kill people if they are using some AI-art-generat…"
- `ytc_UgwmVpCTK…`: "I agree with 99% of the stuff Charlie talks about. But this time he's just wrong…"
- `ytr_Ugz8MDSNM…`: "@Deep_Side_Sleep sure but praising policies without being specific is like want…"
- `ytr_Ugx9IKo7P…`: "LMFAO that's because you bring nothing to the table. You suck. Jus tbe honest. E…"
- `ytc_UgwtAB3a2…`: "An AI does nothing until you ask it a question. There is no processing going on …"
- `ytc_UgxHZ_HgR…`: "The AI LLM provides a link to the sources of the information it produces, and th…"
- `ytr_Ugw3Z8-Q7…`: "He is the co-founder of OpenAI, he left exactly because of the concern on AI sin…"
Comment

> He is so NOT concerned about safety and this is an old video before he was fired then rehired and the Open AI board who were concerned (operating under Nonprofit status) was replaced and now the majority of founders/ Exec team have departed due to these concerns not being taken seriously enough nor the boards new plans to end their operations as an nonprofit and pursue full monopolization of AI tech for Microsoft’s own interests and goals to dominate the market at expense of humanity

Platform: youtube · Topic: AI Governance · Posted: 2024-10-16T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
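A coded record like the one above can be sanity-checked against the codebook before it is stored. This is a minimal sketch; the category sets below are inferred only from the values visible on this page, and the full codebook may define more (an assumption, not the tool's actual schema):

```python
# One coded record, matching the table above.
coded = {
    "responsibility": "company",
    "reasoning": "deontological",
    "policy": "liability",
    "emotion": "outrage",
}

# Allowed categories per dimension, inferred from the labels visible on
# this page -- the real codebook may include additional values (assumption).
CATEGORIES = {
    "responsibility": {"company", "government", "developer", "user",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none"},
    "emotion": {"outrage", "fear", "mixed", "resignation", "indifference"},
}

# Collect any dimension whose value is not a known category.
invalid = {dim: val for dim, val in coded.items()
           if val not in CATEGORIES.get(dim, set())}
assert not invalid, f"unknown codes: {invalid}"
```

A check like this catches model outputs that drift outside the codebook (misspelled or invented labels) before they contaminate downstream counts.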
Raw LLM Response
```json
[
{"id":"ytc_UgxIl_-3hBIBFkGfXSN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwlu7Xh6Zoj1AlHFit4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwFLkjFWDVrf1XxfDB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwRVtuphbt4guTvLrx4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxKEaMltW6dMwtEmvh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyy2zFIDv9-O12zcKF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzYdGLoFOgbdNM_1M94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwCuW3fwZm-TLpRkQd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyHKpyW7ZhwFBbmK5V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxoijAO6nMRHRs2Bbp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
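Because the raw response is a JSON array keyed by comment ID, the lookup-by-ID feature described at the top of this page can be sketched as a small parse-and-index step. This is an illustrative sketch, not the tool's actual implementation; the response is truncated to two rows here, with IDs and labels copied from the batch above:

```python
import json

# Raw model output for a coding batch (truncated to two rows for brevity;
# the IDs and labels come from the response shown above).
raw_response = '''
[
 {"id":"ytc_UgxIl_-3hBIBFkGfXSN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugwlu7Xh6Zoj1AlHFit4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
'''

# Parse the batch and index it by comment ID so the coding for any single
# comment can be retrieved directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxIl_-3hBIBFkGfXSN4AaABAg"]
print(coding["responsibility"])  # company
print(coding["emotion"])         # outrage
```

Indexing by ID also makes it easy to detect duplicates or missing comments when the model's batch response is joined back to the original sample.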