Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
watch "eliezer yudkowsky : artificial intelligence and the end of humanity" with…
ytc_UgxvBmwYT…
If our world continues on this path.....we had better educate all on Ai. From pu…
ytc_Ugx5QHQSj…
AI bros have been telling us AI would replace us all within 6 months, for the pa…
ytc_Ugx28PB0z…
Private companies, publicly traded or not, have a legal responsibility (above a…
ytc_UgzcZEpd0…
Soon all sides will have ai... They will have an understanding and not fight wit…
ytc_UgzLF_xHT…
Wouldn't a really smart AI that wants to fool you just fail the touring test on …
ytc_Ugxi2s1AU…
I’ve got Terminator tattoos and say thank you to ChatGPT so I think I’ll be ok 😂…
ytc_Ugwk16MkF…
Here is the problem with our societies. They all follow a simple rule: work so t…
ytc_UgzRyVQ_L…
Comment
Anyone really believe that any of these companies has actually 'paused' their development? Not a chance. I read in an interview a bit ago that a few of these CEO's are pushing this stuff HARD to make sure their AI product gets to market or advances faster that the other companies. So, for sure they're pushing fast and furious past whatever safety protocols are in place, if any. And what makes anyone think those CEO's give 2 shits about anything other than their revenue? I highly doubt they'd pull back even if it meant a large part of the human race were wiped out. I'm not anti capitalist, nor anti corporation. At the same time, to think those in charge won't cross the line would be ignorant.
youtube
AI Governance
2023-07-08T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugyg4zqTiw8PaPL3mEx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxG-VZ_xhUI3cxJEHh4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyQsg4vMYrUkxRv8kl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxsrqzPbr70esJaHKJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxh-lcDWZFm7pg_9Kl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwP5oS_hHzD9msXdTF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwvg89ACGb91EqF37F4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxo0uqkD1YYavah3NB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugyne3VaaeVfFfb46cd4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyW_3QsBCfLlQnd3T94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
```
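The look-up-by-comment-ID flow above can be sketched as follows — a minimal sketch, assuming the raw LLM response is a JSON array of records each carrying an `id` field, as in the batch shown. The function name and the `ytc_AAA`/`ytc_BBB` IDs are illustrative, not from the tool itself.

```python
import json

def index_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded records)
    into a lookup table keyed by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

# Two hypothetical records shaped like the batch above
raw = '''[
  {"id": "ytc_AAA", "responsibility": "company", "reasoning": "virtue",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_BBB", "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "approval"}
]'''

by_id = index_raw_response(raw)
print(by_id["ytc_AAA"]["emotion"])  # -> outrage
```

Because each comment ID appears in exactly one batch response, building this index once per batch makes inspecting any single coded comment an O(1) dictionary lookup.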