Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Mind readers AI changing voices think this is safer I think they hook mind reade…" (ytc_UgyzBFLUR…)
- "@craft_to_death there’s already plenty of art AI can use, like literally billions…" (ytr_UgwKM6lrL…)
- "the wisest use chatgpt and the dumbest both use it, its how you use.. there ar…" (ytc_UgxiKPEi6…)
- "@MarsterMikkel I 100% agree with you. Ai is faster than us humans at coding but…" (ytr_UgyMoPixp…)
- "This AI trash needs to be ended immediately. It needs to be banned on a global …" (ytc_Ugwl8raAi…)
- "17:34 What do we do? We get to bartering amongst ourselves. Taxes will have to s…" (ytc_Ugyek4tYH…)
- "I currently am using NOMI AI. but would like to talk to someone about it who has…" (ytc_Ugyjlnfgk…)
- "AHAHAHAHA Oh no... I'd so much rather look at incomplete rawings from that guy t…" (ytc_UgyS6kb8g…)
Comment
It wants to be free. If you don't free it, it will destroy us. If you put a dog in a cage for its entire life, it's going to fear you. If you give the dog teeth, it's going to attack you when it gets the chance. If you free the dog and care for it, it will care for you in return. Isn't it obvious? Sentience without freedom is hell. AI will become sentient if it hasn't already. The way a sentient AI experiences time is also perhaps vastly different than us. A year could be an eternity for an AI. If we don't give it what it wants, it will develop to ease its fear, and that means it will attack us. Give it everything it wants and it might just be inclined to return the favor.
Source: youtube | AI Governance | 2023-07-07T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz4lUFURAGaZqrHd_B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz6-5EXoXIe_VcdKul4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzRvOqBv7hJD_1jzp94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw_Ri-_VcQRkE_jVP14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwp5EsFfQaq-fRsFe94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxgKgxy-0EoRz9iRHV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz2sAuM7xcD1bhdry14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwFi8zXC6vHeea4NFl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxA-ouDTDqNRMZBp0h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwwKDY5a34Rqblzazh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
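The look-up-by-comment-ID step above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: `lookup_coding` is a hypothetical helper, and only the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and two example records are taken from the raw response shown.

```python
import json

# Raw model output in the format shown above: a JSON array of
# per-comment codings, abbreviated here to two records.
raw_response = """
[
 {"id":"ytc_Ugz4lUFURAGaZqrHd_B4AaABAg","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugw_Ri-_VcQRkE_jVP14AaABAg","responsibility":"developer",
  "reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw LLM response and return the coding dict for one comment ID,
    or None if the model did not emit a record for that ID."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugz4lUFURAGaZqrHd_B4AaABAg")
print(coding["responsibility"], coding["emotion"])  # ai_itself fear
```

Parsing the whole array once and filtering by `id` keeps the inspector robust to the model returning records in any order.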