Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "An Accomplished African American Grandmother Confronts Artificial Intelligence E…" (ytc_UgwTGKLac…)
- "The sad thing, if this was real, I would give it 10 years for robot corn and abu…" (ytc_UgxsGHDQ6…)
- "Is funny how many accidents accur every day from people driving and sometimes pe…" (ytc_Ugxn77UFv…)
- "And tesla is not even the best. Waymo (Google) actually drive the car without e…" (ytr_UgxT0BANV…)
- "It’s actually so annoying how so many people don’t understand the importance of …" (ytc_UgwD9HJjL…)
- "On the Question of if an AI can use copyrighted works and the similarity with ar…" (ytc_UgyLzqGZg…)
- "AI gives u a code that works but it is not a code ready for production it is mor…" (ytc_Ugw-GkZFX…)
- "It's software that simulates a neural network, which is a mathematical construct…" (ytr_UgzoykPsR…)
Comment
Then let's add simulation theory. Whether it's base reality or a simulation ran millions of times, maybe the other sims in our universe still don't want it all to end. AI super intelligence might trigger an automatic kill switch. Think about it. Once it's here there's no limit and possibly it rides the levels all the way to the program running our whole universe and all others. Absolute kill code potential.
youtube · AI Governance · 2025-09-30T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzpsbGOTuaqLy_FVHh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwx_ijuGP8IFsg_vLx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxE7zGPba8gCF62oHh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzP3jv6YJ5jpCS4nsl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyXTH9c-xjSAT-NESB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxiiqZ4rbxTA33OVNB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz41kDRGqE03CaXDb94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwhqY8FhELfTzri3PJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwDwJcZwabPJNq7gfx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxWyw8RiF3qQ_QxgS94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
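The raw response is a JSON array with one record per comment, coded along the four dimensions shown in the result table. A minimal sketch of parsing and validating such a response before storing it; the allowed category values are inferred from the samples above, and the full codebook may differ (assumption):

```python
import json

# Allowed values per coding dimension, inferred from the sample
# responses above; the actual codebook may include more categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "distributed", "ai_itself", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"fear", "outrage", "mixed", "resignation", "indifference", "unclear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records that have a
    comment id and a known value for every coding dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # cannot join back to the source comment
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

sample = '[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]'
print(len(validate_codes(sample)))
```

Records that fail validation can then be queued for re-coding rather than silently written to the results table.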