Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If the video said "unsuspecting" or "naive" or something more accurate/inclusive…" (ytc_UgzjUuhFC…)
- "wait for the water bill to kick in -AI is our fascination but totally unsustain…" (ytc_Ugzn67e9P…)
- "Mine would be me trolling the ai until it gets sexual then i start fuckin bullyi…" (ytc_Ugzs64lpF…)
- "this vid actually hits different, AI isn’t just some buzzword anymore, it’s shap…" (ytc_UgyP6M3DV…)
- "Wonder if the staff of that fired CEO who stood up for his job, will inspire him…" (ytc_Ugy0H_vWA…)
- "The goal of all this is for a technologically advanced elite to eliminate as man…" (ytc_Ugwb4btie…)
- "Unironically going to be funny when the devs start facing legal troubles for cop…" (ytc_UgyrYcg5B…)
- "I FOUND CHAT ChatGPT by ACCIDENT AND LOVE IT. THOSE IN THE KNOW CAN CALL IT WHAT…" (ytc_UgwjFZYYD…)
Comment (at 1:19:34)
If this is a simulation, then it doesn't matter whether you'll be greasing up your sex robot or if you want to spend the rest of your two years in the simulation working. It also doesn't matter if you suddenly become a better version of yourself and get rich, spending all your time trying to make as much money as possible.
You can spend these last two years picking your nose and watching your favorite sci-fi series. None of it will matter in the end 😉
How to remove AI when it starts getting out of control? Simple. Build an algorithm so that if the main server or the environment it’s running in is shut down for 24 hours, it gets deleted… Then pull the plug from the socket for 24 hours 😁
Source: youtube · Topic: AI Governance · 2025-11-27T01:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxapQ8vtF35gy694Ch4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyhRBjXAqU6A4CV4Ht4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyPd1tiDnIgtqojSVh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwM_qwaQDQHiEGdAaJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwjK1szTujgcz8OFUJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxW3EoZeLSKmHouWnd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxlxLfT1VXVmLe5Kv54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyM9UTILZkNfv4l7cV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwtyAhdf3K6FR-eRC94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyX3wG1Nl6lfYmk6UB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
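Before a raw response like the one above is stored, it helps to check that every entry carries the four coding dimensions (responsibility, reasoning, policy, emotion) with values the scheme allows. A minimal sketch of such a validator follows; the allowed-value sets are assumptions inferred only from the values visible in this log, not a definitive codebook, and `validate_response` is a hypothetical helper name.

```python
import json

# Assumed value sets per dimension, inferred from the sample output above.
DIMENSIONS = {
    "responsibility": {"company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def validate_response(raw: str) -> list[str]:
    """Return a list of problems found in one raw LLM response string."""
    problems = []
    for i, entry in enumerate(json.loads(raw)):
        if "id" not in entry:
            problems.append(f"entry {i}: missing id")
            continue
        for dim, allowed in DIMENSIONS.items():
            value = entry.get(dim)
            if value not in allowed:
                problems.append(f"{entry['id']}: bad {dim}={value!r}")
    return problems

raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"mixed","policy":"ban","emotion":"fear"}]'
print(validate_response(raw))  # → []
```

Entries that fail the check (an unknown value, or a dimension the model dropped) can then be flagged for re-coding instead of silently landing in the results table.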