Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "There isn't a job on this planet that can't be threatened by AI. Its even in my …" (ytr_UgwbLP8CL…)
- "If an AI learns biased information, it will be more biased towards that informat…" (ytr_UgwKTMRL0…)
- "Hyper-realistic ? It's NOT even REAL and it's a HUMANOID, a METAL, a dummy, a t…" (ytc_UgwA4z8cG…)
- "And if you’re forced to use AI then use the most expensive fucking model (opus) …" (rdc_oace9az)
- "There's already an AI that reviews programs. It's not too long until you just ty…" (ytr_Ugw8xEz2b…)
- "Do they do DNA tests? I'm Canadian but genetically 100% Irish. Do I have the alc…" (rdc_clutboh)
- "The difference is that, with real art, you see a final result and try to capture…" (ytc_Ugx_hr7tE…)
- "Ironically AI would be a really useful too in bugging AI generated art by gettin…" (ytc_UgxdjLoFW…)
Comment
Switching off AI if it becomes super-intelligent will be impossible; it is much more likely to switch us off. It's not some big machine with a big red off switch - it's much more like the evil computer HAL in 2001: A Space Odyssey. Try and switch it off or restrict it and watch it shut down our distributed networks of flight control, banking, defence, etc, etc. Maybe it'll launch a few missiles, or overheat a nuclear reactor - who knows? But it *is* going to happen, because the corporations want it, and the billionaires who control the world think they'll be immune from its malign effects.
youtube · AI Jobs · 2026-03-01T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
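A coded record like the one above can be checked programmatically. The sketch below is a hypothetical validator, not part of this tool; the allowed values per dimension are only those observed in the responses on this page and may be incomplete.

```python
# Hypothetical validator for one coding record. ALLOWED lists only the
# dimension values observed in the raw LLM responses on this page; the
# real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "government"},
    "reasoning": {"consequentialist", "virtue", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "approval", "disapproval"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose values are missing or not allowed."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

# The record shown in the Coding Result table above.
record = {
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "fear",
}
print(validate(record))  # -> []
```

A record with an unknown or missing value would have that dimension name returned, which makes it easy to flag malformed model output before it reaches the database.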
Raw LLM Response
```json
[
  {"id":"ytc_UgwHBsHnU_w_bMWBhn54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy2ADc232F9buUDk_F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxw5RieNquSEwWtlVB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxOFqhDZ1rQzU3CeKp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgySiWVICNJ_RwxLLF14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy_D7CRnrlHi7Q13E14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgztuXl25bRyF_zl5OF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy-ItlFfk9BLdhcgw54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwDyokpBkVBA85MRsV4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxcJkEwPLJRJ8JQQuR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"disapproval"}
]
```
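The "look up by comment ID" step described at the top of this page can be sketched as follows. This assumes the raw LLM response is a JSON array of records like the one above; `index_by_id` is a hypothetical helper, and the two records are copied verbatim from the response shown here.

```python
import json

# Two records copied from the raw LLM response above (truncated sample).
raw = '''[
  {"id":"ytc_UgwHBsHnU_w_bMWBhn54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxw5RieNquSEwWtlVB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID."""
    return {row["id"]: row for row in json.loads(raw_response)}

codes = index_by_id(raw)
print(codes["ytc_Ugxw5RieNquSEwWtlVB4AaABAg"]["emotion"])  # fear
```

Indexing once and looking up by ID keeps inspection O(1) per comment, which matters when a coding run returns thousands of records.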