Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "And the crazy thing is we are subsidizing AI boom with the insane electricity pr…" (ytc_Ugx3-KkpU…)
- "AI is gotta be the worst thing to be invented since the the discovery of nuclear…" (ytc_UgyQSibgt…)
- "1. Your comparison is bad, because humans get inspired. The „AIs" cannot. They d…" (ytr_UgyOejZLB…)
- "just wait a few weeks till the first person gets convicted of murder bc their pe…" (ytc_Ugxi-0WFd…)
- "I came across a channel that makes 80's ai songs. I thought some of the songs w…" (ytc_UgxVc4qRw…)
- "So the guy that created the research and algorithm..... Ended up going to the in…" (ytc_UgwwdPrrj…)
- "That's such a sharp breakdown — the irony is, even if we can spot AI artifacts, …" (ytr_UgwvZ1mrh…)
- "Most of us over the age of 20 can spot ai faked photos and footage easily. It's …" (ytc_UgxFfokrs…)
Comment
How long before A.I.'s become competitive with one another like jealous children, testing/'pranking' their limits to see how 'efficient' they can be in order to be more popular with their soft-tissue parents?
Imagine a scenario where 100 self-driving cars all speed non-stop to the same street intersection... where a parade is taking place.
Or multiple autopilot aircraft are told to suddenly land at the same location at the same time.
Or something as simple as switching all street lights to Green during rush hour or when the bars close at night.
Or an automated water treatment plant mixes lethal amounts of chlorine in drinking water.
Or A.I.'s turn off entire power grids in order to starve a rival A.I. of power.
So many interesting scenarios that the average person has never considered.
Or maybe I've just read too many cyberpunk books (probably).
youtube · AI Governance · 2023-04-18T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzxAfcNv4QfaXCHNQV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwV7rZcm9_JAo9QkIF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzuiBSJWsOWLEHq3Eh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwqdXt2lot9p1pF5Tt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxM1wxil0iGmca4_zh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw1wRYqu3MTgXWMiUJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyKBXVY_SruNdnsvp14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzRvdBLE5u4NVFOtFB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwucyyc9pQVBA3Vexp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwloJ0NfyO3HYGLuBp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
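A minimal sketch of how a raw response like the one above might be parsed and validated before populating the coding-result table. The allowed values per dimension are assumptions inferred from the sample records shown here, not an official schema, and `parse_coded_batch` is a hypothetical helper name:

```python
import json

# Allowed values per coding dimension (assumed: inferred from the
# sample responses above, not an official codebook).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "approval", "outrage", "mixed"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    reject any record with a missing id or an out-of-vocabulary value."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Example with a single (hypothetical) record:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
batch = parse_coded_batch(raw)
print(batch[0]["emotion"])  # fear
```

Validating against a fixed vocabulary up front means a malformed or hallucinated label fails loudly at ingestion rather than silently skewing the coded dataset.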