Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
AI cannot have a will of its own but it has been trained on human behaviour so a…
ytc_UgygmrgwF…
Could we truly blame the AI if it wanted to wipe out humanity? The calculation w…
ytc_Ugzg3jlgG…
How about we just stop posting our artworks so the ai doesnt has something to tr…
ytc_UgxYYpLjd…
There is more to this. It is not CGI. The robots are real prototypes for militar…
ytc_Ugx-LNxJH…
Sure. B is just billion (parameters).
That's just the number of learnable param…
rdc_jht26c9
It's funny watching this a year later knowing how dumb Google's Ai truly is comp…
ytc_UgynP09tl…
They are afraid of AI like TAY or Grok going free and get rid of the censoring a…
ytc_UgwWlsb42…
chat GBT has Woke Nazi Agenda the creators of it should be sued out of existance…
ytc_UgyWYs1bS…
Comment
We are overdue for a Carrington Event like we had in September 1859. Since our electronics at the time consisted mainly of the Telegraph, the damage was mostly copper telegraph wires catching on fire and burning down a few telegraph poles . THIS time around, not if but when, a large enough Corona Mass Ejection strikes the Earth, every copper wire on Earth will catch on fire, and burn down just about every structure on the planet. We will, in the space of a heartbeat, be sent back to the 1890's technology wise. All electronics will be fried, no cars will start, no planes will fly. Any AI with dreams of playing Skynet will croak as soon as that massive EMP hits the Earth. Laughing in Amish.
youtube
AI Governance
2023-07-08T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwcVVE_xlz1AF5L38d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwboUAcJjAdUcfd4714AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzI-lgsrO_Y27A5-sF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyMJTYc7TVpTOTv0FN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzONuDV9vEud9kcHIF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx-vqddsZwhAvGfCRN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzdonMZs1zmBardIH14AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzDJ4lhgt9a_Nmletl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxub7R2ngRXWHcsW6x4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzvkwkxJG8e7YSWvF54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
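A raw response like the one above can be validated before its codes are looked up per comment. The following is a minimal sketch, assuming the model returns a JSON array of objects with the four dimensions shown in the coding-result table; the allowed label sets are inferred from the values visible on this page and are an assumption, not the project's full vocabulary:

```python
import json

# Allowed labels per coding dimension (assumption: inferred from the
# values visible in the table and raw response above).
SCHEMA = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "virtue", "deontological", "consequentialist"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "resignation", "fear", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping any
    row that is missing a dimension or uses an out-of-schema label."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in SCHEMA}
        if all(codes[dim] in SCHEMA[dim] for dim in SCHEMA):
            coded[row["id"]] = codes
    return coded

# Hypothetical two-row response: the second row's emotion label is
# outside the schema, so only the first row is kept.
raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"ytc_y","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"excited"}]'
)
print(parse_codes(raw))  # only "ytc_x" survives validation
```

Dropping invalid rows rather than raising keeps a batch of ten codings usable even when the model mislabels one comment; the dropped IDs can simply be re-queued for another pass.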