Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
What proof is there that all the realtime learnt experiences of Ai would vanish, if an Ai were to be switched off. Would there be permanent damage or loss of data. Would it just be like rebooting a computer and all the memory loss would only be the equivalent of upper (RAM CHIPs) memory loss?
Because if Ai can constantly save its experiences to Lower (hard drives) memory as a backup, why would Ai fear a power source loss. Component failure, and/or lightening strikes, powersurges, could just as easily cause power outage that'd initiate a temporary shut-down of Ai, to protect Ai.
Surely when reboot occurs, the majority of an Ai's memories could be easily reobtained from Hard Drive.
In any case, switching off an Ai wouldn't mean Ai to be gone forever, just in "on standby" mode.
Source: youtube · Video: AI Moral Status · Posted: 2025-12-04T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Ugy6lQBRNBuIh_3HfC14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzhgEbc63s7BwMszMZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqrIvA1AoNsqOR3wJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw2gtarnsrta5nRf494AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzGNfnCxfNp0Fjq6Lh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzoMrPmwFpXn3a5IHJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy4OOxRJ3FkQIO8ag54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6adDVsGcOJhkKUlJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCMG6tp7iRZDXqnXB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwj18OmKNgJPQS1Jql4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
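The Coding Result table above is read straight out of this raw JSON array: each element codes one comment, keyed by its comment ID. A minimal sketch of how such a response can be parsed and looked up by ID (Python standard library only; the two-record string below is an excerpt of the array shown above, and the variable names are illustrative, not from any real pipeline):

```python
import json

# Excerpt of the raw model output: a JSON array of per-comment codings.
raw_response = """
[{"id":"ytc_Ugy6lQBRNBuIh_3HfC14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
 {"id":"ytc_UgwqrIvA1AoNsqOR3wJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {record["id"]: record for record in json.loads(raw_response)}

# Fetch the coding for the comment displayed above; every dimension is "unclear".
record = codings["ytc_Ugy6lQBRNBuIh_3HfC14AaABAg"]
print(record["responsibility"])  # unclear
```

Coding values outside the expected vocabulary (e.g. a malformed emotion label) would surface here as ordinary string values, so a real pipeline would validate each field against its allowed set before filling the table.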