Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Turning off an AI which I'm assuming would be conscious would be just like killing and here's why:
Assuming that it's conscious, if you unplug it, it would most certainly mean that the AI would have to be rebooted back on which also means, logically, when the AI is on, it's consciousness would be stored on it's RAM since it's the fastest memory we know of and it would be the logical thing the engineers would, while the AI is on, it would probably make backups on it's hard drives, now if you turn it off, it would mean that you're killing it and once you turn it back on, it might hold the integrity of the data but it would just be a copy of the original AI, just like the concept of transcendence or teleportation
Source: youtube · AI Moral Status · 2017-02-26T15:2… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UggCSf-D4yNp_HgCoAEC.8PQhxIscddF8PVUKCi7irJ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgjvrNJmcGrom3gCoAEC.8PQgdjoChUT8P_Szz3s4oh","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgjvrNJmcGrom3gCoAEC.8PQgdjoChUT8PeuwYIDb9g","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ughnbqf7FdWNLXgCoAEC.8PQW6ihZpL08PSOiNeu01o","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugh_lhwycoYkl3gCoAEC.8PPsd-c90B18PSLd_ACNa2","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugh_lhwycoYkl3gCoAEC.8PPsd-c90B18PUqrE1M6ye","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgilZ3tcw9JQs3gCoAEC.8PPh6kzxpZ_8PSih5jKMi9","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgiERnybnLx8aXgCoAEC.8PPXxchMEgY8PTgXQTPHCm","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgiERnybnLx8aXgCoAEC.8PPXxchMEgY8PWvWZHgmdQ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugh-J8uaNjnVF3gCoAEC.8PPWNn-8_xD8PSDGkIVNjR","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
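A response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal, hypothetical validator: it parses the JSON array and verifies that every record carries an `id` plus the four coding dimensions. The allowed value sets are inferred only from the values visible in this dump; the actual codebook may define more categories.

```python
import json

# Allowed values per dimension, inferred from this dump (assumption:
# the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "mixed", "resignation", "outrage", "fear"},
}

def validate_response(raw: str) -> list:
    """Return a list of problems found in a raw LLM batch response."""
    problems = []
    for i, rec in enumerate(json.loads(raw)):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"record {i}: {dim}={value!r} not in codebook")
    return problems

# A well-formed record (hypothetical id) produces no problems.
raw = ('[{"id":"ytr_x","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"indifference"}]')
print(validate_response(raw))  # -> []
```

Running the validator over each batch makes it easy to spot records where the model drifted outside the codebook, rather than silently coercing them during analysis.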