Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The only thing me or my friends have ever used it for was to create those really…" (`ytc_Ugy70-9gs…`)
- "They don't care about the negative consequences for the environment, if they kno…" (`ytr_UgzHi4c65…`)
- "Yall remember that the terminator movie resistance wanted to cease the AI comput…" (`ytc_Ugyc60mdg…`)
- "sadly, artificial life cannot exist. The reason is simply because: AI is AI. It …" (`ytc_UgyCKUlFa…`)
- "\"Its also worse for the environment\"….. Ok, you need to explain that. How is A…" (`ytc_UgytwM6F8…`)
- "I understand using AI casually or for a personal project, but I just don't like …" (`ytr_UgwNabBhT…`)
- "Im about to go to restaurants and grab up their throw away chicken bones etc and…" (`rdc_dv62nda`)
- "the only regulation that will make sense for ai, if manufacturing plants replace…" (`ytc_Ugxw02dGH…`)
Comment
The question then is whether consciousness is something that is purely algorithmic or whether it requires a specific physical implementation. If the former is true, then in principle it does not matter whether you build your brain emulator from gears, water pipes or logic gates. I think you do require a specific physical implementation, though, and if we presume that consciousness is substrate-neutral then a lot of problems will follow. It still means it is in principle replicable, but not with current AI architectures.
youtube
AI Moral Status
2025-07-04T13:1…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytr_UgxyARTCIsg81omiiPF4AaABAg.AKSDx2qv8LyAKX_qIv3868", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgxLWJ3uONAvavI13Op4AaABAg.AK8mkGAe3-tAK9Ud574pbQ", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxUWFqVah2FJzgdBrN4AaABAg.AK-Uu-RQe_HALHId0NY7Tv", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxUWFqVah2FJzgdBrN4AaABAg.AK-Uu-RQe_HALHlLQRrUkO", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugw4_0jrPEWzN0wLAtx4AaABAg.AJlDdknxjp5AJmZ5bSqo1j", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwjoXODDUBCclk1VmR4AaABAg.AJOgLrBhuc_AKxW9Q7n2nC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_UgwjoXODDUBCclk1VmR4AaABAg.AJOgLrBhuc_AL956VxR3ra", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxJLdqC8G7J4boEzZt4AaABAg.AJKHYm_IJ8sAJMogj2FkD-", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgxJLdqC8G7J4boEzZt4AaABAg.AJKHYm_IJ8sALP1iuRubVZ", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgwoK-5RApA_9dHm4Nh4AaABAg.AJDGnm4jn6-AJJ9fSwAI-M", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
```
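The raw response is a JSON array with one record per coded comment, carrying the four dimensions shown in the coding table. A minimal sketch of how such a response could be parsed and checked before storing the codes; the allowed value sets below are inferred only from the codes visible on this page, not from the full codebook, and `validate_response` is a hypothetical helper, not part of the actual pipeline:

```python
import json

# Allowed values per dimension, inferred from the visible output.
# The real codebook likely defines additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {value!r}")
    return records

# Usage with a hypothetical record (the ID is made up for illustration):
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
records = validate_response(raw)
print(len(records), records[0]["reasoning"])  # → 1 consequentialist
```

Validating against a closed value set catches the most common failure mode of coding with an LLM: the model inventing a label that is not in the scheme.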