Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing the random samples below.
Random samples (click one to inspect):

- `ytr_UgyHE-3wg…`: @Owiecowie you use ai to create something from your imagination, art itself need…
- `ytc_UgzWyYlup…`: The savings from AI won’t be coming to regular working people in the form of mor…
- `ytc_UgxpZl5Wa…`: If all they want is to be human and mortal then weve won that game, what do we n…
- `ytc_UgzDdzmcO…`: His comment - "Very FEW people are making decisions" - regarding the creation of…
- `ytr_Ugzz4HwLV…`: Nope. What really drives AI and psychology is absolutely disgusting role model o…
- `ytc_UgxwBtfI2…`: Why do all these AI robots have such creepy faces? The faces of Living Dolls are…
- `ytc_UgxGWTNY8…`: As a disabled artist disrespectfully those saying AI art is "inclusive" can fuck…
- `ytc_UgyYh0sPG…`: >being cruel to AI lmao That's literally every time they interact with AI. They …
Comment (youtube · "AI Moral Status" · 2026-04-18T01:5…):

> Really good conversation here, Bostrom and Greene. I noticed that there was a component about uploading brains into a silicon substrating, this is not physically possible. Mapping a human connectome would require 1 zetaflop of processing speed. In the future these power requirements might come down, but you won't get more efficient than the actual representation. You mentioned a lack of constraints and there are 2 I'm going to point out here. What is the point of representing someone's brain in hardware when the physical system is going to be up to a data centre at current capabilities for a single individual? The second, and this is really important, we aren't getting past natural constraints, ever. We currently have a 400-500 year supply of helium. That is required for sub 7nm scale architecture that runs AI today. There are very real, real world constraints on these systems.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response:

```json
[
  {"id":"ytc_UgyJECAjA4cKDle7yi14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzTB2lRtltj9JIchvJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxE7oq5bwGBAJKL0hl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgybLro555urp6mFjXZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx7hPb9NyZLEabEWFV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgymoSTSivP3Yvqzuwp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzxGV09VIdCeoM_S1t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz0zpDzqxoW3D6bxVV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwV3kI42R6NKOwtDJV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgymQV48gf6405jxcah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```