Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "the AI only will create similar art from one artist only if you indicate it to d…" (ytc_UgwkUR5DP…)
- "Oh, my goodness! I dont like it! It scares me! They kill the Human Race!…" (ytc_UgzI3vAuI…)
- "Cool chatgpt can actually remember order numbers, huge improvement over the cust…" (ytc_UgwpRVRB9…)
- "'A man was subject to the inevitable AI human purge. this is why he's now just a…" (ytc_Ugz0HKlbZ…)
- "It’s really hard to be a woman already. I can’t imagine being a female content c…" (ytc_Ugzb4ct0K…)
- "Why are they moving like their in slow mo? And why is it so…smuggy? Why do the c…" (ytc_Ugzu5Au3M…)
- "@DoctorBones1 yes I want to gatekeep art, i have hard work, i have talent, i mak…" (ytr_UgwdTdr8Y…)
- "I have yet to find anything GPT 5 is good for except AI slop. I'm including Sora…" (ytc_UgyHgyebi…)
Comment
Most comments here suggest Yudkowsky has won in this debate. I, however, see numerous instances of Yudkowsky not even grasping the deeper point Wolfram makes. He's very good at analogies and constructing a self-coherent scene, but he misses critical conclusions/intuitions that have greater gravity. An example is missing the point that once you have a million-fold increase in computation, your "day" becomes mere milliseconds you see in full fidelity and you would no longer even think of space as space, but instead as millions of particle interactions occurring around you. A mere aggregation to a day's worth of events would be similar to hundreds of years of human life.
Wolfram's deeper point is that AI doomers are vastly over-anthropomorphizing what an exponentially grown super intelligent AI's thinking framework will be like.
youtube · AI Governance · 2024-11-19T20:4… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
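The dimensions in the table above take values from a closed coding scheme. A minimal sketch of a record validator in Python, assuming only the value sets actually observed in this page's raw responses (the real scheme may include values not shown here):

```python
# Value sets observed in the raw responses on this page (likely incomplete).
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "developer", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"unclear", "regulate", "none"},
    "emotion": {"indifference", "outrage", "fear", "approval",
                "resignation", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The record shown in the Coding Result table above passes:
record = {"responsibility": "none", "reasoning": "mixed",
          "policy": "unclear", "emotion": "indifference"}
assert validate(record) == []
```

A check like this catches the most common LLM-coding failure mode: the model inventing a label outside the scheme.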
Raw LLM Response
[
{"id":"ytc_Ugx6qIp5l_aI9AfElqp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy8QpWKbxQ9n7H_aAx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx7QO9apJwSwFJJDFd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy_7LlslQK-jD28V2J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy0yNh2c8WMFRwdpqJ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyEHcmpJN4NHiXUwF14AaABAg","responsibility":"user","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugymnp8X3WB_5uO6CpF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwcGbsvaLER3kwY4m54AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx2VcfgXyVMdW6L8P94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw5nG423IJW_SdQh0V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
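Because the raw response is a JSON array of records keyed by comment ID, the "look up by comment ID" feature reduces to parsing and indexing. A sketch, using two records copied from the response above (the index-building code is an illustration, not the tool's actual implementation):

```python
import json

# Two records taken verbatim from the raw LLM response above.
raw_response = '''[
  {"id":"ytc_Ugx6qIp5l_aI9AfElqp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy_7LlslQK-jD28V2J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# Index the coded records by comment ID for constant-time look-up.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

print(codes_by_id["ytc_Ugy_7LlslQK-jD28V2J4AaABAg"]["emotion"])  # fear
```

If the model ever emits duplicate IDs, the dict comprehension silently keeps the last record; a production indexer should detect and report that instead.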