Raw LLM Responses
Inspect the exact model output for any coded comment: look up a response by comment ID, or browse the random samples below.
- "So, if AI takes over all the factories, production plants, and jobs, and there a…" (`ytc_UgwYCBHrV…`)
- "Why will they need the NHS, cars replaced horses, AI is the car, we are the hors…" (`ytr_UgycDjgGI…`)
- "The only way AI can become bad for Humanity is because of the people that progra…" (`ytc_UgxMtgBNh…`)
- "it's not just empathy they need to program into the AI, it's responsibility and …" (`ytc_UgyQAvlDQ…`)
- "Well, their only source material is humans. Humans (not all) are racist and sexi…" (`ytc_UgxvSo4mD…`)
- "In the Bible according to Matthew 24:22 CEV / If God doesn't make the time shorte…" (`ytc_UgzZcrDNm…`)
- "I thought I was depraved but these men are absolutely insane. / I doubt punishment…" (`ytc_Ugyhd51td…`)
- "“Basil I saw your Ai chats..” / “Okay?” / “Why were you pretending to be a boy..?” / “…" (`ytc_UgzULKG0B…`)
Comment

> While I appreciate Neil's "naivete" about AI, I do think he's dangerously underselling it. By definition "AGI" means AI is better at _all_ cognitive tasks than humans. Put that into a sufficiently advanced robot, and you have AI better at physical _and_ mental tasks. The argument that "Oh, just find a different job that AI can't do" is paradoxical to the idea of AGI.

youtube · AI Moral Status · 2025-07-23T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugwgf4ZkMQWWMEcLz3x4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwYAcsLrqW7Vg_vmdJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugx603GjUAM3qrD3yHJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwz6Fx-MdXRThlQN994AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxIWhTHRuUW3ctU9xR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwG3eSutZUjN6ypYjJ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxB3MVoaRy_o-qsmhF4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxP3qZVSCdcdyKpaSR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzC57x_goydwW6u_5R4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugx82YfKmoJUjLJcRPZ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"}
]
```
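Each raw response is a JSON array with one object per coded comment in the batch, carrying the four coded dimensions alongside the comment ID. A minimal sketch (in Python, with a hypothetical helper name and a two-entry sample response) of parsing such a response and looking up codes by comment ID:

```python
import json

# Sample raw LLM response: a JSON array, one object per coded comment.
# IDs and values are taken from the response shown above.
raw_response = '''[
  {"id": "ytc_Ugx603GjUAM3qrD3yHJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwG3eSutZUjN6ypYjJ4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"}
]'''

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw response and index the coded dimensions by comment ID."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugx603GjUAM3qrD3yHJ4AaABAg"]["emotion"])  # fear
```

The ID-keyed index is what makes the "look up by comment ID" view cheap: one parse per batch, then constant-time lookups per comment.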