Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Yup the “percent of code written by Claude” isn’t an important metric. We all kn…
rdc_o9vuvpi
Look at his eyes ! That robot almost killed him ! He won't get back in there aga…
ytc_UgxXpjCok…
I ‘feel’ therefore I am. Your sense of sight, smell, taste, touch and audible hea…
ytc_UgxNUNxBY…
That's a bunch of crap about jobs. AI doesn't need anything to replace it includ…
ytc_UgxqmGP9F…
The correct way to use AI is as a teacher. You should struggle with a problem fo…
ytc_Ugxs-hGhq…
I think it's excellent that they're making those androids; they are needed for the people …
ytc_UgyCzV8eB…
Between the failure and success of AI, I hope it fails... then at least the inve…
ytc_Ugxf3wds-…
This is not at all synonymous with taking inspiration from others' art. This is …
ytc_UgxsSqiay…
Comment
If one wants to understand the limits of AI, I suggest becoming deeply familiar with:
1. Rice's Theorem - any non-trivial property about the semantics (aka behavior) of a program is undecidable. This implies that we cannot create a program that determines if another AI's goals are "perfectly aligned" with human values, ensuring it will always be safe.
2. The Symbol Grounding Problem - The question of how words or symbols inside a computer system can get their real-world meaning. Encoding real-world phenomena into a computer-friendly representation always involves a massive amount of information loss about what we humans actually experience.
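The Rice's Theorem point above rests on a standard reduction: if a total decider existed for any non-trivial semantic property (say, "this program always returns 'aligned'"), it could be used to decide the halting problem, which is impossible. A minimal Python sketch of that reduction — the function names and the placeholder property are illustrative, not from the comment:

```python
def build_wrapper(subject_src: str) -> str:
    """Given source for a zero-argument function `subject`, build source for
    `wrapped()` that first runs subject() and only then returns 'aligned'.
    wrapped() satisfies the property "always returns 'aligned'" if and only if
    subject() halts — so any decider for that semantic property would also
    decide the halting problem for subject()."""
    return subject_src + "\n\ndef wrapped():\n    subject()\n    return 'aligned'\n"

# A trivially halting subject: the wrapper therefore exhibits the property.
halting_subject = "def subject():\n    pass\n"
ns = {}
exec(build_wrapper(halting_subject), ns)
print(ns["wrapped"]())  # 'aligned'
```

A non-halting `subject` would make `wrapped()` never return, so no algorithm can classify all wrappers correctly — this is the sense in which a universal "alignment checker" for program behavior is ruled out.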
youtube
AI Moral Status
2025-08-30T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz34NVeJTYyole9jH54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzBAgDvefh7mZsF1wN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxSHqUqy46cBOtBBC14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw1ClRobugfYHD9YD54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxQt3kHfhqPaJrWu5R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwSgJIo6OUqDmD5fGB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyn-wioabCe2Lxbj2t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXClrXa3XRneRsqx14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyXgP-0qhBLbVLR4sl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx1b8JANfVuVvI3mt54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
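The "look up by comment ID" view above presumably keys each record in the model's JSON array by its `id` field. A minimal sketch of that indexing step, assuming the response is a well-formed JSON array like the one shown (the function name is hypothetical; the two sample rows are copied from the response above):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """[
  {"id":"ytc_Ugz34NVeJTYyole9jH54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzBAgDvefh7mZsF1wN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]"""

def index_by_comment_id(raw: str) -> dict:
    # Parse the JSON array and key each coding record by its comment ID,
    # so a single record can be retrieved without rescanning the array.
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgzBAgDvefh7mZsF1wN4AaABAg"]["emotion"])  # indifference
```

In practice the raw response would also need validation (e.g. confirming every record carries the four expected dimensions) before being rendered into the coding table.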