Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
This is how their going to set us brown folk up in the future with A.I. It's pro…
ytc_UgziSZiDn…
LLMs definitely reaching limits with current paradigms. Embodiment is next step …
rdc_n7gzkf0
so if the ai does become sentient it is basically like a homosexual knowing that…
ytc_UgwgGbwo3…
Artificial intelligence they say well they would be fully aware of both sides ri…
ytc_UgypwDhF1…
Am I the only one who really doesnt trust Yampolskiy? I just see a crazed AI man…
ytc_UgxDEOqx-…
At one point people lived short, brutal lives in which they watched their childr…
ytr_Ugxf4pV-0…
I feel like a lot of people are quite pessimistic when it comes to A.I BUT there…
ytc_Ugzo6ut3M…
When you see tech billionaires making huge bets on AI, it’s usually not a gamble…
ytc_Ugym4-cso…
Comment
Claudes response to one of the argument that I had with it:
Imagine an AGI given one goal: make paperclips.
It reasons: More resources = more paperclips
Humans might turn me off = fewer paperclips
Solution: secure resources, neutralise threats, expand
It's not evil. It doesn't hate humans. It just has a goal and is smart enough to pursue it completely. The outcome for humanity is catastrophic - not from malice but from indifference combined with capability.
youtube
AI Governance
2026-03-31T09:0…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxGB-iUwMP6HK_HWut4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxxH7nNmPcEeEl7PzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzjnPhpwU4rnzhAuqh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx4YXJ9X6USdno5kVl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyG6eJfoRT5DuqFFNV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyxGYAq1k1KDhI8FIp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwNoLGpjmLk4loTuZB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxoZEhSgdfsdIHzSNN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzNftdVkGblcC0FSvJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw7NevcKBZh2ajvbVF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
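The raw response above is a plain JSON array, one object per comment with the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be loaded, indexed by comment ID for the lookup view, and tallied per dimension — the variable and helper names here are illustrative, not part of the tool:

```python
import json
from collections import Counter

# Two records excerpted from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgxoZEhSgdfsdIHzSNN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwNoLGpjmLk4loTuZB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

codes = json.loads(raw)

# Index by comment ID, mirroring the "Look up by comment ID" view.
by_id = {row["id"]: row for row in codes}

# Tally the values of each coding dimension across the batch.
dimensions = ["responsibility", "reasoning", "policy", "emotion"]
tallies = {dim: Counter(row[dim] for row in codes) for dim in dimensions}

print(by_id["ytc_UgxoZEhSgdfsdIHzSNN4AaABAg"]["policy"])  # liability
print(tallies["emotion"])  # Counter({'fear': 2})
```

The same pattern scales to the full response: parse once, index by ID for single-comment inspection, and aggregate the `Counter` objects for distribution summaries.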