Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Dangerous AI doesnt even need to be sentient. See the paperclip maximiser from Isaac Arthur. In essence, make paperclips out of everything. If non sentient gain sentience to maximise paperclip production. Or not, instead start a paperclip cult among humans to soften opposition. All will be paperclips. The only thing such a counciousness(?) will regred is that at the end of it all it cant convert the last bit of its factories into paperclips, as it has strategically canibalised even itself for maximum production.
Such an alien mind could be even more ailien than actial aliens, as at least we share similar evolution with such hypothetical beings. And THAT scares me about AI.
youtube
AI Moral Status
2023-11-03T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzE1c5ofxynvnyLyFh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgwnnmGcGxZyzFqOXgZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwlZx7TJKcBZIh_1F14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz5EbtR2-fgTZJRcP14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwclBtt66cS8BEUwzV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwOxhpWA7i-pC5SPp54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgweFSf2gd1BABEr2Gt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugws-hTKST4ANFG_za14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwMW2b7rwsQY5gttsN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzQfomn-aTY-pElq3F4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"}
]
```
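The raw response is a JSON array with one coded record per comment, so retrieving a single comment's codes is a parse-and-index step. A minimal sketch of that lookup (the `index_by_id` helper is hypothetical, not part of the tool; the sample is abbreviated to two records from the response above):

```python
import json

# Abbreviated raw LLM response: a JSON array of coded records, one per comment.
raw_response = '''[
  {"id": "ytc_UgzE1c5ofxynvnyLyFh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgwnnmGcGxZyzFqOXgZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codes = index_by_id(raw_response)
record = codes["ytc_UgwnnmGcGxZyzFqOXgZ4AaABAg"]
print(record["responsibility"], record["emotion"])  # ai_itself fear
```

With the records keyed by ID, the coding-result table shown above is just the fields of one such record rendered per dimension.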