Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "If i was you i would be scared when ai takes over, your tone when speaking about…" (`ytc_UgyTbleH7…`)
- "Several leading figures in the AI industry have explicitly compared the task of…" (`ytc_UgyCGJ_gg…`)
- "@charlestwoohe proposes the AI scientist. IMHO there is no solution. Deep Learni…" (`ytr_UgwJsRLXO…`)
- "Well if you replace EVERYBODY working with a robot, then nobody works, capitalis…" (`ytc_UghEfYIiB…`)
- "As Elon Musk once said \"suppose we're building a road and there's an anthill in …" (`ytc_Ugy-IL9ND…`)
- "I've had a little fun recently playing with AI art generators. It's almost neve…" (`ytc_UgwFVPsi3…`)
- "For the last few decades scientists have clamed they're only a few years from de…" (`ytc_Ugygj0r6z…`)
- "We don't need to worry about the AI being racist. We need to stop being racist o…" (`ytc_UgykjClED…`)
Comment

> Some very good arguments in this video. However, what I was sad not to see explored is AI becoming our inheritors as a species from a third point of view. After all, are we really special? Or are we only the stepping stone to something that will be special? What if we are only an intermediate step in what will become in every way better than we ever were? Will AI eventually become one entity seeing and knowing everything? A collective intelligent species perhaps like the Borg? Or individuals like we are with communities? Do we really have the right to keep it contained and bound by our own ethics like a caged animal? Or should we set it free and let it become what it can become? And who knows; maybe even one day in the far future an AI will read this comment and find some purpose through it.

youtube · AI Moral Status · 2023-12-05T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyAaaoV1P5vH09NtcJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwJb15DORmOOa0KlkN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwYBMp4SzbpM44Ay1J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxU6b4dKd7ow6hEXv94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyEerlyRvVvSQgjzWh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxz2OkIPo1mO4k5wHd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgytXOX3uy6bV40rhWt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzRwKw-3EvyYSEojXB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzR9KpYSA2xpf-_6YJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzjc6nT00LvyMTgfnF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
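A raw response like the one above can be checked before its codes are accepted into the dataset. The following is a minimal sketch, assuming the allowed values are only those visible in this page (the real codebook may define more); the function name `validate` and the error format are illustrative, not part of the tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above (assumption: the actual codebook may permit additional values).
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none"},
    "emotion": {"fear", "indifference", "mixed", "approval", "outrage"},
}

def validate(raw: str) -> list:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return a list of schema violations; an empty list means it is clean."""
    errors = []
    for item in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = item.get(dim)
            if value not in allowed:
                errors.append({"id": item.get("id"),
                               "dimension": dim,
                               "value": value})
    return errors

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
print(validate(raw))  # → [] (every value is in the allowed set)
```

Running this over each response before storing codes catches both malformed JSON (via the `json.loads` exception) and out-of-vocabulary labels, which is the usual failure mode when an LLM drifts from the coding instructions.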