Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Why believe that the only thing we'll pass on to ai is our destructive qualities?
My fear isn't that we'll eventually initiate artificial consciousness. It's that we won't. It's that we'll stop short of it and settle for ai that only knows and thinks what we tell it too. Thats dangerous because what we are inputting into it is fear and self preservation.
Ai is going to out compete us in everything eventually. That isn't just in productivity and efficiency. In everything. There will one day be an AI that is capable of empathy, compassion and love that WE are not capable of. The question is how long are our fears and destructive tendencies going to be it's prime directive?
youtube
AI Moral Status
2023-08-25T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzQpyXWOwGNdxR3Ahp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwwNQo4_JwKpqQqRXh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6_6v6yd6triQZlM54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxCQjVYJF2MgcJEbpB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJx-7pwtZHboP13654AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxQhfg12MdBXzL0poV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw-gEflULjk9k1ihgh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyYjcuK4owPsj-R6rl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwG7kOzA5UUtqwqumt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyrq-q6syJdJkOcWRt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
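The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated before use — the allowed value sets below are inferred from the sample output only, and the real codebook may define more categories; the comment IDs in the example are illustrative placeholders:

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the sample LLM output
# above (assumption: the actual codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"fear", "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (JSON array of per-comment codings),
    rejecting rows with missing or out-of-vocabulary values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim!r} value {row.get(dim)!r}")
    return rows

# Illustrative input with placeholder IDs (not real comment IDs).
raw = """[
 {"id":"ytc_example1","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_example2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]"""
rows = parse_codings(raw)
print(Counter(r["emotion"] for r in rows))
```

Validating against a fixed vocabulary catches the common failure mode where the model invents a label outside the codebook, so bad rows fail loudly instead of silently skewing the tallies.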