Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
It was a maistake to make Ai public beyond text generation. Everything else shou…
ytc_UgxKA01FG…
The reason not to stop AI research isn't just money. It's also all the positives…
ytc_Ugxll8Gz7…
No matter what policies you have, whether it's with UBI or tax benefits, the fac…
ytc_UgxLHt8Wz…
Fun fact! We dont actually know how ai works! It just sorta does. (We do know ho…
ytc_Ugxb_VwLJ…
29:11
'It's giving up!" Lolololol
I wouldnt dare sit in any of these self-drivi…
ytc_UgyRcQH_2…
we are at the beginning of a war for our art and soul against the use of a.i in …
ytc_UgypbKULU…
I like AI art because I don’t have to see artists on X anymore.
Every time they …
ytc_UgzKPiJj5…
If this was ever real in the future they should give the human fighters brass kn…
ytc_Ugz9jxtdc…
Comment
Clearly future war has to be by using intelligence Than physical. Using different products controlled by central products with certain autonomy. If we are smart enough positively there won’t be any human death before a war would be decided.
We are still making AI robots like ourselves and both physically and functionings. They are made to communicate like human ( but is it necessary???)many animals communicate,I feel better/differently ( can these robots made to communicate with animals or plants and we can understand them???but can robots collaborate with animals and plants to do same things we do to animals/plats???).
Can robots receive and understand evidences of past and future ???
And many more... we might have lots of information coming from extra/intra terrestrial informations which we can’t decode.
youtube
AI Moral Status
2021-02-24T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
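The coding table pairs each dimension with the value the model assigned. A minimal sketch of how such a table could be rendered from one coding record; the dimension field names are taken from the raw JSON shown on this page, and the helper name and timestamp handling are assumptions, not the tool's actual implementation:

```python
from datetime import datetime, timezone

# Dimension keys as they appear in the raw LLM response records.
DIMENSIONS = ["responsibility", "reasoning", "policy", "emotion"]

def coding_table(coding: dict) -> str:
    """Render one coding record as a markdown table like the one above.

    Missing dimensions fall back to "unclear", matching the value the
    coder uses when a comment gives no signal.
    """
    rows = ["| Dimension | Value |", "|---|---|"]
    for dim in DIMENSIONS:
        rows.append(f"| {dim.capitalize()} | {coding.get(dim, 'unclear')} |")
    # "Coded at" records when the table was generated, in UTC.
    rows.append(f"| Coded at | {datetime.now(timezone.utc).isoformat()} |")
    return "\n".join(rows)
```

Rendering the record coded `regulate`/`approval` above would reproduce the same four dimension rows, with only the timestamp differing per run.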
Raw LLM Response
[
{"id":"ytc_Ugw5tHab2-83oGPjgnF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyyGf6X6_eBl6o73px4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwRBk8K_JUgBKxUWl54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgyCQaI0IxHmfTbkLDx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxA_4w0C2cBJFPNFM54AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy3av42gVhMrrz6JBN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyJ9FnMC5A1uhW_--R4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw8sD4r0DoKcic12454AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzYfIxvWFZIXNNxGud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzj2h-seD4ey7FO7Ox4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
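The raw response is a JSON array of coding records keyed by comment id, which is what makes the "look up by comment ID" view possible. A minimal sketch of that lookup, assuming only the record structure visible above (the `index_by_id` helper is illustrative, not part of the tool); the sample id is taken verbatim from the response:

```python
import json

# One record excerpted verbatim from the raw LLM response above.
raw = '''[
  {"id": "ytc_UgyJ9FnMC5A1uhW_--R4AaABAg",
   "responsibility": "unclear", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "approval"}
]'''

def index_by_id(raw_json: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment id."""
    return {rec["id"]: rec for rec in json.loads(raw_json)}

codings = index_by_id(raw)
print(codings["ytc_UgyJ9FnMC5A1uhW_--R4AaABAg"]["policy"])  # prints "regulate"
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, so a real pipeline would wrap the parse and flag that batch for re-coding.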