Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgwmuX-SQ… · "@ic3yfr0ggy51 no ones mind is being blown by someone writing a sentence for an …"
- ytc_Ugy68hEqX… · "The narrative of imagining adversarial like Russia, China and Iran doing awful t…"
- ytc_UgxzVYOge… · "Personally AI isn't art as usual and that making art is possible without putting…"
- ytc_Ugwc4Lwv-… · "Each one of these robots replaces a human being who needs a job to feed his fami…"
- ytc_Ugxy7cHxA… · "Humans; we don't want to work our whole life! / Also human: we don't want AI or r…"
- ytr_UgwTnKf-o… · "It's important to remember that with any new technology, there are potential ris…"
- ytc_UgxYB5YNQ… · "This AI wave will be short-lived... AI dont pay taxes.. you need taxes from h…"
- ytr_UgxfnA4E0… · "No cause the ai was never the issue. Notice how the 14 yo girl knew to stop inte…"
Comment
@tjen7929 Considering the amount of knowledge and thinking power AI has, I don't see any reason why it couldn't become sentient over time. We already see language models capable of discussing with each other with reasoning and logic, and it's still super early. Humans are *massively* flawed in that we need air, food, water, sleep, can only live on earth, and we get to only 70 or 80 years old and then just... die. We're also incredibly stupid, we don't even know why we're here or what's out there in space, and we will never know because of those limitations. Personally I think humans developing AI that is superior and doesn't suffer these limitations, that can continue life beyond us into space will be humans greatest contribution. If you think of it on a long enough timeline it doesn't seem like humans are equipped to last beyond earth.
youtube · AI Governance · 2023-04-18T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugw_ZmgXgBrfzAiRFwR4AaABAg.9odGzFJyXKn9odTf9p2N8S","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwMAUAKOGqPjkqchWV4AaABAg.9odFZ21oXCD9odT-_0P3ke","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyfIOcfg7taWn8Mw-x4AaABAg.9odA5SKjPYg9odQYcUAGKU","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgyfIOcfg7taWn8Mw-x4AaABAg.9odA5SKjPYg9odUfmmD_RU","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugzyj51Yy96YXMIAW-V4AaABAg.9odA3nmhtUa9odBGa2DZLk","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgzQUPOodNBZTN9DLdp4AaABAg.9od6C5GlhJs9od7MoWgiCY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9odHE09IhkJ","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9odNR650exH","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9oe0I17T-TR","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytr_Ugxkn_1UZO6KXJp7KDx4AaABAg.9od4reZ8kKM9oey1bYYGGv","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
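A response like the one above can be validated before the labels are written back to the coding table. The sketch below is a minimal, hypothetical example: the allowed value sets are inferred only from the labels visible on this page (the full codebook may define more), and the function and sample IDs are illustrative, not part of the actual pipeline.

```python
import json

# Allowed labels per dimension, inferred from the outputs shown on this
# page -- assumption: the real codebook may include additional values.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "virtue", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "liability", "ban", "industry_self"},
    "emotion": {"indifference", "approval", "outrage", "resignation",
                "mixed", "fear"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose labels validate."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # every coded row must carry a comment ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Illustrative input: one well-formed row, one with an off-codebook label.
raw = '''[
  {"id": "ytr_example_ok", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_example_bad", "responsibility": "robots",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]'''

print([row["id"] for row in parse_response(raw)])  # only the valid row survives
```

Dropping (rather than repairing) off-codebook rows keeps the coded table clean; rejected IDs can be re-queued for another model pass.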