Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- `ytc_UgzXPYQDh…`: "Robot can never destroy humans. Bcuz a robot is made humans ! THE CREATOR CAN NE…"
- `ytr_UgzMpe90K…`: "Up to the point she thinks computer cooling system consumes water. Your car's co…"
- `ytr_UgwMxX2_d…`: "@BTS_ARMY 💜 yes and no. As you can see, if you've read and understood my comment…"
- `ytc_Ugx_aSZ7S…`: "This is just a way to scared people to get views 🤣 Robotics is a very small part…"
- `ytc_UggY2x10D…`: "I don't want a slave robot, is this ok? I got arm to take my groceries by myself…"
- `ytc_UgzBFkk10…`: "AI will be humanitys end because of greed oil gas....etc.Everyone will very so f…"
- `ytc_UgxBsZFIm…`: "Yep... AI will always produce bad art. Probably. But children being born now wil…"
- `ytc_Ugyimxbrq…`: "Ai is a pretty good thing tbh cuz' it can help us alot like artist can get an id…"
Comment
I'm not saying I believe AI will or could stop human ageing. But it is no less speculative than the idea that it will destroy humanity.
How are we supposed to evaluate the probability of this and a hundred other possible outcomes? Yes we would like to figure it all out right now. But we can't. We will have to wait until we at least get primitive versions of technology that could potentially achieve such things. Then we can have an intelligent debate based on actual facts. But no such technology is on the horizon right now.
youtube · AI Governance · 2023-07-17T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytr_UgzZBZa5vsqXpN2YZ2t4AaABAg.9rsOXTlFMvX9sHGTRNB36K", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgyP32EFA3Y5ktq3NCR4AaABAg.9rq9WbI78bQ9rsKHLjQ-rd", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgyP32EFA3Y5ktq3NCR4AaABAg.9rq9WbI78bQ9rsnnSxiPBn", "responsibility": "distributed", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugx9bhtLneJ2aN4J9xl4AaABAg.9rpjteLMIMZ9t1-RcsIlgQ", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugx5LT0M-B6vvyirP9Z4AaABAg.9rohS4EIjnX9s0zPcrwSe2", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzhgbL9ssnNPSoPVXN4AaABAg.9rebYMGdY7H9viUlqHIxPn", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytr_UgwyYrot6kYsPsGLlRR4AaABAg.9reGNNANkzS9s-mgyxbLXl", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwyYrot6kYsPsGLlRR4AaABAg.9reGNNANkzS9s2cWYjwYHB", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxUDcaHQU7hVLpShSp4AaABAg.9rdSkgnSv1F9s3w0HmoPgU", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxGaW9p18AEp5IotE94AaABAg.9rcov6TyeMk9sFJ0z6J2yF", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
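A batch response in this shape can be parsed, validated, and indexed for the lookup-by-comment-ID view. The sketch below is a minimal, hypothetical example, not the tool's actual implementation: the allowed-value sets are only the values observed in this sample (the real codebook may include more), and `ytr_example` is an invented placeholder ID.

```python
import json

# Allowed values per dimension, as observed in this batch.
# Assumption: the real codebook may define additional values.
DIMENSIONS = {
    "responsibility": {"none", "user", "distributed", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "fear", "resignation", "indifference"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses an
    unknown value, so malformed model output fails loudly instead of
    silently entering the dataset.
    """
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Hypothetical single-record batch to illustrate a lookup.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
codings = index_codings(raw)
print(codings["ytr_example"]["emotion"])  # -> indifference
```

Validating before indexing matters here because the model occasionally drifts from the requested label set; rejecting the whole record keeps the coded table consistent with the codebook.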