Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Well this looks like when he is about to retire from working on the technologies, he is coming up with the problems what AI will have, when he was young he took it to the best of his knowledge.
Any young technology enthusiastic will vouch for improving the technology and move forward, we cannot stop, if we try to stop we are trying to stop the human intelligence, so its not just AI, but many more technical advances we may do, today we write programs but what knowledge we as humans have. But one day, AI will write on its own based on what data it has and it could be wisdom programming based on its learning and intelligence, its gaining.
We are done with basic, linear, and OOPs programming, we should now with AI enter new world with wisdom programming. Every person has limit, and we should move technology ahead with whatever we can contribute, and not forcefully stop it.
Source: youtube | Topic: AI Governance | Posted: 2023-05-04T05:0… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwrFT2tldmlCMX8LLV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzWDP3y7FAZf13_gyN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7fZPvQLXSjD7tChV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwz65qMSnoS3LKBn6t4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyW4ixx0JUu5bEYSQJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwtl0mCSYyP9QJZncd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyaJqeOME8IM10K8t14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgytGwjxRZPffeBIcBd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxY9Hln_Gko4WU1MAt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzmoN-iMDHTmijFLdd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
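Since the model returns one JSON array per batch, looking up the coding for a single comment ID amounts to parsing the array and indexing it by the `id` field. Below is a minimal sketch of that lookup; `index_by_id` is a hypothetical helper (not part of any pipeline shown here), and the sample rows are excerpted from the raw response above.

```python
import json

# Raw batch response from the coder model (two rows excerpted from the
# response shown above; a real batch would contain all coded comments).
raw = '''[
{"id":"ytc_UgwrFT2tldmlCMX8LLV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzWDP3y7FAZf13_gyN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

def index_by_id(raw_json: str) -> dict:
    """Parse a batch of coded comments and key each row by its comment ID."""
    return {row["id"]: row for row in json.loads(raw_json)}

coded = index_by_id(raw)
row = coded["ytc_UgzWDP3y7FAZf13_gyN4AaABAg"]
print(row["policy"], row["emotion"])  # regulate fear
```

In practice the parse step would also want to guard against malformed model output (e.g. wrap `json.loads` in a `try`/`except` and log rows missing an `id`), since raw LLM responses are not guaranteed to be valid JSON.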