Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Interestingly, this AI post getting attacked has served a purpose: a basic idea …" (ytc_UgxHw52qC…)
- ">LLM hallucinations are the events in which ML models, particularly large lan…" (rdc_lvc52u4)
- "@yourlocal_animator lmao, sure if you just ask it to make ya something, but it s…" (ytr_UgxqgjKww…)
- "You've got a language model that can roleplay. Ask it to be a pirate. It will …" (ytc_Ugx5QGGy_…)
- "Intern engineer said his older brother is a full stack UX software engineer. He …" (ytc_UgyIl39mm…)
- "Solution: Hardwire into the core program the 3 laws of robotics 1) a robot may…" (ytc_UgzXNYI3o…)
- "Hey, look up the definition of art. I’ll do it for you because you probably can’…" (ytr_Ugzgs_fF0…)
- "@WarriorPaxo If only you don’t call your system “full self driving”. If only fal…" (ytr_UgwZ0QMTS…)
Comment
Tax robots, and robotics, data centers, etc. Have to consider AI and robots and robotics and machines as employees of the people and tax them. Then, introduce universal basic income and food stamps etc., however, the catch is they have to be volunteering and educating themselves and improving their skill set to collect it, unless of course disabled, then it’s a matter of degree of disability. So, make AI and robots etc. safe, but, regards to losing jobs, tax the technology while introducing universal basic income tied to the things mentioned.
youtube · AI Governance · 2025-09-08T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyZMSTSwiVlaIi_TCp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgynUnUd6bMRiqsJL-14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxxX0Nnd5jZsKT7bk54AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzucEc2uEoXs-EGNIp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugyzwk7E2ev5VLv-urh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxVhJhbWUINJpAAix54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgydzgGhUd1NpjKg5rR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugxn76bxUFHvsFd5UM14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx2cj-UrKkXcZ2T2jt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzIRPg6ds4IcRlYisR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
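A batch response like the one above can be parsed and indexed by comment ID, which is the lookup the tool performs when you inspect a coded comment. A minimal sketch, assuming the coder always emits exactly the five fields shown; the function name `parse_codings` is illustrative, not part of the tool:

```python
import json

# Two records copied from the raw response above, for illustration.
raw = '''[
  {"id":"ytc_UgyZMSTSwiVlaIi_TCp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyzwk7E2ev5VLv-urh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]'''

# The five fields observed in every record of the raw response.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codings(text):
    """Parse a batch coding response and index the records by comment ID."""
    records = json.loads(text)
    by_id = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            # Fail loudly on malformed model output rather than coding silently.
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id


codings = parse_codings(raw)
print(codings["ytc_Ugyzwk7E2ev5VLv-urh4AaABAg"]["policy"])  # regulate
```

Indexing by ID first makes the "Look up by comment ID" view a single dictionary access instead of a scan over the raw list.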