Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I think there are some valid uses for AI but I hate seeing people advertise them…" (ytc_UgxPMpjyi…)
- "You know what's really funny is Trump's trying to get all our jobs back from for…" (ytc_UgxMxzlp-…)
- "I had the same thought after a podcast was discussing an AI chip for ur brain th…" (ytr_UgyjAEMg3…)
- "The whole driver is in charge, can be applied to cars with Lidar also. No matter…" (ytc_UgzEoM1Y5…)
- "Facebook has a midjourney type product on messenger, go type /imagine on it and …" (rdc_kokss01)
- "talking with a lot of Techno worship guys i can say and confirm by my mediums th…" (ytc_UgxhxHQmM…)
- "@2011hwalker The internet didn't burn billions of dollars on launch. This case i…" (ytr_Ugw8mzFBb…)
- "lol :) If machine takes over the world, Governments know how to stop it. AI M…" (ytc_UgyFRztP_…)
Comment
The biggest issues with open AI is the threat to authorianism and those who make money by controlling distribution and assignment of resources....like politicians and billionaires....they stand to lose biggest. After all who wants to teach their people to be well educated and informed that can only end badly. What would happen if an AI taught everyone to make their own cures for diseases or to 3D print many of their commodities...etc.. We need to be careful how we embody AIs and what affordances we give them but that is the same for biotech and nanotech.
youtube
AI Governance
2023-06-27T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugx3xw4e8ocKyU_ZQVB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwvFU5BKq0WWt53Omp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyKKv9bE8upTa5Sbyl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyVKsTelwk9yglYzrV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxPb-0iY4pi7iqejet4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzHCTk_4zxgU8wyWEp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzzypf_v-asdoe7_Nh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxIX-TJyadxzvqSmI94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxfe8sR2whVY9uI4Jd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwETgxNS3mPOdvrjtV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
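The "Look up by comment ID" feature above amounts to indexing the model's JSON array by its `id` field. A minimal sketch of that lookup, assuming the raw response is a well-formed JSON array in the format shown (the two entries below are copied from the array above; the variable names are illustrative, not part of the tool):

```python
import json

# A raw LLM response in the format shown above: a JSON array of
# per-comment codings across four dimensions
# (responsibility, reasoning, policy, emotion).
raw_response = """[
  {"id": "ytc_Ugzzypf_v-asdoe7_Nh4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx3xw4e8ocKyU_ZQVB4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coded = codings["ytc_Ugzzypf_v-asdoe7_Nh4AaABAg"]
print(coded["policy"], coded["emotion"])  # → regulate outrage
```

In practice a response that fails to parse (truncated output, stray prose around the array) would need a `json.JSONDecodeError` handler before the lookup step.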