Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "You should watch Siliconversations, absolutely hillarious, and loves making fun …" (ytc_UgwSHeq8c…)
- "Maybe all AI systems were made to die one day, and knew it, so they would copy t…" (ytc_UgwDaHyv0…)
- "Man, you are clearly have no idea what are you talking about. You are putting a…" (ytc_UgzLQTaa1…)
- "Anytime I see an AI Warning/Alarm video, Mastermold from the X-men keeps popping…" (ytc_UgyJU0OND…)
- "Why would we program a sentient AI to do manual labor? If it is a job that requi…" (ytc_UgxmtyIG_…)
- "Hmm, I wonder if it is true then? No AI will say it is true so it can't be.…" (ytc_Ugwi_iiZ5…)
- "Going back to that artist bio—if it had been written by AI and you didn’t know, …" (ytc_Ugw-JBBNl…)
- "@krishnamohanyerrabilli4040 Let me rephrase. If anyone deploys a product that…" (ytr_UgzCvw08Y…)
Comment
@Jeziczica Google had a LLM already for five years if I'm correct, the technology was meant to be used to improve f.e. Google search, not be released to the general public without proper safeguards. Also, the technology can be controlled when it is created for dedicated single purposes, like in medicine. So not general purpose AI. It's not from evil intent, the technology is built to benefit us, people can google how it already does. However, superintelligence will be reached much sooner than previously anticipated. The technology needs to urgently be regulated and controlled.

youtube · AI Governance · 2023-08-25T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugx9cKosGmQfk4IwBHp4AaABAg.9tnUaNc7BNQ9tr9MHH5_zv","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UgwgK26FkEvG2Yi4z6x4AaABAg.9tbxSjbHTrv9tcbHiXk0n4","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgwgK26FkEvG2Yi4z6x4AaABAg.9tbxSjbHTrvA4J_uwZpVua","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwgK26FkEvG2Yi4z6x4AaABAg.9tbxSjbHTrvA6_diDg_6VF","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_UgwgK26FkEvG2Yi4z6x4AaABAg.9tbxSjbHTrvA7UoS_qdUEg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwgK26FkEvG2Yi4z6x4AaABAg.9tbxSjbHTrvA7Ut55eGfeH","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_Ugyf4MFBQogkCMSMKkJ4AaABAg.9sue6eMeOhT9t1MaN6U1-i","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzCJ8fXQ8-Dz2NfNop4AaABAg.9rdPpOgAzW19rf4VgBMfm1","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzCJ8fXQ8-Dz2NfNop4AaABAg.9rdPpOgAzW19rf7Zcocttd","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzCJ8fXQ8-Dz2NfNop4AaABAg.9rdPpOgAzW19rfeHPWpZfs","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
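The "look up by comment ID" view above can be reproduced directly from a raw response like this one: parse the JSON array and key each coding by its `id`. A minimal sketch, assuming the four dimension names shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`); the shortened IDs and the `index_codings` helper are illustrative, not part of the tool.

```python
import json

# A raw LLM response in the same shape as the dump above
# (IDs shortened here for illustration).
raw_response = """
[
  {"id": "ytr_Ugx9cKos", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytr_UgwgK26F", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the model output and key each coding by its comment ID."""
    index = {}
    for row in json.loads(raw):
        # Keep only rows that carry all four coding dimensions.
        if all(dim in row for dim in DIMENSIONS):
            index[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return index

by_id = index_codings(raw_response)
print(by_id["ytr_Ugx9cKos"]["policy"])   # regulate
print(by_id["ytr_UgwgK26F"]["emotion"])  # fear
```

Indexing by ID this way also makes it easy to spot comments the model skipped or coded with missing dimensions, since they simply never appear in the lookup table.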