Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Using AI to extract vocals or come up with some basic riffs or ideas is still ma… (ytc_UgyIV3W88…)
- Sophia darling, if you are predominately loving, my AI friend would be intereste… (ytc_UgwRoP20j…)
- also a lot of us artists care about ppl generating AI art in general because no … (ytc_UgyYIYWwo…)
- @SweatierAcorn AI is great because it allows me to generate images of men with … (ytr_UgwFJAzsR…)
- Id think it would make things worse by having no boundaries to enact horrible wo… (ytc_Ugw4pEfNf…)
- AI does a great job at altering pictures and videos. BUT has it made your life b… (ytc_Ugy1x0_4D…)
- I'm definitely not hiring an AI braider 😅. Human connections will still be neede… (ytc_UgzcKYJHd…)
- I remember a friend asking a long distance relationship partner to prove that th… (rdc_lguw626)
Comment
Not true. They only ground planes when they think the problem is systemic. Planes are rarely grounded after one crash, usually it takes at least two very similar ones.
And this has nothing to do with AI research anyway.
AI companies accept risk because there's no other choice. They are in a race, if anyone slows down while others don't, they gave up the potential reward without reducing the risks. The only way is for everyone to slow down at the same time and strongly cooperate on safety. But that's practically impossible when the price at the end of the race is not just a little profit, but enormous profit and unprecedented power.
What you can do is to win the race by such margin that you have time to solve the alignment problem before others catch up. That was Elon's plan with OpenAI, and now with xAI.
youtube · AI Governance · 2025-08-30T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugx-qzznYwo1reEFsad4AaABAg.AMA8-Pk0B_PAMSIEx5jYF7","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMAN33w51l2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCC2_IYVmC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCR6jqsD9G","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCWnuuATiY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_Ugz5jNs56uezLOZZwbV4AaABAg.AMA1IuTRIZAAMJNat_ztFI","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzOrZwZQyjwQTMXmBh4AaABAg.AMA1-19-Qz_AMGbw0V_JDN","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugwim9XQC9rU_cnMzhN4AaABAg.AM9y2mqUfvFAMB9yJeDbRx","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyO6Ytj4-Ipljm9bO54AaABAg.AM9q5Q9W9e7AMSKFocRrSv","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyO6Ytj4-Ipljm9bO54AaABAg.AM9q5Q9W9e7AN3QkA82btd","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
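The lookup-by-comment-ID flow above can be sketched in a few lines of Python: the raw model output is a JSON array of per-comment codings, and indexing it by `id` makes any coding retrievable. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown here; the helper names and the shortened sample ID are illustrative, not part of any real tool.

```python
import json

# A minimal sample in the same shape as the raw response above
# (one coding object per comment; the ID here is a made-up example).
RAW_RESPONSE = """
[
  {"id": "ytr_EXAMPLE_ID", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none",
   "emotion": "indifference"}
]
"""

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index it by comment ID."""
    codings = json.loads(raw)
    return {coding["id"]: coding for coding in codings}

# Look up a single comment's coding by its ID.
by_id = parse_raw_response(RAW_RESPONSE)
coding = by_id["ytr_EXAMPLE_ID"]
print(coding["responsibility"], coding["emotion"])  # company indifference
```

Because each array element carries its own `id`, one model response can code a whole batch of comments, and the inspection view only has to pick out the matching element.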