Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "@howdareyouexist you seriously think that? Take music AI. They don't dare try to…" — `ytr_UgwEPDhgm…`
- "I used to like watching Dr. Eric Berg as a young adolescent. I came back to see …" — `ytr_Ugwi5xWb_…`
- "1:31:00 This was the original plot of The Matrix. Humanity was used as a compute…" — `ytc_Ugx1Praam…`
- "They have told us the power needed to run all this AI future will need to be dou…" — `ytc_UgyCRixif…`
- "I’m usually against AI and don’t use it in my daily life, but I do sometimes use…" — `ytc_Ugy01Hv9W…`
- "Maybe cutting on tv news would help more? I selected few channels for news, talk…" — `ytr_UgyiwuzZ9…`
- "If you ask a researcher to cite a famous quote, they can. If they need to cite s…" — `ytc_Ugy0zjhwU…`
- "There is no way we would ever have a successful global effort to do something ab…" — `ytc_Ugx_3NnE7…`
Comment
If Elon and all the other signatories will cease their AI training during the moratorium, I can get behind the practical effects of what the letter is asking. If they truly believe the threat is that significant, they should be willing to forego their own advancements while their competitors are in a holding pattern. If they are not willing to pause their own advancements, then they don’t actually believe their concerns are as founded as they let on. And I’m an Elon fan and tsla hodlr.
youtube · AI Governance · 2023-03-30T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwX1xYnIHjHTx9XDyZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxYwY5t9BS3npIMR-x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzCX5jx9mry1tknUG94AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzBWVL8bQXTItp50kV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxZvMWxavv5ve7WFuh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxR1mryg_4K9SUkUGZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5yMG25in0rar6Hg94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz0ls2EJ26blnHcQD94AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyYuERQswURjCQ6MGV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxGhWY3iwyrAAINR1N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
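A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values for each dimension are inferred only from the samples visible on this page, so the real codebook may include additional values.

```python
import json

# Dimension vocabularies inferred from the sample output above
# (assumption: the actual codebook may define more values).
ALLOWED = {
    "responsibility": {"developer", "government", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"fear", "mixed", "outrage", "approval", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

raw = '[{"id":"ytc_X","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]'
print(len(validate_response(raw)))  # → 1
```

Rejecting out-of-vocabulary values at this stage keeps malformed or hallucinated codes out of the downstream tallies instead of silently counting them.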