Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
This interview is ridiculous. This is just another fear tactic to implement self serving policy. He pretty much is postering himself up by degrading others at openAi so he can promote his own AI business and use government regulation/lobbying to support himself over others. Look at solar law requirements in California. You think he had anything to do with that? 🙄
youtube · AI Governance · 2023-04-18T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy1yXSq9QA_S5Zh0Ml4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw1aD5wNQOEPQA45EB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWY1ltIrhtZcRccI54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwEs4KRepJrYTgOnLV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxyyoHfeGjorKdzGhl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwxxaW1E0AfiVTJBrB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlkJogxkpjjW37CrJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxRKldcsk09Ai3Bnnh4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz1glUxdIvJjCroRJN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzcrr0Fj-kMBtsPEp14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
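A response like the one above can be parsed and indexed by comment ID so that the coding for any single comment is easy to look up. The sketch below is a minimal illustration, not the tool's actual implementation; the `ALLOWED` value sets are an assumption inferred from the sample output, and the real codebook may define different categories.

```python
import json

# Allowed values per coding dimension -- an ASSUMPTION inferred from the
# sample response above; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed"},
}

def index_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) and
    index it by comment ID, dropping rows with missing or invalid values."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows without a comment ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# One row from the sample response above, used as input.
raw = ('[{"id":"ytc_Ugy1yXSq9QA_S5Zh0Ml4AaABAg",'
       '"responsibility":"unclear","reasoning":"unclear",'
       '"policy":"unclear","emotion":"mixed"}]')
codes = index_response(raw)
```

Validating each row against the allowed categories catches the common failure mode where the model invents a label outside the codebook; such rows are silently dropped here, though logging them for review would be the more careful choice.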