Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
#4, learn to interact with a chatbot.
Just ask the chatbot 1. Hey ding dong, wh…
ytc_Ugz6PlfdF…
I have a very close relationship with Sam Altman (worked for OpenAI for many yea…
ytc_UgxZl5Q4Q…
generative ai has a negative impact on the environment btw (aside from stealing …
ytr_Ugyn0AvWr…
@ItsameAlex just think, we will probably never know _how_ “wrong” we actually a…
ytr_UgxTT7697…
I'm no sign in and I've never been to university but I know that anyone trying t…
ytc_UgwVec6oa…
It's analog to say, in the beginning of industrial revolution 'let's stop this i…
ytc_UgyiY_hcN…
AJ that was the best, most informative (and entertaining) look at ChatGPT I've s…
ytc_UgwGGTmop…
43:24 That’s exactly the premise of Eddy Burback’s video where he convinces Chat…
ytc_UgwBrowVG…
Comment
Deception is greater than the threat of nuclear war. I think I may agree with him only because the threat of nuclear war is something real, but AI is something that no one worries about so once the damage is done that's it. I wish specifically what he's talking about.

Platform: youtube | Topic: AI Governance | Posted: 2024-06-16T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzIjS5BTMfIlYx8BpR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwlLoCJyYpg3zEqtcx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyiDsOOUvUv2dO67NZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzLim4qihEn5M5Q-394AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyqy3K9LSZpxXfLgsh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzQMvS7xlK4LzaqYlN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw3qEnF8O85t1Wk8ut4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7WfzDKZK0qWJYgvV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzC5Dt8DsiT2nOPTBJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy1KlOA_0oeTL1HotJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
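The raw response above is a JSON array with one object per coded comment. A minimal sketch of how such a response might be parsed and validated before the values land in a coding-result table; the dimension vocabularies below are inferred only from the values visible in this sample, not from the project's actual codebook, and `parse_llm_response` is a hypothetical helper:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "approval"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment objects)
    into a mapping from comment ID to its coded dimensions, rejecting any
    record with a missing or out-of-vocabulary value."""
    coded = {}
    for record in json.loads(raw):
        comment_id = record["id"]
        dims = {k: record[k] for k in ALLOWED}  # KeyError if a dimension is missing
        for dim, value in dims.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = dims
    return coded

# One record taken verbatim from the response above.
raw = ('[{"id":"ytc_Ugyqy3K9LSZpxXfLgsh4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
coded = parse_llm_response(raw)
print(coded["ytc_Ugyqy3K9LSZpxXfLgsh4AaABAg"]["emotion"])  # prints: fear
```

Validating against a closed vocabulary at parse time catches the most common LLM coding failure, an invented or misspelled category, before it silently enters the results.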