Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or browse the random samples below:
- ytc_UgwTgGabt…: I know Charlie will never read this, but, you do not have to be "talented" to cr…
- ytc_UgwNOdi9t…: This is what my parents will never understand. And besides the fact that my curr…
- ytc_Ugw8CPajn…: How is AI going to replace all cooks, all cleaners, all teachers, all policemen …
- ytc_UgxipXuw1…: What’s even scarier than this with AI is that Mr. Booker wants a new agency. A w…
- ytc_UgxheyOVH…: I have autism, and I’m not the biggest fan of AI. Especially when it steals art.…
- ytc_UgxRLUCks…: Maybe I'm wrong but like, to me it seems that if we just... don't give AI physic…
- ytc_Ugy9h_j-o…: You’re telling OpenAI and they’re keeping all of it just in case the New York Ti…
- ytc_UgzPX-KB4…: She will destroy humans? I hoped to never hear that sentence from a robot in rea…
Comment (source: youtube · video: AI Moral Status · posted: 2025-06-06T12:4… · ♥ 180):

> This CEO is a an idiot. Ive been studying AI behavior for quite some time now. I am. Ot acraid of AI but what scares me are the few CEOs and the few thousands of men who are silently feeding AIs their hidden agendas in codes and complexities. An AI is never evil but their creators are.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwfzznMj700vgcMijV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz5LoVgppplSpd3kb54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzTo86jFx4b1xu46qJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw4lamnOgXSpa_02dR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwJh0Q6jedSMbpoMpl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz1aigq77BHJ8A6MhV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxp9PZVLSj3ia99EPh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwig_lhAEsfwNsuOzh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxCo4A84PI_IcxWVaB4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzQ_mnStdYttJR8R7t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"}
]
```
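The raw response is a JSON array with one record per comment, each carrying the four coding dimensions plus the comment ID. A minimal sketch of how such a batch could be parsed and validated before ingestion; the allowed category sets below are inferred from the sample output above and may not match the full codebook:

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response; keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # A record must be an object with an id and a known value
        # for every coding dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: the second record has an unknown responsibility value and is dropped.
raw = ('[{"id":"ytc_a","responsibility":"developer","reasoning":"virtue",'
       '"policy":"regulate","emotion":"fear"},'
       '{"id":"ytc_b","responsibility":"alien"}]')
print(parse_batch(raw))  # keeps only the first, well-formed record
```

Dropping (rather than repairing) malformed records keeps the coded dataset consistent; rejected IDs can simply be re-queued for a fresh coding pass.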