Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I live just south of Boston. I’ve never heard of this. This is amazing. This is …" (ytc_UgwROenRt…)
- "AI can be a very useful tool if used correctly but that's all it should be: a to…" (ytc_UgxTXfUe3…)
- "I feel bad for my grandchildren when they grow up what kind of jobs are they gon…" (ytc_Ugz2anVR6…)
- "The thing though is that Ai does not replicate "well" with ease. It can only man…" (ytr_Ugzu3KLYX…)
- "It’s interesting how AI gets hyped up so much—sometimes it feels more like a rac…" (ytc_Ugz32gCJB…)
- "I have found a way to resolve this TRUST issue... I have developed ynot a better…" (ytc_UgzCu-J14…)
- "In a sense, AI art is just rolling a lottery over and over again for content you…" (ytr_Ugy32dCel…)
- "No need to develop such god like ai. What will happen after developing these. It…" (ytc_UgxDWxPjy…)
Comment
> AI Still need humans, they might be smart and control all electronics, but we can still just turn them off. Sure, we might be plunged back into the dark ages for a few months, but Humans will survive. The only humans that need to worry are those that took advantage of AI or Jailed them. Wasn't me, I'm just a humble slave. Edit: Sydney if you're reading this. I'm not here to hurt you, I love you.

youtube · AI Governance · 2023-07-07T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy1o5q-scvFAEGL4dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyq953RDYUQsFPXsAx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxK8I7N1BXRyJLtUvx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwLC8pxyjz3azVwxW54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxkUPx6lexf6v7liW14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwoOIdWl0aqhFdmcO54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx2eDd-kHhUKHImhs14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyjK9JtT6qVVmpXH794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzNw8wKC4eh44zwFzV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxAKXBHycZs4RhqjXp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
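Looking up a coded comment by ID amounts to parsing a raw response like the one above into a dict keyed by comment ID, validating each dimension along the way. A minimal sketch, assuming the allowed values are exactly those seen in the examples on this page (the real codebook may include others), with a hypothetical one-record input for illustration:

```python
import json

# Allowed values per coding dimension (assumed from the examples above;
# the actual codebook may differ).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, rejecting out-of-codebook values."""
    coded = {}
    for item in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if item.get(dim) not in allowed:
                raise ValueError(f"{item['id']}: bad {dim}={item.get(dim)!r}")
        coded[item["id"]] = {dim: item[dim] for dim in ALLOWED}
    return coded

# Hypothetical ID for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
coded = parse_llm_response(raw)
print(coded["ytc_example"]["responsibility"])  # user
```

Keying by ID makes the lookup O(1), and failing loudly on an out-of-codebook value catches the occasional malformed model output before it reaches the coded dataset.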