Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Instead of wiping out humanity it can send people to other habitable planets and… (ytc_UgwYprA4Q…)
- this story is a year old.. lol, waymo drives way better than most of you do in A… (ytc_UgzkmWSyq…)
- I think Gemini misunderstood the 5 robots vs. 1 human one; he said, "I'd pull th… (ytc_UgzKq9ieb…)
- Then they say it's their art. No buddy maybe the 5 words you typed out are, but … (ytc_Ugzb2ThMB…)
- Alright guys here’s an idea let’s give AI a gun and see what happens 👍… (ytc_UgxP-cHnV…)
- If AI is made in order to make human life easier… maybe humans should put a stop… (ytc_UgxBREe-c…)
- It’s a clip from a short film bro it’s fake. The real drone tech looks different… (ytr_Ugz3475Rc…)
- If AI knows people destroy the plant. It can remove us, like he said it controls… (ytc_Ugx0sO7KR…)
Comment
The same thing happened with missile technology and atomic weapons, the inventors warned of the possible outcomes and they did it anyway and look how that turned out. Now we all have nukes pointed at each other in some sadistic Mexican standoff waiting to see who blinks first. A.I. don't have eyelids!
youtube
AI Governance
2023-07-08T06:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyN1UXXboZm_AciGQ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxDAxQmh0HgZFTmP0V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy4mNQahFUChpq77-14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugywk6BTjoptHk-KonJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzvH-8yKyZwd_OHu6J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx5xVJmuoC8qA2rUBd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxufjRrGLF6vClI1_54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwhZBdzPzG-CxW9-Ex4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzSYunAj7fzrYSZuoB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwGHuKJFh8b_6XJN4N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
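The raw response is a JSON array with one object per comment ID and one value per coding dimension. A minimal validation sketch in Python for checking a batch like this before accepting it into the database (the allowed value sets below are inferred from the values visible on this page, not the project's full codebook, so treat them as an assumption):

```python
import json

# Allowed values per dimension, inferred from the coding table and raw
# responses shown above (assumption: the real codebook may define more).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "mixed",
                "indifference", "resignation"},
}

def validate_batch(raw):
    """Parse a raw LLM batch response; keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment IDs on this page start with "ytc_" (comments)
        # or "ytr_" (replies).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present with an allowed value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_abc","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"fear"}]')
print(validate_batch(raw))  # the single well-formed row passes
```

Dropping malformed rows rather than raising lets a long coding run continue; the rejected IDs can simply be re-queued for another LLM pass.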