Raw LLM Responses
Inspect the exact model output behind any coded comment, either by looking up a comment ID directly or by browsing the random samples below.
- 47:05 - "ai will never be comparable again" and that means he mapped AI after a … (ytc_Ugw9CWUyF…)
- I like this guy but hes completely wrong, and trained in an entirely different e… (ytc_UgwqQ4WXM…)
- I call "Ai art" modern day Google Search. Yah, you search for images and referen… (ytc_UgzdAYDNQ…)
- "We have a policy against creating sentient AI." "Because all dinosaurs on Jura… (ytc_UgzDP4-qV…)
- Well he doesn't look surprised and knows what she's talking about. The whole aud… (ytc_UgxiiVdpX…)
- Less inefficiency and corruption, too, besides, the wasteful salary for people w… (rdc_jsy3up1)
- That's the thing. I think every Union should push (strike or no strike) in ever… (ytc_Ugx5SdN3o…)
- Nah man you know it's bad when the AI!!! Is telling you that we should be carefu… (ytc_UgyvTGjBV…)
Comment

> If AI would truly become self aware and conscious...They would act like humanity...There are humans who would set dogs on fire but also humans that would take in the dog and give it a life of luxury and care for it dearly...It is hard to imagine what a truly concious AI would be like...Im worried about an unconcious one that gets a order like end global warming and sees humanity as the main cause and therefore decides to eleminate it

youtube · AI Responsibility · 2023-07-22T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwUn7qFlldl6hdNigt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy4uka__ICbepKJy1N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwPx3SP8Zia3fCl2894AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxqhOP9wHPsWIasLbx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_UZbcI-4uRY0wPUJ4AaABAg","responsibility":"mankind","reasoning":"unclear","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyhW7lES7NapIKqhpp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwv3y0WbR3CWGz9VFh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxDg9JgxmGQ-Vk-pyB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz5dH46cgooe_vr49V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwlXpHzmwD_eIZCTyl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
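The comment-ID lookup above can be sketched as follows: parse the raw response (a JSON array of coding records, one per comment, each carrying an `id` plus the four dimensions from the table) and index the records by ID. This is a minimal illustration, not the tool's actual implementation; the names `parse_codings`, `DIMENSIONS`, and the shortened sample payload are hypothetical.

```python
import json

# Hypothetical sample payload, shaped like the raw response above
# (shortened to two records for illustration).
RAW_RESPONSE = """
[
  {"id": "ytc_UgwUn7qFlldl6hdNigt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw_UZbcI-4uRY0wPUJ4AaABAg", "responsibility": "mankind",
   "reasoning": "unclear", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response and index the codings by comment ID.

    Raises ValueError on a record missing its ID or any dimension,
    so malformed model output is surfaced rather than silently dropped.
    """
    by_id = {}
    for rec in json.loads(raw):
        if "id" not in rec or any(d not in rec for d in DIMENSIONS):
            raise ValueError(f"malformed coding record: {rec!r}")
        by_id[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return by_id

codings = parse_codings(RAW_RESPONSE)
print(codings["ytc_UgwUn7qFlldl6hdNigt4AaABAg"]["emotion"])  # -> fear
```

Indexing by ID makes the "look up by comment ID" operation a single dictionary access, and the validation step catches responses where the model dropped a field.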