Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
How much did they pay you bro? Don’t throw away your credibility over $$ because…
ytc_UgzEgsWV-…
As an Artist, I feel that AI ART can be useful in making art in one sense: makin…
ytc_UgxEMnhjP…
Last year I started realizing we are no longer the decision maker concerning man…
ytc_UgzbcmFIj…
Remember, once ChatGPT was asked to tell a person Windows 11 keys but it said "n…
ytc_Ugz80TgFZ…
Nope, I don’t need 100% reliable, I’ll take the safer record of Waymo before I g…
ytr_UgwSYprJ1…
robots: *unplugged*
other robot: A MURDERRER!
me, an intellectual: it isnt dead …
ytc_Ugx8d1lOl…
Not unexpected. Chevron filed a loss a few days ago which was its first in 14 ye…
rdc_czlorrw
Grok, Claude, and Gemini are the coolest ai’s I know, but ChatGPT and deepseek, …
ytc_Ugy9ukxjg…
Comment
> Nope. AI would be turned off well before it got to the stage of humans no being able to pull the plug. Although, some people will undoubtedly be harmed before the plug is pulled.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2023-04-18T12:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzjp48MU4aVTY7qgU14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgygyAcocGEuLw5bpIR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwnF99UcYHoSM-CMux4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyWiXWVGDTw3wbDjcR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxJiH-PJKGLw3Xed-Z4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwIdLO3GyQMrSTsIQF4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxzhcbsLYO_8Ac7Hp54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy2_ImFW9MgK1NE2qZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyTz8O2Hk8MWNxqbdl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxTiA8xhvhcwTElNpl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
```