Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Come step into any classroom that did virtual 'learning' during the pandemic and…" (ytc_UgzmaIMVJ…)
- "Guardrails meaning- a few words in description box or in the prompt sections of …" (ytc_UgxM2_RXo…)
- "Why do you think god has you in life or death, did he give you a contract that i…" (ytr_Ugx1wkUBW…)
- "I own a Tesla. The automated system is junk and dangerous. Truck drivers can b…" (ytc_Ugzh6eRFy…)
- "From day one I wished the AI would site it's sources, so I can look at the origi…" (ytc_UgxcRAHuY…)
- "Khan's example about analyzing the Great Gatsby is one of the dumbest, most dang…" (ytc_UgzKA4dyz…)
- "An an AI researcher, I am not happy when I see these videos that cite Hawkins an…" (ytc_Ugyj42kQg…)
- "I look forward to the day when AI becomes powerful, (and accurate ) enough to di…" (ytc_UgzbtI-r5…)
Comment
These AI leaders are blinded by greed, power, money and influence. If AI replaces everyone who will buy products? If juniors are not getting hired who is going to represent our society in the market tomorrow? In our digital world we are creating digital personas more intelligent than us, is this good? All these issues are big, throw in the mix a geopolitical ingredient, is this good? Then there is the question of natural resources electricity, water, pollution, etc. that it takes to maintain this technology?
Why not focus on making AI secure, safe, ethical, and available to all?
youtube · AI Governance · 2026-02-18T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxkeNDMXtbPijAttV94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwTPoh-QPj3qPPkkyx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1UmXow4W-2336UO14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzbdvGDZ67AohYivOd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxH1Ba9ufiwtUJkPPF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyzVbxuWfYt_2-wmPd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzyQrLssUQmiXhJhWl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGqulGF4_qvNvD1d54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxITGIhQwHBneNT2Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgypAAVXd0Tm7mwtt1d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```