Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or browse a random sample:

- Anyone else remember the Typing Pool, a Word-processor? Do you even remember tak… (ytc_UgzzzwDyQ…)
- Why fear AI when we are already most likely controlled by AI? Someone said that … (ytc_UgzaAy8ut…)
- Humans are doomed to self-extinction even without AI. It is actually the only ch… (ytc_Ugx2lptWg…)
- Sentience, eh? Sounds right. If we aren't our might (body), and we're not our m… (ytr_UgzeAqjHQ…)
- It's not as if something like this is revolutionary. What we see are are "out of… (ytc_UgzDAKPmt…)
- Blame is on India since Malaysia is allowed to sell to India, that’s in the clea… (rdc_lu9bokj)
- The prime minister of italy had a deepfake made of her and she took the whole th… (rdc_kwbsho0)
- 0:52: 🤖 Creating technology that allows AI Bots to be smarter than us might be t… (ytc_Ugx2AgI7y…)
Comment

> We can't 'key' an AI system to our own needs, if we don't understand what we ourselves want. If you want to make the machine decide what our future is, then you've put the power into the machine's hands. Sure, we can act as guardrails; but if it makes a suggestion that sounds good to us, we won't know if it is the right choice until it's too late.

Source: youtube · Topic: AI Governance · Posted: 2025-12-05T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyUxRpcW3m4Oa8MQOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxNgHvetyJmP3wNpPp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyMDC8m8jDdWEVovjR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz4lg1qUiS4XAVXo-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxRG72du5S7mL9FC2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyQVBtuL3R9eNXG3yt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyq8p95FKR0z5RJl5d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxnrwHG1jZaYPrmbth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzdTgHylS6wkMQUOsd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyK9fPBALTcFds3HAR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
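A response like the one above has to be parsed and checked before the codes land in the results table. The sketch below shows one minimal way to do that, assuming the allowed values per dimension are exactly those seen in the examples on this page (the real codebook may contain more categories), and that `parse_batch` is a hypothetical helper name, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the coded examples above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "mixed"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows.

    A row is kept when it is a dict with an "id" and every coded
    dimension holds one of the allowed values.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Example: one valid row and one with an out-of-codebook value.
demo = (
    '[{"id":"ytc_a","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"nobody","reasoning":"mixed",'
    '"policy":"none","emotion":"mixed"}]'
)
kept = parse_batch(demo)
```

Validating against a closed value set like this catches the most common LLM coding failure, an invented category, before it reaches the stored results.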