Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "thank you for this! what an incredibly insightful video. tis a rarity to see a m…" (ytc_Ugwc5Y-XU…)
- "The autonomous vehicle industry as a whole is much less safe than the public per…" (rdc_f6ycvu4)
- "I am 77 Retired now. I have been working with control systems and safety systems…" (ytc_UgxcY9RWu…)
- "I use a voice to text "AI" system called Dragon DMO to write my medical notes. …" (ytc_UgxqYesxE…)
- "all the shit that would be produced by ai and robots who will buy these goods fo…" (ytc_Ugzxlgy6q…)
- "The concerns Blake is coming with are valid I think but most probably unanswerab…" (ytc_Ugzzac_iU…)
- "I'll admit when I'm filling out contracts I do use placeholder descriptions and …" (ytc_Ugy8pM0jF…)
- "A world government that works for people? We don't have national governments tha…" (ytc_UgzFLnBjv…)
Comment
Here's what I find scary:
Imagine that you're a company that sells 'heavy duty boots' and you ask AI to attempt to generate sales through your Twitter. The AI then considers the alternatives and says 'Hey, you know who uses a lot of heavy duty boots? Soldiers!' and then proceeds to create a deepfake of 'Biden calling Putin a turd' in an attempt to start a war.
That's what's scary to me, that even the most simple and well-meaning task you give to an AI can have disastrous consequences, simply because the AI has no morals or values. Human life means nothing to it unless you specifically tell it to value us.
Source: youtube · Topic: AI Governance · Posted: 2023-04-20T16:3… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwRUIARyGksSfH-awl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzNxxBUQZudeqU6ouF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyEaI4bPC5ElF_aXYB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx-u0Qr2AZBLAwIVJJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwU3WzFZY1FquYg9ZJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyaWU9xZbaOywwD8d54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxmAUNRg0arqjq_AqJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzfwHY0d4DPfcEY0pl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxBPxzTO_lVFozfa4t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyrFnCusyI6_WMdFUd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```