Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "i didint watch the video, i just read the AI resume made by youtube AI…" — `ytc_UgxENpBsN…`
- "You mentioned sci-fi, and human manipulation by AI, and I've got to say, I've be…" — `ytc_Ugwp3_F1G…`
- "Basically people are gonna be freed up to do more things that weren’t possible b…" — `ytc_Ugwy4eEXb…`
- "Someone once said it is always the darkest before dawn. Hopefully that will be …" — `rdc_emnoj9s`
- "I wish we could just give a rough sketch and have the AI complete it instead of …" — `ytc_UgzjFdurl…`
- "We are bringing our games into the real world. Soon, robots will be…" (translated from Portuguese) — `ytc_Ugyiuvd0r…`
- "We always project our human emotions onto chat bots. We’ve been doing it since …" — `ytc_UgzgkC7N3…`
- "Ai is good but human and human's knowledge is far greater than this technologies…" — `ytc_UgwT1b2r9…`
Comment
We know there is no stopping AI. Just like there is no stopping mosquitoes, or human hate or sunshine. There are solutions! The solution is to accept that these things exist and identify a path to make the undesirable impacts disappear. Would mosquitoes be anything but an annoyance if they could be prevented from transmitting diseases? Would human hate be a problem if it was blocked from injuring anyone or thing? Is sunshine’s problem if you have shade? Look for counter actions, not trying to put the genie back in the bottle.
youtube · AI Governance · 2023-07-08T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyO13mF0mx1G1LmyY54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxW35pMlkLfTEtfR4Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxVRy0E7puJ4yLvRo54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz3PBjEy7YAxRt49Pd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwgVyF4Gj9s2-qY9ux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx2NN8ZAwgb8uL-gV54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx6IZFc1W6Jr759ikZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzEkc8rH2jmnbLTlBh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx-QsGvAz0c2MhtgSJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwgvZAX3XbPEo2DPd14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
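The raw response is a JSON array with one object per coded comment, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a batch response could be parsed and indexed for ID lookup is below. The allowed values listed are only those observed in this sample response, not the full codebook, and the function name `index_codings` is illustrative, not part of the actual tool.

```python
import json

# Values observed in the sample response above. The real codebook may
# define additional categories; this set is an assumption for validation.
OBSERVED_VALUES = {
    "responsibility": {"ai_itself", "unclear", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"fear", "indifference", "mixed", "approval", "resignation"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID."""
    rows = json.loads(raw_response)
    index = {}
    for row in rows:
        # Reject values outside the observed codebook so malformed
        # model output is caught before it reaches the dashboard.
        for dim, allowed in OBSERVED_VALUES.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        index[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return index

# One entry from the response above, used as a lookup example.
raw = '''[
  {"id": "ytc_UgzEkc8rH2jmnbLTlBh4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]'''
codings = index_codings(raw)
print(codings["ytc_UgzEkc8rH2jmnbLTlBh4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the comment-ID lookup cheap: one parse of the batch response, then constant-time retrieval per comment.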