Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- If driverless trucks aren't mind blowing enough for you, wait till pilotless pla… (ytc_UgjxNK4Ei…)
- The amount of land and energy used to make this work, probably wouldn't leave en… (ytc_UgwbR0htY…)
- Great, let's add some more anxiety to our plates! If AI doesn't kill us in 100 y… (ytc_Ugx7y_he5…)
- @arreshubham do you benchmark human intelligence too lol !!! AI is about the pas… (ytr_Ugx0Hqpg4…)
- a lot is just part of the marketing hype. these models are still not "intelligen… (ytc_UgxhxjhDz…)
- I hope more lawsuits will go against openAI next. This company has been stealing… (ytc_UgzdvUSZm…)
- AI art fanatics are the new cryptobros. Extremely hyped and out of touch, but wh… (ytc_Ugwrg_N2Z…)
- +Chad Cansler But we 'think' we're alive too. We could be simulations ourselves,… (ytr_UgiCjS6CN…)
Comment
Something I've been thinking about is why there's no easily accessible single page or short piece of media where you can find the main arguments, written in a very concise and compelling way, for why the reader should be very concerned about AI safety, especially given the current pace of things. I'm thinking of something like the AI 27 website but without the storytelling or very technical bits, just a few pieces of logic and observation almost everyone can understand, possibly with the choice to look deeper into any specific part.
I could make it myself but I don't have the network to properly vet it or the followers or reputation to make it spread.
youtube · AI Governance · 2025-08-23T08:2… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwim9XQC9rU_cnMzhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyO6Ytj4-Ipljm9bO54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzBUp9cqxp-Q-SKku14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz2EiMn64SuvupH3-V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy_vIeMWyPOX-y2BsR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyOOA_hESTcxGbeUjt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy1T_34YaiGCD0NUaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz5YYrTA7lZg1omUoZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzbvp-J4ZvzkrKuSpl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxRl34qJ6wXvrpB-Ax4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
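A raw response like the one above is a JSON array of per-comment codes along four dimensions. A minimal sketch of how such a response could be parsed and sanity-checked before ingestion; the allowed label sets below are inferred from the values visible on this page and are an assumption, since the full codebook is not shown here:

```python
import json

# Allowed labels per dimension, inferred from the examples on this page.
# ASSUMPTION: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "approval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical one-record response, for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"mixed"}]')
print(len(validate_codes(raw)))  # → 1
```

Validating against a closed vocabulary like this catches the common failure where the model improvises a label outside the codebook, so a bad record fails loudly instead of silently skewing the coded counts.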