Raw LLM Responses
Inspect the exact model output for any coded comment. Look a comment up by its ID, or browse the random samples below.
| Sample (truncated) | Comment ID |
|---|---|
| I’ve been tracking sectors like AI, and clean energy. Companies like NVIDIA (NVD… | ytc_UgwsVgPH3… |
| You need me to put my art into the internet and make ai art bad?… | ytc_Ugxbg7H3n… |
| Personally I think there shouldn't be AI art at all, even if no company makes pr… | ytc_UgyYzLNHp… |
| I don’t remember LLMs actually training on the chat it is currently having with … | ytc_Ugwsb-ejS… |
| Humans don't just destroy we create, and we replenish. Most of the negative view… | ytc_UgwumlAhu… |
| This viewpoint is ridiculous. There are thousands of people developing AI. It's … | ytr_UgwUq38Am… |
| Oh great. One of the first things to build into AI will be pain. That's sarcasm,… | rdc_dy5r4tq |
| That's the thing. It will take you 6 months to do what an AI did in a few hours … | ytr_Ugx4Iy6Gs… |
Comment
How did we get ourselves into this? "Let's create and then build something that is designed to be smarter than we are, and give it the ability to improve itself constantly, and build things that are smarter than they are" Elon Musk has an IQ of about 160, and he has said that it's very probable that in just a few years we will have AI entities that will have IQ measured in the thousands. An AI with an IQ of 5,000 would look at us the way we look at ants, and maybe with the same disdain. We are most likely building our own destruction. A few more years, we won't even be considered competition.
youtube · AI Governance · 2024-06-02T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
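For orientation, the dimensions in this table can be expressed as a small validation schema. The sketch below is a minimal Python illustration that assumes only the value sets visible on this page (the table above and the raw response further down); the project's actual codebook may define labels that do not appear here.

```python
from dataclasses import dataclass

# Value sets observed in this page's samples. Assumption: the full
# codebook may contain additional labels not shown on this page.
RESPONSIBILITY = {"developer", "company", "user", "ai_itself", "unclear"}
REASONING = {"consequentialist", "deontological", "virtue", "unclear"}
POLICY = {"regulate", "liability", "none", "unclear"}
EMOTION = {"fear", "outrage", "resignation", "approval", "mixed"}

@dataclass
class CodedComment:
    """One coded comment, mirroring the Coding Result table above."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def validate(self) -> None:
        # Raise if any dimension holds a value outside the observed sets.
        if self.responsibility not in RESPONSIBILITY:
            raise ValueError(f"bad responsibility: {self.responsibility}")
        if self.reasoning not in REASONING:
            raise ValueError(f"bad reasoning: {self.reasoning}")
        if self.policy not in POLICY:
            raise ValueError(f"bad policy: {self.policy}")
        if self.emotion not in EMOTION:
            raise ValueError(f"bad emotion: {self.emotion}")

# The row from the Coding Result table above passes validation.
row = CodedComment(
    id="ytc_UgyjBI_kDBh9wF1aWZ54AaABAg",
    responsibility="developer",
    reasoning="consequentialist",
    policy="unclear",
    emotion="fear",
)
row.validate()
```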
Raw LLM Response
```json
[
  {"id":"ytc_UgxMyQqwZTj7E74Rbpp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxElwnbpadHs60A_b14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwrzdvSF_bxkibu_Eh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxMLUQjyIfo523o66V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyEYi7IYpHAusZ-nIh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyjBI_kDBh9wF1aWZ54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxD03v0mcD1FeLaD2d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwqpB_m8GxjoiGDjS14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz1S2Sn0LCywfXbsAl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy9bGljAb63uJG4MVl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
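The "look up by comment ID" feature above reduces to parsing this array and indexing it by `id`. Below is a minimal Python sketch using two entries copied from the response above; `raw_response` and `index_by_id` are illustrative names, and a real pipeline might first need to strip any non-JSON text the model wraps around the array.

```python
import json

# raw_response stands in for the model output shown above
# (two entries copied verbatim for brevity).
raw_response = '''[
  {"id":"ytc_UgxMyQqwZTj7E74Rbpp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyjBI_kDBh9wF1aWZ54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index the coded entries by comment ID."""
    entries = json.loads(raw)
    return {entry["id"]: entry for entry in entries}

codes = index_by_id(raw_response)
print(codes["ytc_UgyjBI_kDBh9wF1aWZ54AaABAg"]["emotion"])  # -> fear
```

Building the dict once keeps each subsequent ID lookup constant-time, which is convenient when serving both the ID-lookup box and the random-sample view from the same parsed batch.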