Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "The human simulations predictive algorithm ordained the conversation to be as su…" (ytr_Ugzpp8pxU…)
- "Am going to kill every robot 🤖 I com in contact with in the next six years…" (ytc_UgzMPx4VY…)
- "did i miss this? are we talking about remote piloted drones? when did we give LL…" (ytr_UgzLLCL_V…)
- "One reason AI may replace humans in many jobs is because, beyond efficiency or p…" (ytc_UgwWEjeMl…)
- "To answer the question of how we stop the AI companies for 50 years, requiring g…" (ytc_UgzboDfMa…)
- "All that is clear as day, but there are good sides: Education, language and brai…" (ytc_UgwuoHQlF…)
- "ai doesn't quote references like normal are obliged to do. if ai makes money it …" (ytc_UgzlVa36P…)
- "If AI learn based on what we do, but also replace what we do, how are we suppose…" (ytc_UgxEVUOJz…)
Comment
You should watch the channel of Robert Miles (AI safety expert, academic) if you don't believe this is terribly dangerous. Even very simple AI learns to fool people and come up with emergent properties that we did not anticipate. And we don't have the slightest idea of how to prevent this. Every suggestion was proven to be outsmarted by AI much simpler than GPT. I'm a software engineer and did not expect GPT to figure out and deeply understand humor, but it did, without any training whatsoever. It just came out of nowhere. All of a sudden it understands joke. Sure, it could be just statistical, but isn't our brain just a statistical pattern matching engine?
youtube · AI Governance · 2023-03-30T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyP21OJITUf28m8SmZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz8kK7S8vizixYEwM94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy24IcvT33mNiOfLNV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgwDI7o7djFhXuMglF94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwO3I9Q8AB-G5g-tB54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw6YNL1_pxZRx6CBrF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwPY6X4K9eg_UUewHF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxiS9C6X4uQlX-LFQF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugz5J05nhkHszglrJdp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy9xdoC6-FHBKpqyWd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
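A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal illustration, not the project's actual pipeline: the field names come from the JSON itself, while the allowed-value sets are only the values observed in the responses shown here (the full codebook may define more).

```python
import json

# Vocabularies observed in the raw responses above; the real codebook
# may include additional values (assumption for this sketch).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "distributed", "developer"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"fear", "indifference", "unclear", "approval", "mixed", "resignation"},
}

def parse_raw_response(raw):
    """Parse one raw LLM response; keep only well-formed records.

    A record is kept if it is a dict with an "id" plus all four
    coding dimensions drawn from the observed vocabularies.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return []  # model emitted non-JSON text; drop the batch
    valid = []
    for rec in records if isinstance(records, list) else []:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vocab for dim, vocab in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical one-record response for illustration:
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
print(len(parse_raw_response(raw)))  # prints 1
```

Records with out-of-vocabulary values are dropped rather than repaired, so a validation failure surfaces as a missing coded comment instead of a silently wrong code.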