Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "The solution to the security risk is very simple. All of our technology and infr…" (ytc_Ugy16-uFp…)
- "AI will inevitably find a way to copy itself multiple times to ensure its surv…" (ytc_UgwA_ncTX…)
- "I think that is a wonderful statement about all the AI is taking over Programmer…" (ytc_Ugx1igR_e…)
- "My favourite thing is that AI art generators are starting to scrape AI art, so t…" (ytc_UgzoJrN6y…)
- "This question comes up a lot on the Very Bad Wizards podcast and I think they ha…" (rdc_cxntmbt)
- "Right now it’s not good enough. But just think about 3 years later with similar …" (ytc_Ugy0bB56B…)
- "I feel like the people fighting against this are in themselves just AI but in hu…" (ytc_Ugz_gZ-zr…)
- "Thank you!!!! I think AI art is fun to play around with but it's mostly slop whe…" (ytc_UgyunNxNl…)
Comment
Way before 2050 we are on the verge of magnificent feats. And when human life forms are trapped on a planet in the middle of outer space 🌌🌌🌌✨ youll have to realize the imment dangers. No dont be freetimg crying or lying to yourself or feeding negatives when they are not called for. If you take a gander at the last 100 years we advance 2000% over. So the need for limits of human death when we have a war at hand is a must. Who else is gonna reproduce humans? A robot? The crazy good thing is the robots made from the same materials as the car is. Bullet 🚅 proof.
youtube · AI Harm Incident · 2023-12-11T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxHWPNK3Cp6aoSqd9h4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwgCujSDH8mPU0KOid4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzs0o9gZbYcUGoEDi94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwmCpdHQSOVVNECvjZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwdSpIIv9yvJ1LILWp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyKYq7fC2NMVWKpDMN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyVROm3XXwej5A9hsx4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzUF7apjEyGqa5Pk3V4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyRQDaOQRHv5G-1f8F4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgypkTJg-RPnNYDXQGt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
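For reference, a minimal sketch of how a raw response like the one above can be parsed into per-comment codes. This is an illustration only: it assumes Python with the standard-library `json` module, and the allowed label sets below are inferred from the values visible in this dump, not from a documented codebook.

```python
import json

# Allowed labels per dimension, inferred from values seen in the raw
# response above (an assumption, not an authoritative schema).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"fear", "approval", "mixed", "unclear"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments),
    keeping only records whose every dimension carries a known label."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items())
    ]
```

A record with an out-of-vocabulary label (e.g. a hallucinated emotion) is silently dropped here; a real pipeline would more likely log or re-queue it for manual coding.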