Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "How are they handling AI when they need like 6 guys to change a light bulb?…" (rdc_lv8lnbd)
- "I draw with my hands so I can show start to finish how I don’t use AI.…" (ytc_UgxV3h_jw…)
- "This gentlemen needs to immediately be put in control of the safety of AI so we …" (ytc_Ugzd-_gUB…)
- "Humans; we don't want to work our whole life! Also human: we don't want AI or r…" (ytc_Ugxy7cHxA…)
- "They really just laying people off because they plan on replacing them with over…" (ytc_UgxM9tUrZ…)
- "A case that I never thought was bad but I would love to get opinions on, is usin…" (ytc_Ugzw2Agx1…)
- "Sad to see. But we kind of already know they broke copyright law. How else cou…" (rdc_m21vgn3)
- "AI art is a perfect example of how and why technology is going to rapidly replac…" (ytc_UgwG42NtL…)
Comment
The issue is that once we opened the AI box, there was no way to put it back in the box.
If one country doesn't develop it, another will. So we are stuck in this race to stay ahead of rival companies, risks be damned.
Any laws we put in place only work if everyone follows them, and humanity is not known for unanimous agreement and cooperation.
Hell, not even 50-60 years ago, we were ready to glass the planet over differences of governmental systems.
Platform: youtube
Posted: 2025-11-10T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
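The dimension values seen across this page's raw responses suggest a closed codebook. A minimal validation sketch, assuming the category sets below (inferred only from the responses shown here, not from an official codebook, which may contain additional categories):

```python
# Allowed values per coding dimension, inferred from the raw responses
# on this page; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def invalid_dimensions(coding: dict) -> list[str]:
    """Return the dimension names whose value falls outside ALLOWED."""
    return [dim for dim, allowed in ALLOWED.items()
            if coding.get(dim) not in allowed]

# The coding shown in the table above passes; a made-up value does not.
ok = {"responsibility": "distributed", "reasoning": "consequentialist",
      "policy": "regulate", "emotion": "fear"}
bad = dict(ok, policy="ban_everything")
print(invalid_dimensions(ok))   # []
print(invalid_dimensions(bad))  # ['policy']
```

A check like this is useful for catching responses where the model drifts outside the codebook before they reach the coding table.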
Raw LLM Response
[
{"id":"ytc_UgyfINCepppSmJgQ4MJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzl12FhB9NM3Uy0y9p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgwoBphgNtl_M4qBMux4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxAK5zs5VthtBQ5ITR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyRQ_p8nA1XA9BO2rF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw2Zbvq7RteVjKmrq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzfwBOHXgv86wTPl3x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx8iXhI3DLAcJYXbYN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzEJDpYrwbmcSezPNV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyNVQcUqafyi9wRODV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
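The raw response is a JSON array of per-comment codings, so the "look up by comment ID" step above reduces to parsing the array and indexing it by `id`. A minimal sketch, using two entries copied from the response above (variable names are illustrative):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, with the
# schema shown above (id, responsibility, reasoning, policy, emotion).
raw_response = """
[
  {"id": "ytc_UgyfINCepppSmJgQ4MJ4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyRQ_p8nA1XA9BO2rF4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgyRQ_p8nA1XA9BO2rF4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

If a batch response is malformed (the model sometimes emits prose around the array), `json.loads` raises `json.JSONDecodeError`, which is a natural place to flag the batch for re-coding.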