Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
You shouldn't forget that these are just the really big AIs making headlines. There's a lot of open source stuff that normal everyday people can use. If anything, regulation would probably bolster the left wing bias and make it illegal for normal everyday people to develop anything that can counteract it. Aka doing the same thing 99% of regulation ever does - give more power to the government, and f over the people who voted for it.
AI is developing so crazy fast I wouldn't be surprised if in 5 years we have a ChatGPT alternative that anyone can run on his computer and train however he likes. If anything, the big dogs are likely calling for regulation to prevent that from happening. And we'll be left with nothing but censored leftwing propaganda ChatGPT et al.
Source: youtube · Posted: 2023-04-10T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgxOGHpLUg0X0KTHVxl4AaABAg.9oJ2agl7A5t9oJ5JeELXFd","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgztULRJ0-6soRWlNZZ4AaABAg.9oJ27x2_MIz9oJaRH0RPAF","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgztULRJ0-6soRWlNZZ4AaABAg.9oJ27x2_MIz9oJp3sZ29B1","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugz4z21ANRMm6NkDXgd4AaABAg.9nr0GsFMAFT9oJaFcYv_oF","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugyrig_hFiaIldygeQV4AaABAg.AIzfcuXnrI0AJ8B3uKPlaJ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_Ugz2idSuh4s6s2_LyCN4AaABAg.AIoQ4vAihIdAIp0EnG9a2y","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgwqS0noZsW46nhNXuF4AaABAg.AIlRfg3n8zkAIp-zTk9A8K","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_Ugxl3eYXwwSF5gdBcGV4AaABAg.AIl-tXdZKNBAIlJnCclzjC","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugzk-VAhEelGybLVtQt4AaABAg.AD84WbVQdsrADd3LRF33O7","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgzGxYJz6cN5OcgfA8h4AaABAg.AC31gJwNYueAC32Yx7gbSV","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"disapproval"}
]
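The raw response above is a JSON array in which each record carries a comment `id` plus the four coded dimensions from the table (responsibility, reasoning, policy, emotion). A minimal validation sketch follows; note that the allowed value sets are inferred from this single example and are likely incomplete for the full coding scheme.

```python
import json

# Allowed values per dimension, inferred from the example response above.
# These sets are an assumption and may not cover the full codebook.
ALLOWED = {
    "responsibility": {"government", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "disapproval", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed records.

    A record is kept when it has an "id" and every dimension holds a
    value from the corresponding ALLOWED set.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with one well-formed record (hypothetical id):
raw = '[{"id":"ytr_example","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
print(len(validate_codings(raw)))  # → 1
```

A check like this catches the common failure modes of structured LLM output (missing keys, invented category labels) before the codings are stored.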