Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "I don't care about AI art at all, I have no negative or positive feelings toward…" (ytc_UgwVeHV1-…)
- "AI maybe not yet…but an indian can do your job for cheaper since it’s a tool ( A…" (ytc_Ugxud4Lqc…)
- "I decide to be brave, and i legit asked chatgpt that if it will take over the wo…" (ytc_UgyTNjL9N…)
- "Just because Penrose says AI will never be sentient does not mean you have to ta…" (ytc_UgwZe2dt4…)
- "As an artist, I definitely find digital art easier to make, because: 1. I can't …" (ytc_Ugz4JUy4t…)
- "I my opinion AI art is art but it's not high quality art. Now, allow me to expla…" (ytc_Ugwko59j6…)
- "I’m most likely going to get hate for saying this, but that’s fine. I will sta…" (ytc_UgyvXUVbu…)
- "It is a fact that, after 60 million autonomous miles, Waymos are proven to be mu…" (ytc_UgxDpU1Dq…)
Comment
AI itself is not dangerous, but poor design, security flaws, and unethical use can make it a threat.
Instead of fearing AI, we should focus on responsible development, strict regulations, and human oversight to ensure safety.
Governments and tech companies must prioritize AI ethics and implement safeguards before deploying AI-powered systems, especially in sensitive areas like defense and public spaces.
youtube · AI Harm Incident · 2025-02-27T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugxa0g1GPWsns1Jqj5d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzgtDN9RHHus4Cgall4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzOR35L1nuJ5q4aOBJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxRSL30K1iwypRGZsR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy5su-_DY5ghD72xvd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxyIdYcGOgdbw8K79x4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxDvTTAoi80JAMrIEp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzQ1FyTxpsIHYTYkzh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz2d-u_LFW594SQhR54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw7bgbYZKLAnfC_ysd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```
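The "look up by comment ID" step above can be sketched in a few lines: the model returns a JSON array with one object per comment, keyed by the four coding dimensions shown in the results table (`responsibility`, `reasoning`, `policy`, `emotion`). The helper name `lookup_codes` is hypothetical, not part of any tool shown here; the sample response is truncated to two records from the array above.

```python
import json

# Truncated sample of the raw batch response shown above:
# one JSON object per coded comment.
raw_response = """
[{"id": "ytc_Ugxa0g1GPWsns1Jqj5d4AaABAg", "responsibility": "ai_itself",
  "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
 {"id": "ytc_Ugw7bgbYZKLAnfC_ysd4AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}]
"""

def lookup_codes(response_text: str, comment_id: str):
    """Parse the model's JSON array and return the codes for one comment ID,
    or None if that ID is not in the batch."""
    records = json.loads(response_text)
    return next((r for r in records if r["id"] == comment_id), None)

codes = lookup_codes(raw_response, "ytc_Ugw7bgbYZKLAnfC_ysd4AaABAg")
print(codes["responsibility"], codes["emotion"])  # developer fear
```

The same lookup, applied to the full response, reproduces the "Coding Result" table for the example comment (developer / consequentialist / regulate / fear).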