Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "This is just the beginning of AI Get ready So many tools to be built So much con…" (`ytc_Ugz47h6je…`)
- "I've said it before and I'll say it again. Generative AI has been taking jobs …" (`ytc_Ugx63kA0z…`)
- "Don't be a pessimist technophobe!! The fact that you are watching this on your p…" (`ytr_Ugxw8cxjx…`)
- "I grew up in the south and I was tought that man was better then AI-robots. Past…" (`ytc_Ugjm-y_Vl…`)
- "AI is limited because AI needs energy to survive. Ai will need to learn how to s…" (`ytc_Ugz0N18h5…`)
- "they are replacing devs with AI, yes, but actually big companies are reducing fo…" (`ytc_UgyRR3xUd…`)
- "I think that as long as the AI quotes its sources, it's no different than a huma…" (`ytc_Ugw1g_kQ7…`)
- "We don't witch craft a.i that can make whore picture tarot having one use it…" (`ytr_Ugxzw65L1…`)
Comment
Humans made the robots, therefore there's always room for error. What if the robot malfunctioned and shot the man and other robots? Man is so eager to create, but forgetting the major fact; ANYTHING man creates has room for error. A robot can malfunction like anything else. It amazes me the total faith they have in robots. When it comes to things like this, ultimately a price is paid for denial of the truth and pride.
Platform: youtube · AI Harm Incident · 2024-03-29T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_Ugx2aZvq5LiTfAFt2B14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwFYsNY4YrPEYorOB54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwf1L7-bA8mpPFBCDh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwBPpaX6SXPG-Od-Ad4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz7Q4xyi_KIVmLBLpd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxwl_Z6WFRkU-iwXg54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgylT9uY3RvgKGsmxvJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwKO4lhb0lcd3v5yaF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwow-2hXEo3HIUFbY14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz-P1ZoSzptTw4f2J94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"})
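A minimal sketch of how a response like the one above could be parsed and validated before it lands in the coding table. This is an illustration, not the tool's actual implementation: the function name and the example `id` are hypothetical, and the allowed label sets are inferred only from the values visible in this run (the real codebook may define more categories). Note the response above terminates the JSON array with `)` instead of `]`, so the sketch tolerates that malformation and coerces any unexpected label to `unclear`.

```python
import json

# Allowed labels per coding dimension, inferred from the values visible
# in this run's output; the actual codebook may differ (assumption).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "liability", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a JSON-array coding response, tolerating a stray ')' close."""
    cleaned = raw.strip()
    # Models occasionally terminate the array with ')' instead of ']'.
    if cleaned.startswith("[") and cleaned.endswith(")"):
        cleaned = cleaned[:-1] + "]"
    records = json.loads(cleaned)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            # Coerce unexpected labels to "unclear" rather than drop the row.
            if rec.get(dim) not in allowed:
                rec[dim] = "unclear"
    return records

# Hypothetical id, for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"})')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # outrage
```

Coercing rather than dropping keeps every comment in the table, which matches the "unclear"-heavy coding result shown above.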