Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It is going to be very interesting and dangerous if we reach a point where AI replaces most human workers. If people lose their jobs, they won't have money to buy products or services. What happens to the economy then? Who is going to buy the food, clothes, phones, or even pay for TV and internet? And let's not assume universal basic income (UBI) will fix it; if it only covers basic survival, it can easily become a tool of control rather than freedom. If we don't find a real balance between automation and human purpose, we won't just face unemployment; we could face hunger, widespread poverty, even civil riots and widespread wars. Look, we are already starting meaningless wars based on assumptions and no evidence. What is going to happen if people don't have money and food?
youtube · AI Harm Incident · 2025-06-19T09:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugy05clI0gQiaMtARZJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzCWUiD54zhBytfWih4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyAKEBG9v_v4BrphkB4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwuHdKE3SJzKNuWuS14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzblSATp53kryPPkzh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzNPT7kKXSHMlLQGzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwPqUfBCiFUujMVH4l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxhrLbIdJHa6ecPArR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzSnnDx1iRR4dHFjGJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyVqos0Bg6K84ZyhsJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"})
```
Note that the response is shown verbatim: the array opens with `[` but closes with a stray `)`, so it is not valid JSON as returned.
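The coding table above reports "unclear" on every dimension even though the raw batch contains valid codes for ten comments. One plausible explanation is that this comment's ID simply does not appear in the returned array, so the pipeline falls back to "unclear". The sketch below shows one tolerant way to parse such a response: the dimension names are taken from the JSON above, but the bracket-repair step and the missing-ID fallback are assumptions for illustration, not the pipeline's actual code.

```python
import json

# The four coding dimensions, as they appear in the raw response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_codes(raw: str) -> dict:
    """Parse a raw batch response into {comment_id: {dimension: value}}.

    Models occasionally close the array with ")" instead of "]" (as in the
    response shown above); repair that before handing the text to json.loads.
    """
    text = raw.strip()
    if text.endswith(")"):
        text = text[:-1] + "]"
    records = json.loads(text)
    return {
        rec["id"]: {d: rec.get(d, "unclear") for d in DIMENSIONS}
        for rec in records
    }


def lookup(codes: dict, comment_id: str) -> dict:
    # Hypothetical fallback: an ID absent from the batch is coded "unclear"
    # on every dimension, which would explain the table above.
    return codes.get(comment_id, {d: "unclear" for d in DIMENSIONS})
```

With this fallback, `lookup` returns a fully "unclear" row for any comment whose ID the model dropped from its batch, while still surfacing the valid codes for the IDs it did return.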