Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Im from Pleiades Heaven when I see this it scares me because we had this on our … (`ytc_Ugz0DkEDn…`)
- uninspired cash grab is right. I was into cryptocurrency with bitcoin back in 20… (`ytc_UgwIrSi79…`)
- ' No, I tend to disagree. I have tested my AI software. The AI , really sorry, h… (`ytc_UgyEJVBqr…`)
- They want to make sure they can use it for manipulation purposes and that AI doe… (`ytc_UgxouBx-u…`)
- I dont have an issue with tracing as long as its not someone elses orginal art. … (`ytc_UgzTNWSqu…`)
- All the YouTube videos about AGI or the impact of AI on society can be very inte… (`ytc_UgyA-Q7k9…`)
- This is the definition of entitlement. Good thing media last as long as it's sav… (`ytc_Ugwmi4IN6…`)
- lol AI can’t code. All it can do is copy and paste code that already been writte… (`ytc_UgzqPZINf…`)
Comment
By using Ai for writing code, it is creating huge security vulnerabilities and making it easier for hackers to compromise any system, which in turn might take months if not years to patch up, by then valuable information is stolen. The second problem is that it takes a whole team of people to discover the vulnerability in the code, patch it up after months or years have passed because of the amount of errors the Ai is making, on top of that if the code doesn't work properly it ends up breaking your software program which cost the business using it millions of dollars in lost revenue because you are eventually forced to shutdown to correct the code and perform what is called system maintenance. In all that down time, company productivity goes down the toilet. You also need to train employees to understand the code and develop the skills needed to not make the same mistakes as the Ai. Whomever thought this would work out well is a moron of the highest caliber.
youtube · AI Jobs · 2026-02-08T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz3VJ3ZILK8LYzQDO54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxdY4RQ7h_w8GQLcht4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxAmaHjzBnUQL0nR5l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx0qzkjZFWTNu6mWvF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwG5idU0CUdNZJVfvN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzJJc95CQWhp2c3byx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxJD-AjoST4KU8hKbx4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyVywl4WHdV9AQ8DNd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxwW08eM3pEi_impS54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx0UgNv0R2WoMir4XZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
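The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table above. Before storing such a response, it is worth validating it against the expected vocabulary. A minimal sketch, assuming the allowed values are exactly those seen on this page (the real schema may permit more):

```python
import json

# Allowed values per dimension, inferred only from the responses shown
# on this page -- an assumption, not the authoritative schema.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "user"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "mixed", "approval", "outrage", "fear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown values."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page all carry the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in schema")
    return records

# Example: one record taken from the response above.
raw = ('[{"id":"ytc_UgzJJc95CQWhp2c3byx4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = validate_codes(raw)
```

Rejecting whole batches on a single bad value is deliberate here: a malformed record usually means the model drifted from the prompt format, so the safer default is to re-request rather than store partial codes.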