Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "What have the movies been illuminating about artificial intelligence? It is iden…" (ytc_Ugx-u9CPc…)
- "Honestly! This is the DUMBEST idea EVER. Ok. So at 1st it will work out and be a…" (ytc_Ugz0NKiAa…)
- "Ngl ALL AI art is stolen not a single thing created by AI is original even is vi…" (ytc_UgyinKKzy…)
- "George Orwell's "1984" & "Animal Farm", et al + Facial Recognition is now Global…" (ytr_Ugw3idzM8…)
- "Shallow people like shallow things. AI stans wont go away but i think this makes…" (ytc_UgxTEEEg1…)
- "The only way to atop A.I. now from taking over humanity is for humanity to elimi…" (ytc_Ugz4UhnaS…)
- "Ohh robot baseball seems like an awesome competition for humanoid robotics, If w…" (ytc_UgzlW9n9R…)
- "Step 1: Buy a big screw / Step 2: Put it where the robot will have to step on it / S…" (ytc_Ugy69Bt92…)
Comment
It is quite possible that today's AI is very brittle, and when it gets to inputs that vary from the testing data enough they behave in unpredictable ways, and possibly break. That it is essentially autocomplete on steroids, that cannot perform chained tasks with enough accuracy to replace most office workers yet. As computer usage gets cheaper and models get better, and computers advance, and as models improve AI could become a real threat quite rapidly. Today's AI is not quite there yet, so either A. AI in its current forms are a dead end, or B. we are a few breakthroughs away from AGI and ASI, and we have time now to control the course. Of course if we start WWIII over Taiwan or Ukraine, or some other thing, EMPs will knock out robots, chip fabs, and servers, and automation, and none of it will matter anyway.
youtube · AI Harm Incident · 2025-07-27T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzs3aKCBbofxu9Acbh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzFtpLDIt_YSrsXh4h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxqLzaH3eG7AG3ejaF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy3NJeyByYm58n8O3p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw1zKNZzcDNr_AyNT94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxNcV2SMpSmBWypA5d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_Ugw09X2g8mWdIXXA-Qp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBixCMfF7C_Lad8UR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw0CpEOAdczNcmqmPV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx2vf9Wb3HRp4vZnWp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
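A raw response like the one above can be turned into per-comment codes with a small parser. The sketch below is a minimal, hypothetical example: the allowed values for each dimension are inferred only from the labels visible on this page (the real coding scheme may include others), and `parse_codes` is an illustrative helper name, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from the examples on this
# page; the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "virtue", "mixed"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"outrage", "fear", "disapproval", "indifference"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into {comment_id: {dimension: value}}, validating each value."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        codes = {dim: row[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded

# Hypothetical one-element response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"outrage"}]')
print(parse_codes(raw)["ytc_x"]["emotion"])  # outrage
```

Validating against a closed value set is what makes look-up by comment ID reliable: a malformed or hallucinated label fails loudly at parse time rather than silently polluting the coded dataset.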