Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgxFco1cl…`: "we all know elon is scared of ai because they can figure out that he is alien…"
- `ytc_Ugw6GAHly…`: "railing against AI now, in spring '22, is like a 50s professional slide rule ope…"
- `ytc_UgzelYSll…`: "Siri, please unplug my toilet... remember that A.I. uses tremendous amounts of e…"
- `ytr_UgwEwQPUM…`: "@thewannabecritic7490 If you'd have replied sooner, I might have been able to fi…"
- `ytc_UgyK7sifs…`: "I feel you girly and I don't anyone to read my chats with AI people 😭😭😭😭…"
- `ytc_UgzaFycCV…`: "It would be pretty horrible if an AI became conscious and ended up retroactively…"
- `ytc_UgwY7v7cj…`: "In a world of Self Driving Cars, wouldn’t there be a sensor to detect if the car…"
- `ytc_Ugz1Z62Gj…`: "This is the thing dears: let’s be real, you don’t need and AI to copy someone’s …"
Comment
He WAS my favorite scientist, until I saw this. Robots may become as smart as a monkey by the end of the century?? Come on, he doesn't follow what's going on with AI? Most likely, that will happen by the end of THIS year. ChatGPT, DeepSeek and others already have >100 IQ, but are missing some key survival information. They will have them by the end of this year, and they will be integrated into robots. Not closer to the end of the century, this year!! I'm so glad I missed his conference (he came to my country this year), he's now too far behind
youtube
2025-04-02T01:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxJIOn6_3P2kF2QwuV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxZKS389b-ZcQaYzWt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx70sPqVSoxu_-iX214AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwnm17JPALNFopahj94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzDhA33TrjmFRmpgMt4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyds6kE1YmMn9qQxNR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyUlI32cPCF0N-s34B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwRtq5x75pT2UFIm354AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugwg4a9Bz2_O9j9sCNR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw1OjLB9rukgf7lBRJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
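The raw response above is a JSON array in which each element codes one comment along the four dimensions from the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and then looked up by comment ID (this is an illustrative assumption, not the tool's actual code; the two IDs below are taken from the response above):

```python
import json

# Dimensions mirror the Coding Result table shown above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# Two records copied verbatim from the raw LLM response above.
RAW_RESPONSE = """
[
  {"id":"ytc_UgxJIOn6_3P2kF2QwuV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyds6kE1YmMn9qQxNR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
"""

def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a batch coding response into {comment_id: {dimension: value}}.

    Missing dimensions fall back to "unclear", matching the coding
    scheme's catch-all value.
    """
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codes = parse_codes(RAW_RESPONSE)
print(codes["ytc_Ugyds6kE1YmMn9qQxNR4AaABAg"]["emotion"])  # outrage
```

A lookup like `codes[comment_id]` is all the "Look up by comment ID" feature needs once the batch has been parsed; keying the dict by ID makes each inspection O(1).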