Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I have almost 1000 hours in testing , ChatGPT lin its emotional intelligence in…" (ytc_Ugy9djmB_…)
- "Stop using ai / Look it up / But it has something to do with your drinking wate…" (ytc_UgxvRKmZJ…)
- "Why did the man get in the middle of robot work? Of course, the robot took him a…" (ytc_UgyMX6sow…)
- "I'm not very artistically inclined. I don't have a natural eye for shapes or com…" (ytc_Ugz_8B7Pq…)
- "I'm not suggesting your experience isn't 100% accurate; but I will say that, for…" (rdc_oi1jjem)
- "Make robot 🤖 kill human, think you so smart, but every thing will lost control 😊…" (ytc_Ugzu8FpW7…)
- "No AI. It's an idiotic earth killer and job ender. The billionaires who are pu…" (ytc_Ugw7KYgn5…)
- "So basically all the creative jobs which require mixing or small elements will b…" (ytc_UgwaenWCQ…)
Comment
This guy is both a corporate shill and didn't say anything notable. AI companies are now asking for subsidies. AI companies are also not paying to train on "our data". This is the worst of techno capitalism. AI can only solve mathematical and well-defined scientific problems. It can translate languages but has trouble with slang and metaphors. AI can't and will never be able to analyse and evaluate with abstract reasoning. AI companies are trying to "fake" abstract reasoning through pattern recognition from insanely huge datasets. This is why a query is expensive because it is run through this pattern recognition engine. If it was an abstract reasoning engine, like a human brain, then the cost per query would be negligible.
youtube | AI Jobs | 2025-11-18T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwyQ5aPVhfasN3xZ7d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxGNvZILsgqi1MnPGl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxfO3IiIv9Cyz5snc54AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzZYlC2qbHOa3jpgol4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx1afVWebG6BHO3dmh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyj5-vpuY-qe26D7fd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy7F0y58YTfaACFfNt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyD6DZxi-M_rx0xAyJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgziLd3iGEOE4zGG5mx4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwHgeB0taqE58qvbwt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
```