Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “It's a logical fallacy to think any human would match AI at any kind of computer…” (ytc_UgxaDwVwL…)
- “😮 that moment you realize all the b.s. lately is them setting up for the rights …” (ytc_UgxjZ9gbP…)
- “Bernie really needs to shake up his “billionaires bad” playbook. It’s tired. ALL…” (ytc_UgxgUEw1d…)
- “I still remember when we were talking about AI having soul in the time of lemoin…” (ytc_UgzhtHBZj…)
- “Been using AICarma to monitor AI trends; it’s really helping me adapt my brand s…” (ytc_UgwoA_L6f…)
- “You do know these are sex toys for the wealthy. Otherwise make them look like pl…” (ytc_UgwQ_FB0j…)
- “For OpenAI to call their agents "AGI" is like Microsoft calling VBA macros "AGI"…” (rdc_n40qmnn)
- “It always disappoints me when a person in a position like Sam's, answers "what w…” (ytc_UgzlypR29…)
Comment

> 1) Train the AI models on contrast and you'll get contrast.
> 2) AI might not get it right the first time but perhaps after the millionth time. The old question on how many attempts would it take a monkey to write a Shakespeare play. AI generation speed can allow it to try and try again repeatedly.

| Field | Value |
|---|---|
| Platform | youtube |
| Posted | 2024-05-25T19:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwRLC71MD_IY1Clwat4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwa14muB0_-Y35SmYd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxDSVrksZ1pMPW0Ash4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxXvTKXMyppon8KkO94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxiOJlPDBkqFzhc4Nt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxXQr3l1_FDpZydTo14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxNcyFpioXUCZ4rGqZ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz36toA01nffbM1bBF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxMomLUdizfBdBwiHl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw3Gy-QJL4sr7xUHAN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
```
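A raw response in this shape can be parsed and sanity-checked before its records are stored as coding results. The sketch below is illustrative: the dimension vocabularies are assembled only from values that actually appear on this page (e.g. `none`, `consequentialist`, `ai_itself`), so the real coding scheme may allow more; the `parse_codings` helper is a hypothetical name, not part of any tool shown here.

```python
import json

# Dimension vocabularies observed in the response above; the real
# codebook may define additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none"},
    "emotion": {"approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    reject any record with a missing id or an out-of-vocabulary value."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# One record from the response above, used as a self-contained sample.
raw = ('[{"id":"ytc_UgwRLC71MD_IY1Clwat4AaABAg","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
codings = parse_codings(raw)
```

Validating at ingestion time keeps a single malformed model response from silently contaminating the coded dataset.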