Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Thank god i thought some fanboys were making eye bleaching ninjago dragons risin…
ytc_Ugzv5iDBc…
Can AI develop a mentality, beyond its initial program parameters, so that it be…
ytc_UgzrS-iMy…
So, basically, you have nothing. But need to make up BS in order to not let the …
ytc_UgymtGYNW…
The cost of AI is either going to 10x so companies like OpenAI can turn a profit…
ytc_UgxuEW_HD…
I wish I was a robot. I've seen them on star wars and dr who and they shoot lase…
ytc_UgxE3zRhN…
when the term clanker first popped up i genuinely thought it was hilarious and e…
ytc_Ugzy-mWip…
>Down to do whatever it takes, including direct client interactions. "sorry,…
rdc_mjlhsdh
Why do you make it sound like kurrent AI could think? with the way we currently …
ytc_Ugxh1xi56…
Comment
I'm in the Kruger Park (South Africa) now (about to go on a game drive), elephants are plentiful. Seen a few large herds the last couple of days.
reddit
Cross-Cultural
1560656451.0 (Unix timestamp)
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_er9ydsi","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_eraz0qh","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
  {"id":"rdc_erahupo","responsibility":"none","reasoning":"none","policy":"none","emotion":"fear"},
  {"id":"rdc_eraohdh","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
  {"id":"rdc_eraz0sx","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
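
The raw response above is a JSON array with one coding object per comment ID, which makes the "look up by comment ID" view straightforward to reproduce. A minimal sketch, assuming the batch response is valid JSON and that `codings` is a hypothetical helper name (not part of the original tool):

```python
import json

# Raw LLM response as shown in the panel above: one object per comment,
# keyed by its comment ID and carrying the four coding dimensions.
raw_response = """
[
 {"id":"rdc_er9ydsi","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"rdc_eraz0qh","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
 {"id":"rdc_erahupo","responsibility":"none","reasoning":"none","policy":"none","emotion":"fear"},
 {"id":"rdc_eraohdh","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
 {"id":"rdc_eraz0sx","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
"""

# Index the parsed codings by comment ID so any single comment's
# coding can be retrieved directly, as the lookup view does.
codings = {item["id"]: item for item in json.loads(raw_response)}

print(codings["rdc_erahupo"]["emotion"])  # -> fear
```

This assumes well-formed model output; in practice a `json.JSONDecodeError` handler would be needed for malformed responses.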