Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Even without the existence of AI, humans will create our own demise by 2050 via …" (ytc_Ugwzc7_jt…)
- "Remember that time an ai was asked to make a picture of a white dude robbing a s…" (ytc_Ugw__5B11…)
- "I have concerns about AI not being regulated. What if AI hijacks the banking sys…" (ytc_Ugy8r0R4i…)
- "Data centres & AI definitely has grave impact on environment n human health ...w…" (ytc_Ugy_GV0Nx…)
- "It is referring to that see the first 5 minutes when he shows the robot face it …" (ytr_Ugw6UlcPW…)
- "I do not agree. One robot that can do many things is better then making one robo…" (ytc_UgwweOFvw…)
- "The argument is not that superintelligence is possible via LLMs, the argument is…" (ytr_UgyGzV4p_…)
- "Openai is not my mom. I'd rather have the *option* of being unsafe and smarter. …" (rdc_jg9zdw2)
Comment
He is making the assumption that people will stop using their brains just because you have a new tool that supposedly does the thinking for you. It's a flawed assumption because using AI is the equivalent of hiring someone to help you out with cognitive tasks. The brain doesn't atrophy when you hire help. It just moves on to higher-level work on top of what people and tools do.
youtube
2025-01-10T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxjwGkXJCHojx6l6Px4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzG1TkRaYd3NxPKUNd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzY3eEPm4ESzxLiopR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyqy4Ni6-2mSUrYcTp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwpJ9jQPc5j07ymnJ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyf7iwgCY0YCwpt0ux4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwu10uUY21QtOUUpm54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx_FrUCcNZfTuyWDBd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgztYiXIGq6JTUzx2A14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgygV8AkId8WEIWi_oB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
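Since the raw response is a plain JSON array of per-comment codings, looking up a coding by comment ID reduces to parsing the array and indexing it on `id`. A minimal sketch of that step (the `parse_codings` helper and the constants here are illustrative, not part of the tool; the field names match the schema shown above):

```python
import json

# A small excerpt of a raw model response, in the format shown above:
# a JSON array of objects, one per coded comment.
RAW_RESPONSE = """[
  {"id": "ytc_UgxjwGkXJCHojx6l6Px4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx_FrUCcNZfTuyWDBd4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]"""

# The four coding dimensions plus the comment ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse a raw response and index codings by comment ID,
    skipping any entry that lacks one of the expected keys."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows if EXPECTED_KEYS <= row.keys()}

codings = parse_codings(RAW_RESPONSE)
print(codings["ytc_Ugx_FrUCcNZfTuyWDBd4AaABAg"]["emotion"])  # → mixed
```

Indexing by ID rather than scanning the list each time matches the "Look up by comment ID" workflow above, and the key check guards against malformed entries in a model response.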