Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It’s always better to take inspiration from work that isn’t AI, or you can make …" (ytr_UgzSVyBrd…)
- "We need to develop space travel before AI. If Earth is lost, those in space may …" (ytc_Ugzd3z-34…)
- "This guy is clearly naive beyond belief about the implications of a singularity …" (ytc_UgzgnQKiO…)
- "I use Anthropic's "Claude-instant" and Claude-instant-100k and that bot literall…" (ytc_UgzxxXT3P…)
- "Right now all the LLMs do is synthesizer information. If that's all your degree …" (ytc_Ugwf9e8Ri…)
- "I'm noticing it's the younger rich elite that are participating in this for the …" (rdc_esqbm2o)
- "Eliezer had a great example several years ago (when GPT-2 was new) of where bein…" (ytc_Ugz8jelfA…)
- "@nobodycares607 Lol just saying a company will like a highly skilled and experie…" (ytr_Ugzrfpa4U…)
Comment
I think we humans are being too dramatic with AI. Throughout humanity, the driving force behind many ground breaking inventions was cost effective efficiency. If it saves time, energy, money, and resources, of course humans would do it. Although a lot of current jobs will go away, new ones will come out. When cars first came out and started to grow in popularity, people who worked with horses must've been terrified as their jobs were being threatened, but in the end, this new invention led to countless new opportunities.
There is no need to spend time and money on something that can be done fast and for free. Unless you want to be left behind unable to keep up, I suggest you start to adapt.
Source: youtube
Posted: 2025-01-23T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyGL_hkQYzE4DZ21SJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwN2VEc1TFiojgDh2R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzsYXbb1XLkeiH2v8F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyZP2f8v0_pSlfPSVZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz2bTk9MaTWuxix3ud4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwxmmtALwYQ92Ch8jV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwSet7r0T9Fcn9EntZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugywikv5t8TpcvS4UN94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyoxvhE27C0fVvVt5l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyDh-hbJmVxX8_ZePx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
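A raw response like the one above can be parsed, checked against the coding dimensions, and indexed for the "look up by comment ID" view. Here is a minimal sketch in Python; the allowed values per dimension are inferred from the coded examples on this page (the actual codebook may define more), and `parse_response` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the examples above
# (assumption: the real codebook may include additional values).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "company", "government"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"resignation", "indifference", "outrage", "approval"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response, validate each row, and index by comment ID."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim!r} value {row.get(dim)!r}")
        coded[row["id"]] = row
    return coded

# Usage: look up one coding by its comment ID.
raw = """[
  {"id": "ytc_Ugz2bTk9MaTWuxix3ud4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""
coded = parse_response(raw)
print(coded["ytc_Ugz2bTk9MaTWuxix3ud4AaABAg"]["emotion"])  # approval
```

Validating against a closed value set at parse time catches the most common failure mode with LLM coders: an off-schema label (e.g. "anger" instead of "outrage") slipping silently into the dataset.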