Raw LLM Responses
Inspect the exact model output for any coded comment; entries can be looked up by comment ID.
Random samples:

- ytc_UgxCiwbiq…: "AI is already out there, in the wild. It's already just a matter of time. Govern…"
- ytc_UgwpfeMy6…: "well, the problem I see is that even if we are going to need one person to write…"
- ytc_UgzK8NwOo…: "you need more subs. I went through a slightly depressed period a few months back…"
- ytc_UgwSeqVVo…: "all IT systems fail at some point in time some are easy to recover and some are …"
- ytc_UgwuPQi3K…: "We should ask super AI itself how we need to regulate it so that every human is …"
- ytc_UgxzduZXw…: "This is crazy! Makes me think about how advanced AI is becoming. I've been using…"
- ytr_UgxewK9_3…: "@billiesmith1675 Yes it did. Look at Rosie (the sassy robot maid). We also have …"
- ytc_Ugzi60ERX…: "On this planet, we have over 12,500 nuclear warheads, more than 500 nuclear powe…"
Comment
On the contrary, AI, itself, is mediocre at best and it’s the data center infrastructures that have gotten out of hand.
Medium language models that run on a PC are the future and not these ridiculous data centers.
Our economy runs on humans working and paying taxes and bills for services. When that breaks down, society breaks down.
The hype bubble is about to pop…
Platform: youtube | Topic: AI Governance | 2025-12-30T13:0… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
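Each coded record carries the four dimensions shown in the table. A minimal sketch of how one could sanity-check a record against the dimension vocabulary follows; the allowed values here are only those that appear in the responses on this page, not necessarily the full codebook, so `VOCAB` is an assumption.

```python
# Allowed values per dimension. NOTE: inferred from the responses shown on
# this page only; the real codebook may define additional values.
VOCAB = {
    "responsibility": {"company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval",
                "indifference", "unclear"},
}

def validate(record: dict) -> list:
    """Return (dimension, value) pairs whose value is outside VOCAB."""
    return [(dim, record.get(dim)) for dim in VOCAB
            if record.get(dim) not in VOCAB[dim]]

# The record behind the table above:
rec = {"id": "ytc_UgzjITuIl3XnKAqbCzN4AaABAg",
       "responsibility": "company", "reasoning": "consequentialist",
       "policy": "regulate", "emotion": "resignation"}
print(validate(rec))  # [] — every value is in the vocabulary
```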
Raw LLM Response
[{"id":"ytc_UgxmwWdmWsCoiqrW8vV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyZA6LpCzY0qgVEpEZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgztmHgBjOczrrWOYRV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxVnZFjnyfbxliECj54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxd43qPFmDR65qeeyx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwW4WNC7tXrd6W17mt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwi7iX4IO8S2-xgR8F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzs5SVhrQCyDAaWV0V4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzjITuIl3XnKAqbCzN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgzpRC_HyfPlaK2JKnB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}]
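Since the model returns one JSON array per batch, looking up a coded comment by ID amounts to parsing the array and indexing it. A minimal sketch, assuming the response shape shown above (the two inlined records are copied from it, abbreviated to keep the example short):

```python
import json

# Raw batch response from the model: a JSON array of coded comments,
# shaped like the "Raw LLM Response" above (abbreviated to two records).
raw_response = '''[
  {"id": "ytc_UgzjITuIl3XnKAqbCzN4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_Ugxd43qPFmDR65qeeyx4AaABAg",
   "responsibility": "developer", "reasoning": "virtue",
   "policy": "unclear", "emotion": "fear"}
]'''

def index_by_id(raw: str) -> dict:
    """Parse a batch response and key each record by its comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codes = index_by_id(raw_response)
result = codes["ytc_UgzjITuIl3XnKAqbCzN4AaABAg"]
print(result["policy"])   # regulate
print(result["emotion"])  # resignation
```

Keying by ID once, rather than scanning the array per lookup, keeps repeated inspections O(1) per comment.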