Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I love his arm broke like 1/4 of the kids at the first day of school in 2008…" (ytc_UgxCOZu14…)
- "I want to talk about this but I'm going to share it with all you guys even him d…" (ytc_UgxIrOxdc…)
- "I swear I didn’t convince an ai to kill itself! I didn’t annoy ai to the point o…" (ytc_UgyTGmAeQ…)
- "Leave it to the woke left to claim that a completely logical and unemotional pro…" (ytc_UgwhH3kTT…)
- "We shouldn't focus on advancing artificial intelligence when we're still dealing…" (ytc_UgxhYt-Nl…)
- "Are Americans just loosing braincells or something? It is plain scientifically …" (ytc_Ugxp5Q0_X…)
- "I'm sorry but I can't stand this argument. AI is unethical and terrible even if …" (ytc_Ugxk0BZr6…)
- "There are real artists who use gen AI, for disability and other reasons. In the …" (ytc_UgztwEZIH…)
Comment

> If computers are thinking, they think at exponentially greater speed than humans. Therefore, their perceived passage of time is very different from ours. Just look at the experiment where the two LLMs made up and started conversing in their own created language that was much more efficient and faster than human thought/speech. For a computer interfacing with a human, it must be like listening to someone talking extremely slowly and in simple language that barely contains anything like an actual fact/statement. Man will have to find a way to direct interface with AI and even then, I'm not sure we could keep up with the informational flow.

youtube · AI Governance · 2025-07-23T15:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
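Each coded record can be sanity-checked against the dimension vocabularies before it is stored. A minimal sketch, assuming the allowed values are exactly those visible on this page (this is an inferred vocabulary, not an official codebook):

```python
# Allowed values per coding dimension - inferred from the codes seen on
# this page, not taken from a published codebook.
VOCAB = {
    "responsibility": {"none", "user", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist", "virtue"},
    "policy": {"unclear", "none", "ban", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "approval", "outrage", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the vocabulary."""
    return [dim for dim, allowed in VOCAB.items()
            if record.get(dim) not in allowed]

# The record shown in the table above passes cleanly:
print(validate({"responsibility": "none", "reasoning": "unclear",
                "policy": "unclear", "emotion": "indifference"}))  # []
```

A non-empty return value flags which dimensions need re-coding or a vocabulary update.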
Raw LLM Response
```json
[
  {"id":"ytc_UgzL9KB4tn97D54J6vB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxcQAaq122H9xEWlUd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzaeSrr657YPzaFc_t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzoSrYNjeVptPWOJQx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzY6AtOExUicUIzOrB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzmhGeHHgWawhwko6B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwwGucn6bpQ5JA-Wx54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzuTWGGa0GiSYfb9ft4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzrpKv1dMX3iZy5rQd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzgoH75qaFuEY5r0TF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
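The "look up by comment ID" feature above amounts to indexing this batch response by its `id` field. A minimal sketch of that parse-and-index step, using a two-record subset of the response shown above:

```python
import json

# Raw batch response from the coding model: a JSON array with one record
# per comment, each carrying the comment's ID and the four coded
# dimensions (subset of the response shown above).
raw_response = '''[
  {"id": "ytc_UgzL9KB4tn97D54J6vB4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzoSrYNjeVptPWOJQx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]'''

# Index the records by comment ID so any comment's coding is an O(1) lookup.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = codings["ytc_UgzL9KB4tn97D54J6vB4AaABAg"]
print(rec["emotion"])  # indifference
```

Building the dict once per batch keeps per-comment inspection cheap, which is what makes the click-to-inspect sample list practical.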