Raw LLM Responses
Inspect the exact model output for any coded comment: look up a response by its comment ID, or browse the random samples below.
Random samples — click to inspect:

- "Well, all that is only partially true, as up to now there has always been areas,…" (`ytc_UgyR7w1Ef…`)
- "What needs to be done is restrict who can upload videos to YouTube. Just having …" (`ytc_UgyZyW42C…`)
- "26:04 I agree. The only problem I see is that earth is cyclical. We are a part o…" (`ytc_Ugz-fVphf…`)
- "Hallucinations in LLMs are going to keep decreasing slowly resulting in this kin…" (`ytc_UgwQ1zsgj…`)
- "Me: goes to jail for slapping a lady dressed as a robot on the boonky😂…" (`ytc_UgwhbOYV0…`)
- "can´t believe people don´t get it? This is a very lonley and very rich mans toy.…" (`ytc_UgxLvzE0t…`)
- "@supremeclamitas5053I mean, yeah, but the ai was not incorrect about him getting…" (`ytr_UgzQ_nQ5Z…`)
- "If that self driving car was following any one of a number of "distance between …" (`ytc_UgjwZCpf6…`)
Comment

> I think ai hasn't even hit the gas. Imagine thinking, "ill just gather all the fastest horses they can collectively run 1000's of mile in an hour."
> Then cars come along, and half a dozen can complete that in a few minutes. I think Ai is just 1 break through away from taking our estimates from years, to months. We are measuring this growth solely on human input. What happens when the machines are so much smarter they can improve w/o us at all

Source: youtube · Posted: 2026-04-08T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwORS6bixDwiHYVHTh4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgziuAj3YB0sX72yLNp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzu6fiRByg4qq8OcxB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyrBG27yQhy8kc1VvN4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugxnk77FS-dRXH6D5Jt4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw2weUm7X3Eda8lzad4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwwUchVUfyKBH9hzxJ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwVoHNvAvQM5x5hR3h4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugywdyc5RyFLcnW0hgR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw_o_qu9-sPiJ5eyFd4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
```
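A raw response in this shape can be parsed and indexed by comment ID with a short sketch. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON above; the two sample records are copied from it, and the schema check is an illustrative assumption, since the full codebook of allowed values is not shown here.

```python
import json

# Fields observed in the raw LLM response above; whether every batch
# carries exactly these fields is an assumption for this sketch.
FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

# Two records copied verbatim from the response above.
raw = """[
  {"id": "ytc_UgwORS6bixDwiHYVHTh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgziuAj3YB0sX72yLNp4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

records = json.loads(raw)

# Reject any record whose keys deviate from the expected schema.
assert all(set(r) == FIELDS for r in records)

# Index by comment ID, mirroring the lookup-by-ID workflow above.
by_id = {r["id"]: r for r in records}
print(by_id["ytc_UgwORS6bixDwiHYVHTh4AaABAg"]["emotion"])  # indifference
```

Indexing by ID makes it cheap to pull the exact coded dimensions for any single comment, which is what the coding-result table above displays for one record.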