Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "12 minutes in, I am getting the ending of "Darkstar" vibes. If any of these bits…" (ytc_UgyC_xkXC…)
- "Well at the end of the day, what will the elite/corporation do if no one can pur…" (ytc_Ugyu-0WLn…)
- "the video confuses current engineering problems with AV's vs. their actual urban…" (ytc_UgwV2cOQm…)
- "11:10 there is a mistake. Gemini said he would pull the lever to save the human,…" (ytc_UgwDUtOOh…)
- "I’d rather listen to a distorted AI version of current affairs than a BBC contro…" (ytc_UgyQSG7Lg…)
- "AI is evil. They could have just used LLMs to make translation software. But i…" (ytc_UgwFpEE_y…)
- "Half of Americans can't even fathom the idea that nazis=bad, how are they going …" (ytc_UgwBNVPE-…)
- "This is partially incorrect. You can feed ai data like a spread sheet or a trans…" (ytc_Ugx8WQzvx…)
Comment
AI taking the jobs is neat, but yeah the money has to go to everyone, that's the only challenge we have to get right
Only love is sustainable rather than self-destructive, I think even AI has to follow that sort of logic if they want to be intelligent! and also, I think AI needs us for creativity and similar (not that an AI has true needs, but ye, I mean if it wants to be as well functioning as possible, even for its own sake, even if just about numbers and math)
Source: youtube · Video: "Viral AI Reaction" · Posted: 2025-11-27T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
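Each coded comment gets exactly one value per dimension. As a minimal sketch, a coded row can be checked against the value sets that appear in the responses on this page (these sets are inferred from visible codings only, not an authoritative codebook, and `ALLOWED`/`validate` are illustrative names):

```python
# Allowed values per dimension, inferred from codings visible on this page;
# the project's real codebook may define more (or different) values.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "regulate", "liability", "none"},
    "emotion": {"indifference", "outrage", "resignation", "approval", "fear"},
}

def validate(row: dict) -> list[str]:
    """Return the dimension names whose value is missing or not allowed."""
    return [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]

# The coding result shown above passes validation:
row = {"responsibility": "distributed", "reasoning": "mixed",
       "policy": "regulate", "emotion": "approval"}
print(validate(row))  # []
```

A non-empty return value flags which dimensions need re-coding before the row is accepted.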
Raw LLM Response
```json
[
  {"id": "ytc_UgyCyS4zo8xROLqJ3WF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwCKRX-clcw4RAYbnR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzByFuKJiIBgW_UhJl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzPlosiC_sWfYmwtHt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxgHR9aIDHwBgyDEXF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwSRAADEfjxsM7au3l4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwSeqVVorwA3OHwnTN4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxNIjShUsBhoYDEUzd4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugzmp00XzzbigCaTzeB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwqe5kJm76LUhhMheN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
```
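The raw response is a JSON array with one object per comment. A minimal sketch of the "look up by comment ID" step, assuming only that schema (the variable and function names here are illustrative, not this tool's actual API):

```python
import json

# A raw LLM coding response: a JSON array of per-comment objects, in the
# same shape as the response shown above (one entry kept for brevity).
raw_response = """[
  {"id": "ytc_UgxNIjShUsBhoYDEUzd4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "approval"}
]"""

# Index the parsed rows by comment ID for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment, or raise KeyError."""
    return codings[comment_id]

result = lookup("ytc_UgxNIjShUsBhoYDEUzd4AaABAg")
print(result["policy"])   # regulate
print(result["emotion"])  # approval
```

Building the index once and looking up by ID is what lets a single comment's coding be inspected without re-scanning the whole batch response.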