Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
AI can be an amazing thing for humanity in terms of equalizing all of us. Will w…
ytc_Ugyl_Mt71…
I did this before but i think it was the ai playing along with what i was talkin…
ytc_UgwI0m1VA…
All I'm using AI art is for now is to create random stuff, cause I can't draw. I…
ytc_Ugx5D3wxP…
Real-estate crash downturn expected Bez of AI disruption and high unemployment a…
ytc_UgxyOa9pZ…
Can't we just go back to flowers and trees and get away from all this investor, …
ytc_UgzHaC7AP…
I've been informed we have nothing to worry about. The robot camera sensors are …
ytc_UgiePTNfa…
Exactly, the "real" influencers lives and physicality aren't what they appear to…
ytr_UgwDXePDb…
@Rina_GamingYtno I’ve tried writing random non ai sounding things and it still s…
ytr_UgxkdYdZu…
Comment
Google's LLM ignores an exact match to the input and provides what it "thinks" you "should be" looking for.
A simple example: I asked how old Dolly Lenz is (she is a high-profile realtor to the super rich). Google returned: "I think you are looking for Dolly Parton." Ignoring a direct string match in favor of "POPULAR" responses is not intelligence. It is marketing.
youtube
2025-12-29T19:5…
♥ 12
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyrzCYQ_xfGOZjdETh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwNR0pu_6IxR3-PeXt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwoSI3nf3NnnEhrfo54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx9V45DbAOfZ6mTYQB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyZJZ177NJNbJEJ23V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxEuUPMgBbxGnYBQBB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw1Hh7h3dme67ZJcst4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzs8Dq5lZcO2ecfpFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyECy1JrJdYgzogrut4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwYFNNcFnPBkw08JHx4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"liability"}
]
```
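Each record in the raw response carries the same five fields shown in the coding table (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and schema-checked before storing the codings (the field names come from the JSON above; the `ytc_`/`ytr_` ID-prefix check and the function itself are illustrative assumptions, not part of this tool):

```python
import json

# Dimensions every coded record must carry (field names from the raw response above)
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and verify each record's schema."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for i, rec in enumerate(records):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {i} missing keys: {sorted(missing)}")
        # Assumed convention: comment IDs start with ytc_, reply IDs with ytr_
        if not rec["id"].startswith(("ytc_", "ytr_")):
            raise ValueError(f"record {i} has unexpected id prefix: {rec['id']!r}")
    return records

raw = ('[{"id":"ytc_UgwNR0pu_6IxR3-PeXt4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"outrage"}]')
coded = validate_codings(raw)
print(coded[0]["emotion"])  # outrage
```

Validating before ingestion keeps a single malformed record from silently dropping a dimension in the coding table.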