Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "If AI starts as I child and learns from observation then we're all in trouble.…" (ytc_Ugx7vzxbF…)
- "Sucks how victims are treated even by ppl who know the material’s fake. I’d be h…" (ytc_UgyA3tHPp…)
- "You made an observation that cide from AI had been tweaked by offshore cheap hum…" (ytc_UgxAeX46N…)
- "Hey...Google DOES add a small disclaimer noting that “AI responses may include m…" (rdc_n8kz6na)
- "Yes, exactly. When the routine work is automated and people are producing ideas,…" (ytr_UgzDLMmKN…)
- "I hate looking up refs and needing to add -ai -diffusion etc so i cant learn fro…" (ytr_UgzU9h2zJ…)
- "It's likely a compound model.. so one model like Gemini llm creates the text and…" (rdc_no2kumu)
- "IMO super intelligence is currently impossible simply because we can't train LLM…" (ytc_UgxMf-Edl…)
Comment

> Comparing humans to primates: fair. Comparing humans to algorithms: nah. It’s probably a good thing when we (the people) can all see the richest people becoming scared of things they can’t control or buy? I understand the good and bad with this conversation however, I can’t help but be spectacle of the words that come the “lords/presidents/Walmarts” of the worlds society.

Platform: youtube · Topic: AI Governance · Posted: 2023-04-26T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyXhORa9X8ximnfx1d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwzTDlRBctwMzqiqM14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy06XANzbY4TOOedPF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxzndDuTqVN7xQhAnB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwjMj_VJuIbiEON8cJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyWg3zR5AVl2esaSr94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxOIlnDKQmBj5eHzUx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz2lSz5xaBvq92P6P14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwZntqPGIgWYv4htSJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxwdASWnhXavqazf3R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
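The raw response is a JSON array with one record per comment, each carrying the four coding dimensions keyed by comment ID. A minimal sketch of how such a response could be parsed and looked up by ID (the helper name `index_codings` is illustrative, not part of any tool; the two sample records are taken from the response above):

```python
import json

# Raw LLM coding response: a JSON array of records, one per comment.
# Shape mirrors the raw response shown above (two records excerpted).
raw_response = """[
  {"id": "ytc_UgyXhORa9X8ximnfx1d4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwzTDlRBctwMzqiqM14AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw coding response and key each record by its comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codings = index_codings(raw_response)
# Look up the coding for a single comment by its ID.
print(codings["ytc_UgyXhORa9X8ximnfx1d4AaABAg"]["responsibility"])  # distributed
```

Keying the records by ID turns the "look up by comment ID" step into a constant-time dictionary access rather than a scan of the array.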