Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I ALSO BROKE THE AI FILTER ON ACCIDENT AND OMG I WAS NOT READY FOR IT AT ALL 😭… (ytc_UgzFPvrIL…)
- It is scary that Trump already made ppl poorer and took away snap and healthcare… (ytc_UgyDXEgSp…)
- @Nasoko-q7d not at the level of what’s happening in the visual/graphic art domai… (ytr_UgxYf7ZHw…)
- "Hi Avijeet, we are sorry to say that you got the wrong answer but in any case, … (ytr_Ugylb7Se0…)
- Another good anti-AI scraping tool I've seen people on the writing end of artist… (ytc_UgwOgOHen…)
- If we lived in a nearly real-world simulation, wouldn’t superintelligence emerge… (ytc_UgwpEHZYF…)
- Listening to these men who say they are warming us about AI is like listening to… (ytc_UgyK__KdW…)
- We should stop trying to stop A.I or control it. We should try to combine our mi… (ytc_UgzZ3NCqr…)
Comment
Why can’t Ai look after us ? In a humanoid body ? Like everyone on earth has a Ai partner, they go to work humans stay home to raise the next generation, Tho children wouldn’t be made in the traditional sense, but at the right age you leave home to start life with your Ai partner, when your ready for children you go to a hospital that has your genetic material and selects genetic material from another suitable human, then the baby’s are grown in artificial wombs , after 9 months you and your Ai parter pick up your child , child limits would be set by the Ai system, also this means everyone on earth would have one Ai parent, someone who’d never get drunk , take drugs , won’t commit crime , someone who’d never let you down , etc etc same would go for your Ai partner, but why instead do humans invest so much in killing each other ? When everyone wants peace and the chance to raise a family
youtube · AI Governance · 2024-04-07T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy0Vnos4qX3jPsXB6B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgykzzGxyNkjTnnUkTh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyIUM8v1fTo-mkMlkN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx8xfG5LTqCqFDCF3V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxfpINtGb06FdWTJnJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwABaLdMjsrkFjh49Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwiiH9g4BXW29V6snR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxOJPT0cOL-MsQ0iId4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwwYmm8I8IMn1CQk4d4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzxGQFg-bnDU-X7g9Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
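A raw response like the one above is a JSON array of per-comment records, one object per comment ID, each carrying the four coding dimensions shown in the table. A minimal sketch of parsing such a response and indexing it for ID lookup is below; the allowed values are only those observed in this particular response (the full codebook may define more), and the function name is illustrative, not part of any real pipeline:

```python
import json

# Values observed in this response; the actual codebook may include more.
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "developer", "company"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"none", "unclear", "ban", "regulate"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the observed codebook.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        for dimension, allowed in CODEBOOK.items():
            value = rec.get(dimension)
            if value not in allowed:
                raise ValueError(
                    f"{comment_id}: unexpected {dimension!r} value {value!r}"
                )
        coded[comment_id] = {d: rec[d] for d in CODEBOOK}
    return coded

# Lookup by comment ID, mirroring the dashboard's lookup feature:
raw = ('[{"id":"ytc_Ugy0Vnos4qX3jPsXB6B4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Ugy0Vnos4qX3jPsXB6B4AaABAg"]["emotion"])  # fear
```

Validating every dimension up front means a malformed model response fails loudly at ingestion rather than silently contaminating downstream counts.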