Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
Random samples (click any to inspect):

- "What is that for a nonsense ? AI is not intelligent nor sentient and not consc…" (ytc_Ugxk5Mz8p…)
- "Hey guys, so AI is gonna fuck everyone up, you're gonna be poor in 2 months, any…" (ytc_UgyOvc_tt…)
- "0:40 bro in 2016 I was 10 years old and a teacher was dealing with a similar stu…" (ytc_Ugw9Xz3uM…)
- "That's the very reason why I'm very well mannered with ChatGPT. Always please an…" (ytc_Ugw4eaT4o…)
- "Thinking about singularity… I work as a psychotherapist and often get into discu…" (ytc_Ugw0KGDga…)
- "Do you not think there is a substantial difference between a human having varied…" (ytr_Ugwiu7V3E…)
- "This video is strewn with conjecture, hyperbole and just plain bull$&@?. Don’t w…" (ytc_Ugz1vqRKD…)
- "I no longer use AI and I actively search with -ai. BUT I used to say Hello Assis…" (ytc_Ugz5Geko4…)
Comment
> One day one ai will bully the other ai and the other ai will create and world ending bomb type thing but no ai will use it becsuse they will not be able to survive that bomb..now nuclear bombs..they will be throwing those like grenades...who care about humans
>
> .as long as they program ai to be as much like humans as possible like always changing out minds....will it be able to do that..its oretty amazing...and thats not a good enough reaosn for it to be created...it would be amazing to be in the middle of outerspace...but not smart
youtube · AI Governance · 2024-03-10T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
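For reference, here is a minimal sketch of how one coding record could be represented in Python. The field names mirror the table above; the category sets are inferred only from the sample response shown below, so treat them as assumptions rather than the full coding scheme.

```python
from dataclasses import dataclass

# Category values observed in the sample response below; the real coding
# scheme may define more (these sets are an assumption, not an exhaustive list).
RESPONSIBILITY = {"none", "developer", "user", "ai_itself"}
REASONING = {"unclear", "deontological", "consequentialist"}
POLICY = {"unclear", "none", "ban", "regulate", "liability", "industry_self"}
EMOTION = {"indifference", "fear", "approval", "mixed", "resignation"}


@dataclass
class CodingResult:
    """One coded comment, matching the dimensions shown in the table above."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str  # ISO timestamp, e.g. "2026-04-26T23:09:12.988011"

    def is_valid(self) -> bool:
        """Check that every dimension uses a known category value."""
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )
```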
Raw LLM Response
[
{"id":"ytc_UgwgTVVGBzePrGuXnAF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwf1zV2WNMtrOo7vEZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxOqgMOA8kpP012hwl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxhfrJ__CoBWmQDCdl4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyLE1dD0N501VbSkiZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxMAzXW1KTQ2I0lnpV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwTnOjBTWPmzQTCALB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw4qDHjs9hRXvZzH1R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwyieeQ7qi3FITu3Bh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwORJR057wCeQKykJx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
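A minimal sketch of how a raw batch response like the one above could be parsed and indexed by comment ID, which is what the lookup described at the top of this page needs. The function name and file name are hypothetical, and stripping text around the JSON array is a defensive assumption about how models sometimes wrap their output.

```python
import json
import re


def parse_raw_response(raw: str) -> dict[str, dict]:
    """Parse one raw batch response (a JSON array of per-comment codings)
    into a dict keyed by comment ID.

    Models sometimes wrap the array in code fences or add stray text, so we
    extract the outermost [...] span first (a defensive assumption).
    """
    match = re.search(r"\[.*\]", raw, flags=re.DOTALL)
    if match is None:
        raise ValueError("no JSON array found in raw response")
    records = json.loads(match.group(0))
    return {rec["id"]: rec for rec in records}


# Hypothetical usage: look up the coding for one comment by its ID.
if __name__ == "__main__":
    with open("raw_response.json", encoding="utf-8") as f:  # assumed filename
        coded = parse_raw_response(f.read())
    print(coded.get("ytc_UgxOqgMOA8kpP012hwl4AaABAg"))
```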