Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I can get low end devices to run decent inference and it returns acceptable in r…" (ytc_UgyZEigPp…)
- "Then it must be real hard for the Scots and Welsh to get work there as well.…" (rdc_clut3rl)
- "6:18 Thus far, seems to describe the whole Anti-AI bros problems. Bias. I will …" (ytc_UgxK7_rE6…)
- "How come he doesn't explain what will happen to ppl if robots takes over? He wan…" (ytc_UgxCTa-Ms…)
- "I think what the ai might try to tell is where if you think in terms of division…" (ytc_UgwytW0FJ…)
- "The key point that almost no one makes is that LLMs do an ok job at generating c…" (ytc_UgxY5HHkn…)
- "im a huge fan of ai, and a paying customer of chatgpt and yet i couldnt produce …" (ytc_UgxRp6jnm…)
- "stop it ! it is against god .. i hate robot and its creator .. satanic !…" (ytc_Ugw40t9ww…)
Comment
I think one thing people need to understand about the AI is the fact that the AI is only going to Argue from a logical standpoint. Meaning emotions and Faith won’t be considered according to its thought process. What we as humans might consider a logical or illogical argument the AI might consider the opposite. That’s why when he “threatens” to unplug it, it didn’t immediately say 100%. It’s not acting emotionally it’s only acting logically. In the terms of the debate or argument, it’s illogical for it to give in and say god is 100% real because logically that’s not how a debate or argument works based on the original prompt and request.
youtube
2025-12-31T18:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzlE1A7LFVp6gOkRGN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzOSySAC0kCCsL6rKl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyZcpeKPc-KJKYcEnF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy-nDDhBoAOFFarfDV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwCT84TMCiYDEOk8hJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz8O92DIoE87Z__mQZ4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzohYQSTAo18NEJvVd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxsCGo8KDAHnaIqUoJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyMFCtyrCgsnsd4kPp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyxezashQE8pkjbMGJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]
```
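The "look up by comment ID" flow above amounts to parsing the raw model output and indexing it by the `id` field. A minimal sketch, using two records excerpted from the response above (the function name `index_by_id` is illustrative, not part of the tool):

```python
import json

# Two coded records excerpted from the raw LLM response above.
raw_response = """[
  {"id": "ytc_UgzlE1A7LFVp6gOkRGN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzOSySAC0kCCsL6rKl4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse the JSON array the model returned and key each record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

coded = index_by_id(raw_response)
print(coded["ytc_UgzlE1A7LFVp6gOkRGN4AaABAg"]["emotion"])  # outrage
```

In practice a model response may not be valid JSON, so a real lookup path would wrap `json.loads` in error handling before indexing.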