Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "But how will they destroy humans. Like a 22LR could kill a robot easily. So shou…" (ytc_UgwViQRKK…)
- "A lot of companies are gonna pay us to use the AI rather than use it…" (ytc_UgwGPA9J4…)
- "@Sabine, next time you do an AI video reach out to me. I'll fact check your scr…" (ytc_UgwifjdA3…)
- "This mindset is at the helm of the most important AI company in the world. We ar…" (ytc_UgwLEszCE…)
- "All the AI robot need to have a emergency shut down button hidden somewhere on t…" (ytc_UgwgrHq7H…)
- "This question has been answered in the past. Self-Driving cars will always have …" (ytc_Ugg_yjdSa…)
- "I discovered my chatgpt was being undetectably indulgent even when I'm always fi…" (ytc_UgwpEU7Oh…)
- "😂😂😂 we don't care once your robots are in place to work for your business. I as …" (ytc_UgwEKWnjz…)
Comment
This is a believer vs anti-theist argument, not a believer vs atheist argument. Please look up the definitions and re-do the experiment, as that would be far more revealing from an AI LLM debate. That said this was an incredibly interesting thought experiment and I congratulate you on the amount of work that must’ve been put in to each of these LLMs in terms of the prompts and the knowledge base. Well done.
Source: youtube · Posted: 2025-11-19T23:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyKxoohyrIjGRhD48F4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw-0jXEuO3kxGIBiax4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzLQREfFS_Tj8gA-5p4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxSDQJJRVlXb3dKGiN4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw84e2MpH8atCFs0B54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzqUNp-KI0sYRSWy0N4AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyG29oP03fIftOsWnp4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzmCwzn4ZnreMV7cK54AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxDf1XBVlNS7Dl71Ep4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwx73kMIFGonpA5V3F4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
```
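The "look up by comment ID" step can be sketched in a few lines of Python: parse the raw LLM response (a JSON array of coded rows) and index it by comment ID. This is a minimal illustration, not the tool's actual implementation; the helper name `index_by_comment_id` is hypothetical, and the sample payload is abridged to three entries from the response above.

```python
import json

# Abridged copy of the raw LLM response shown above: a JSON array where
# each element codes one comment across four dimensions.
raw_response = """[
  {"id": "ytc_UgyKxoohyrIjGRhD48F4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzLQREfFS_Tj8gA-5p4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyG29oP03fIftOsWnp4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "ban", "emotion": "outrage"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_comment_id(raw_response)
row = codes["ytc_UgzLQREfFS_Tj8gA-5p4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer approval
```

Keying on `id` makes the lookup O(1) per comment, which matters when cross-referencing thousands of coded rows against the original comment table.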