Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples — click to inspect:

- `ytc_UgyhOYlI1…`: "Wanna know how to get it illegal in every state over night? Lets do a collab eff…"
- `ytc_UgyTD0De3…`: "This is not right. AI needs to be capped and controlled, we are brewing a self d…"
- `ytc_UgxI_qVAS…`: "I used to have Eliza for my Apple IIe like 30 years ago. If that's what your ai …"
- `ytr_Ugw_e0Wf5…`: "@jamesmccoysr.3595 Care to explain how this is a race based protocol? The 2 poli…"
- `ytr_UgwlE7KJh…`: "@Pixel4ted_conf3tti to be honest true im sorry but it does kinda benefit me i s…"
- `ytc_Ugx58tFhc…`: "Thank you for the explanation. So, if I wanted to protect the content on my webs…"
- `ytc_UgzSYF4nw…`: "5:15 Ah yes, because AI famously only comes up with purely original ideas and ne…"
- `ytc_UgzAK78cd…`: "So, basically this is what's going to happen. Everyone will lose their jobs, the…"
Comment (youtube, 2025-10-29T10:5…):

> Without doubt the software has the ability to drive as a human would on the slight offensive side as most humans drive, however as this is rolled out, without doubt safety and minimising incidents is the policy makers and software engineers priority. As more of these hit the road, and the numbers of minimising incidents prevail, slowly but certainly the self driving experience will be enhanced.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxW_uYZNDumjp2AVKd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgygIqAPyMNiFo27jnp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxHCQm7yDjMR7Typ0l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyJKOytV5Cmo4va6VV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugw1-n9WDapw4QTwha14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx9HfFPDzSyc1Q7nkl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz578HlyyjzdUllWiB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwnX8f8uRW2J50QvMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzdfAN_jGciW6KUF0Z4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzHfDoFGfjn3j0ZK7J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
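A response in this shape can be validated before it is stored. Here is a minimal sketch in Python, assuming the label sets visible in the samples and the coding-result table above are the allowed values (the actual codebook may include more categories, and `validate_codings` is a hypothetical helper, not part of the pipeline shown):

```python
import json

# Allowed values inferred from the coded samples above (assumption:
# the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"approval", "indifference", "outrage", "fear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose comment id
    looks well-formed and whose four dimensions use allowed labels."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Comment ids in the samples start with ytc_ (top-level) or ytr_ (reply).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
print(len(validate_codings(raw)))  # 1
```

Rows that fail a check are dropped rather than repaired, which keeps the stored codings consistent with the schema at the cost of occasionally re-querying the model for malformed rows.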