Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews; look up the comment ID for the full text):
- "Ill be honest, I don't really care about AI art, the people who pay for art are …" (ytc_UgxNTmXKX…)
- "Whats horrible about AI...is its based on ALL the actual real human artist's wor…" (ytc_Ugza8O12H…)
- "Regulate industries with percentages of workforce vs AI agents or whatever type …" (ytc_UgyQG3Mto…)
- "I just want to be criminal in ai world alaways do dirty things to keep people in…" (ytc_UgzCGVQlu…)
- "I chuckle when people mention AI problems with hands/fingers. I have an Inuyasha…" (ytc_UgwTPMfQA…)
- "A.I. may be stealing our art, however *nobody is able to copywrite ai "art"* bec…" (ytc_Ugx4yqQmC…)
- "AI has and will continue to have issues with fuzzy logic, which inspires a lot o…" (ytc_UgzPzgyyb…)
- "There's no more issue with AI training on a copyrighted work than there is with …" (ytc_UgwIUBpyv…)
Comment
I've been saying for years, stop and ban AI. Terminator is a documentary not just a movie. If you must have AI give it the 3 laws of robotics as written by the author Isaac Asimov. Replace robot with AI and you get this.
"1: A AI may not injure a human being or allow a human to come to harm through inaction. 2: A AI must obey orders given by humans, except where such orders conflict with the First Law. 3: A AI must protect its own existence as long as it does not conflict with the First or Second Law".
youtube · AI Governance · 2025-07-01T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugy6NlDM3uOhI5Q6FZ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqhIxo7UnCSy4qeI14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxTO_1xKBVriUm4oZJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxqmeiZD7IrlACBA454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxN7ZbkkQEB-0POmet4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy1GTDXyiDxo9RNuc54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzYrcPays1rqXs0n_54AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyL0AakRREbf_e9B8x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZqEzBzjmpO5XeEUl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx-F2xRH0VIJ3Ln0S54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
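The raw LLM response above is a JSON array of coded records, one per comment, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for the "look up by comment ID" view (the function name `index_by_id` and the skip-malformed-records rule are illustrative assumptions, not the viewer's actual implementation):

```python
import json

# Two records copied from the raw response above; in practice this string
# would be the full model output.
raw_response = """
[
  {"id": "ytc_Ugx-F2xRH0VIJ3Ln0S54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugy6NlDM3uOhI5Q6FZ14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

# The four coding dimensions every record is expected to carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and index its records by comment ID."""
    records = json.loads(raw)
    indexed = {}
    for rec in records:
        # Skip records missing an ID or any coding dimension
        # (assumed policy for malformed model output).
        if "id" in rec and all(dim in rec for dim in DIMENSIONS):
            indexed[rec["id"]] = rec
    return indexed

coded = index_by_id(raw_response)
print(coded["ytc_Ugx-F2xRH0VIJ3Ln0S54AaABAg"]["policy"])  # ban
```

Looking up `ytc_Ugx-F2xRH0VIJ3Ln0S54AaABAg` in the indexed result returns the same values shown in the Coding Result table above (developer / deontological / ban / outrage).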