Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- If you find yourself nodding off during an apocalypse video, it's AI. AI voices… (ytc_UgxxxYXi5…)
- Thousands of developers are laid from tech companies now agentic ai is about to … (ytc_UgzX_QBxE…)
- I completely disagree. AI and robotics are evolving faster than most people rea… (ytr_UgysBttXj…)
- "I'm calling on behalf of Joe who recently received a new iPhone" "I'm sorry, J… (ytc_UgyOeN1kv…)
- AI is simply pushing people to become more "spiritual", if I can say it like tha… (ytc_Ugw9uYEO_…)
- For AI to be understood the corrupt swamps in the government should be eliminate… (ytc_Ugy2EsgMo…)
- Kaku is wrong , it's not 100 years but in next 10 years. This is the biggest mis… (ytc_Ugw5u1CZW…)
- One day that robot is gonna turn around and put one in your head, you gonn say i… (ytc_UgxLopAuQ…)
Comment
AI is a basic requirement for the exploration of the Universe because machines are resistent to dangerous radiation, and in particular can work without OXYGEN. The promoters of AI are also those industrialists who are very much engaged in space travel and the Mars activities. The problem to upkeep human life for longer periods on Mars for example is too extensive to be an option. They have to deliver enough Oxygen on a permanent basis which ist economically not justifiable. A full infrastructure is just impossible to support with present technology. And there come in AI robots.
youtube · AI Governance · 2025-09-23T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
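A coding result like the table above can be sanity-checked before display. The allowed value sets below are only those observed on this page (the actual codebook may define more); `validate` is a hypothetical helper, a minimal sketch under that assumption:

```python
# Dimension values observed on this page; the real codebook may allow more.
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "mixed", "resignation", "fear"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items() if row.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "approval"}
print(validate(record))  # []
```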
Raw LLM Response
```json
[
  {"id":"ytc_UgwnLLWjr1DBxpLVfbB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwbk90fqZJ6mGXgcmh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyC_AvFqhEaFdVpEgl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx271OA8PvoSesF9fh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw-YHJ-kcywUnIFV9N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwYiAie1QBqH2FSdeV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyLsmQI02XLDQGbDwN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxA5xzAdtLhEWeUZFJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxpxB0wKFeDAwGlZZd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxNyKfC96k-nnF6bgp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
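The "Look up by comment ID" feature above presumably indexes these rows by their `id` field. A minimal sketch in Python, assuming only the JSON array shape shown in the raw response (the two rows here are excerpted from it):

```python
import json

# Raw LLM response as returned by the coder (two rows excerpted from above).
raw = '''
[
  {"id":"ytc_UgwnLLWjr1DBxpLVfbB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwYiAie1QBqH2FSdeV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
'''

# Index the coded rows by comment ID for constant-time lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw)}

row = codes_by_id["ytc_UgwYiAie1QBqH2FSdeV4AaABAg"]
print(row["emotion"])  # fear
```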