Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
From Superintelligent Agents Pose Catastrophic Risks: Can Scientist AI Offer a Safer Path?
"How to choose the language for describing theories is an important question, and even the question of whether the Bayesian formalism is sufficiently agnostic to the choice of theories remains open."
It is solved in theory. See NiNOR Complexity.
Platform: youtube · Dataset: AI Responsibility · Posted: 2025-05-22T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwG6EVp0ebYHSYeEL14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzITkFaWgclkXhuiml4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyaffFFNgaInKKH4wF4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxUECCHVsaRbVF6XiB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxG18gOOlravQe2SWZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxrupSM3gL46TWvxxZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzUjlA6D0vt-8YMD694AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyL-OvW5hOZY-4itrp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyDj_c0W-_4mAiHN7V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugxkjfmbq_aYfYUXmEN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
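The raw response above is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion), and the inspector looks up a single comment's codes by its ID. A minimal sketch of that parse-and-lookup step is below; the allowed values are inferred only from the responses shown on this page (the real codebook may contain more), and `parse_llm_response` is a hypothetical helper name, not part of any actual tool.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above — an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "developer"},
    "reasoning": {"unclear", "mixed", "contractualist", "deontological",
                  "consequentialist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "mixed", "approval", "outrage", "fear",
                "resignation"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse one raw LLM coding response and index rows by comment ID,
    dropping any row whose value falls outside the assumed codebook."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if cid and all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Lookup by comment ID then reduces to a dictionary access, e.g. `coded["ytc_UgwG6EVp0ebYHSYeEL14AaABAg"]["emotion"]` for the comment inspected above; rows with out-of-codebook values are silently excluded, which is one plausible way to guard against malformed model output.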