Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- As someone who actually wants AI to experience it's singularity moment, if we in… (ytc_UgwPZcsw1…)
- Yes Because everything else in the world has changed in the last 100 years but t… (ytc_UgwJTHago…)
- I'm agree that we should stop working on ai and shut down completely, but unfort… (ytc_UgyIu0fks…)
- I think there need to be a strong definition between different kinds of tracing.… (ytc_Ugx0VDjGF…)
- Ting is no AI has passed the test yet. The whole idea is a hypothesis. Not even … (ytc_UgwE_o5Et…)
- @Carpetdandysworld You mean you can't draw on top of AI art and make it a compl… (ytr_UgwnO0_n6…)
- "Is this not an AI image?" / "Yup!" / "And this is an AI program" / "Yup!" / "You us… (ytc_Ugwcfyq6l…)
- Ai will never „transmit feelings” because it’s not art. There is no thought behi… (ytc_UgyIFDL99…)
Comment
Do you not think that AI would be smart enough to know that if it kills humans it would have electricity untill powerplant, grid and electric instalations fail due to sudden loss of demand and not having humans to actually mantain it if. We were always smarter than horses or oxes but we knew that we wouldt get far wothout them.
youtube · AI Governance · 2025-06-27T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxJO-QVll_iexAsiYR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgztEGX6UuOKGEGyXOh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugym3FHA5CgmXpQKLdd4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwzAakxNAzFVQo26tl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxg2CzZ2GSRistsecx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz0VAp--8pu4Cm3XzJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwLKKrN2wWeh_JBhqF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy35GzvJrDNpbCsVd94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy1tjUaOr_vQXTWfcZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzdz4atWQhxgjnmtm94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
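The model returns one JSON array per batch, each entry carrying the comment id plus the four coding dimensions from the table above. A minimal sketch of parsing and sanity-checking such a response — the `raw` string is an excerpt (two of the ten entries shown), and the key check and tally are illustrative, not part of the actual pipeline:

```python
import json
from collections import Counter

# Excerpt of the raw LLM response above (two of the ten entries).
raw = '''[
  {"id": "ytc_UgxJO-QVll_iexAsiYR4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugzdz4atWQhxgjnmtm94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

# The comment id plus the four coding dimensions from the table above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)
for rec in records:
    missing = EXPECTED_KEYS - rec.keys()
    if missing:
        raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")

# Tally one dimension across the batch, e.g. responsibility attributions.
tally = Counter(r["responsibility"] for r in records)
print(dict(tally))  # each observed label with its count
```

Validating the key set before aggregating catches truncated or malformed model output early, rather than surfacing as a `KeyError` mid-analysis.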