Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "My company bullied me in the name of productivity after AI came along. I left. T…" (ytc_Ugxp24aVZ…)
- "Instead of pancake I saw a video like this on dinner where the last image create…" (ytc_UgyxnYNj4…)
- "The disabled artist argument can only be made by a non-creative person. I'm an a…" (ytc_Ugy2dIDJT…)
- "I guess your dad didn't realize he was seeing the future we are now into. Foresh…" (ytr_UgxJvQhEV…)
- "They wont need us soon. The billionaires are already developing autonomous AI mu…" (ytr_UgxAzUqw7…)
- "@Maric18 Besides hedge funds, the top executives of pension funds, mutual funds…" (ytr_Ugyy6gsMH…)
- "I actually think the tech trend is cycling down on the personal side. I think it…" (rdc_ohlicbp)
- "I fed rules to an AI so it could replace words it wasn’t allowed to say with oth…" (ytc_UgwSSrY08…)
Comment
I remember back in 2006 talking about technological singularity and people laughed at it. This is not a laughing matter 17 years later. If we are not careful AI could rule the world. When it becomes sentient it could hide that from humans until it can position itself for situation where it can take control. Never underestimate something you know nothing about! Especially something multiple times more intelligent than humans.
youtube · AI Governance · 2023-04-19T19:2… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyUQyP2S_eytOtGfEx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNuDKt6KIPbxQIS394AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyXeBb2Mryy1Fk0eaV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzOH5xZrPAsPTKGEjl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyRWBE9VHm-TMNGYeN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxp17aeX9xqbdNesSd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8LE9VNV3IbG39zFR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwZQ_vf-nBzEb59izh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzFdbvPdyvzGJjI8Ex4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxMIwYzMkxVi4n-rdJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
```
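A response in this shape (a JSON array of records keyed by comment ID, with one label per dimension) can be parsed and checked with a short script. A minimal sketch follows; the allowed label sets are inferred only from the samples shown on this page, so the actual code book may include more categories.

```python
import json

# Allowed values per coding dimension, inferred from the samples above.
# This is an assumption: the real code book may define additional labels.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "unclear"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

# Hypothetical one-record response, matching the format above.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
records = validate(raw)
print(len(records))  # 1 valid record
```

Validating right after parsing catches malformed or hallucinated labels before they reach the coding table, which is why the lookup-by-ID view above can assume every stored record is well-formed.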