Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “How I have enjoyed this, please! The idiots think they are geniuses without even thinking about what …” (ytc_Ugw-JNcYd…)
- “To me affirmation of we need regulation = we have the legal background to implem…” (ytc_UgyYmMtra…)
- “This video has become a regular re-watch for me - it's a wonderful rebuttal to t…” (ytc_Ugw-xjvKF…)
- “For millions, daily life is an existential battle - for survival, yes, but also …” (ytc_UgyXG9ZTr…)
- “You seem to be mixing generative "AI" and machine learning (ML). They are VASTLY…” (ytc_UgwdcFWm1…)
- “Wrong, you are just draining in copium here. AI will 100% replace most of the jo…” (ytr_UgySkn9qD…)
- “May be all of us, individually, are... AI. It seems I have been "programmed" si…” (ytc_UgybH_pOW…)
- “if the type of ai you’re talking about is what I’m thinking of this is because t…” (ytc_Ugwit7ej6…)
Comment
It doesn't matter how it's smart all that matters is capability at the end of the day. If you had an AI that was composed of 1 trillion specialized tools and ai models that could do nearly any task a human could do , that's an AGI. It has a collective general intelligence then formed from the composition of many things similar to how many humans in a corporation that are specialized can come together to produce products and services no individual could create. So you might have an AI that's like a corporation with no central superintelligence but it's so capable with some many abilities that it might as well be and you can't tell the difference
youtube · AI Moral Status · 2025-10-30T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
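The four dimensions in the table can be sanity-checked programmatically before a coded record is accepted. A minimal sketch, assuming the allowed values are only those that appear in the samples on this page (the tool's actual code book is not shown and may include more categories); `invalid_fields` is an illustrative helper, not part of the tool:

```python
# Allowed values per coding dimension, inferred solely from the records
# visible on this page -- the real code book may define more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "resignation",
                "approval", "mixed"},
}

def invalid_fields(record):
    """Return the dimensions whose value is missing or outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown above validates cleanly:
record = {"responsibility": "none", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(invalid_fields(record))  # -> []
```

Running this over every record in a batch response flags any dimension the model hallucinated outside the code book.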
Raw LLM Response
```json
[
{"id":"ytc_Ugy4XdRCSw5Q915gtet4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxu4DLjSvY5kRguRht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzHpOsVSmPePEy47IB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyL5MLar7cpM6sdZsh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxQChv6c3PLww1_tHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwEU3QURvBWcQoGWNd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwxzJT3tYpL_wp20bp4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw5AuIx_z4n1SpCpbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxRF5zFkm0QpxMwX6R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyG8nYbYW1nj2eDzCZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
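Because the model returns one JSON array covering a whole batch, looking up a single comment's codes means parsing the array and indexing it by ID. A minimal sketch, assuming the model reliably emits the array shape shown above; `index_by_comment_id` is an illustrative helper, not part of the tool, and the string below repeats two entries from the response for brevity:

```python
import json

# Two records copied from the raw response above.
raw_response = """
[
  {"id": "ytc_Ugy4XdRCSw5Q915gtet4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzHpOsVSmPePEy47IB4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text):
    """Parse the model's JSON array and index each coded record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugy4XdRCSw5Q915gtet4AaABAg"]["emotion"])  # -> indifference
```

In practice `json.loads` should be wrapped in a try/except, since a model that drifts from the expected format will raise `json.JSONDecodeError`.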