Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Why are you guys calling it autopilot. Teslas and the fanboys markets it as FULL…" (ytc_UgyB9zM4K…)
- "I don’t understand how the people of the leading nations in AI {China, US} are n…" (ytc_UgxjWNOXM…)
- "Listen.. why you people still doing this? Lemme just.. ok so it’s simple: use AI…" (ytc_UgwabJYT5…)
- "And Sophia is more than just a robot she is human aswel certain people are makin…" (ytc_UgycM4BN7…)
- "He was in the shooting / But on the wrong end / He is also more likely to be in the …" (ytc_Ugyg3Arl3…)
- "It's not just AI, it's the whole economy.... I was let go after my government re…" (ytc_Ugz1QWsKr…)
- "Tesla's robot has a more advanced arm swing, while XPeng's feels heavily mechani…" (ytc_UgwK6GDHC…)
- "Unless it has AI and a body sophisticated enough to do household tasks it’s wort…" (ytc_Ugzy0bId0…)
Comment

> It sounds like humans are giving themselves more leisurely time with the super intelligence of AI and thus more opportunities for their brain to shrink or rise to the challenge of superseding super intelligence??? And that seems improbable. Less intellectual challenges or creativity that humans actually do inadvertently limit their life spans. It appears to suggest that getting really smart with the advancement of technology facilitates people becoming really dumb. And if you don’t believe me, consider how many telephone numbers you know off the top of your head? That’s what the smartphone did. Now consider the impact of ChatGPT in a year, and the super intelligence in 10.

youtube · AI Governance · 2025-08-01T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyCLBnvRRW2NkIF-eB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzg8ZN-A0jXFzfapHF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgywNEQUtuam7n9Eg_t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwkL2crjPVckKNTi-N4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwTTE2fXPu2lDSZyFd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzKbXejLP_Zosm4lKZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy-Fp8Dg6Vx9plRWKh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx9DWWWUV3RIxzm-Tt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwPYaOTqr66o2fFqld4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxXExKUrAeSNQHtD_d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
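The raw response is a JSON array with one object per coded comment, keyed by comment ID across four dimensions. A minimal sketch of parsing and indexing such a response (the `parse_codings` helper is illustrative, not part of the tool; the two sample entries are copied verbatim from the response above):

```python
import json
from collections import Counter

# Two entries copied from the raw LLM response above; the full array has ten.
raw = '''[
 {"id":"ytc_UgyCLBnvRRW2NkIF-eB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugzg8ZN-A0jXFzfapHF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

# The four coding dimensions plus the comment ID, as seen in the response schema.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw coding response and index the rows by comment ID,
    rejecting rows that are missing any expected key."""
    rows = json.loads(text)
    for row in rows:
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id')}: missing keys {missing}")
    return {row["id"]: row for row in rows}

codings = parse_codings(raw)
print(codings["ytc_Ugzg8ZN-A0jXFzfapHF4AaABAg"]["emotion"])  # fear
print(Counter(row["responsibility"] for row in codings.values()))
```

Indexing by ID mirrors the "look up by comment ID" workflow above: once parsed, any coded comment's dimensions can be retrieved directly from its `ytc_…` identifier.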