Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples
They just don't want us retiring before they can replace us.
AI isn't there yet…
rdc_l56estu
Technically, that’s not allowed: the operators can give the way a command BUT th…
ytr_UgzSHqKDA…
User error....my sorority sisters and I rode in one 35 minutes to the mall with …
ytc_UgxTxYvZm…
It's what these companies count on, opting you in and hoping you won't opt out. …
rdc_m7g5er9
In a vacuum, I have no problem with Ai art, but technology isn't developed in a …
ytc_UgwBijU0p…
even after using gptinf, Winston AI often still detects that ai feel. helpful to…
ytc_Ugw80uNbw…
I might die in a motorcycle accident, but I'll be damned if its by a robot.…
ytc_UgzpjKvL0…
Broke my dominant hand and had to do my final art exam with my left hand, I put …
ytc_Ugy4mzWjx…
Comment
👋 AI PhD here :
"We've never had to deal with smart things than us" : we can't be so sure, as our intelligence couldn't be developed enough to interact consciously with a higher intelligence : it's very much possible we can't see or comprehend what's more intelligent than us. If so : why would higher intelligence bother look at us and make us 'understand' they're there ? When's the last time you thought about chatting with a worm ? Do worms think we're beings, or just parts of the weather ?
Maybe we've already been "dealt" with by higher intelligence, but we don't know it. And maybe AI will be so intelligent that it will find a way to "disappear" : we won't know what it will have become, where it is, what it wants. We may be creating super-intelligent, potentially invisible creatures...
youtube · AI Governance · 2025-07-13T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugza10P-HWB5qEG2UUt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyiXRu2_8wPp-gNE9x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwLLRZG1RWsseKwv8Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwnip6DFqRyIVQWtAV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwwMfafSVtUXMk0poF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx-xHtqqtz4e4aEnuN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgziEw3_Q1HQI3YEpbd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxFqRwPGF_p6Ixidfl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw4f2sFCRHiQmihpzV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwmam9x38wfiXh5Ezt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
```
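The lookup-by-comment-ID step can be sketched in a few lines: parse a raw LLM response like the array above and index each coding by its `id` field. This is a minimal illustration, not the tool's actual implementation; the function and variable names are hypothetical, and only two entries from the response are reproduced for brevity.

```python
import json

# A raw LLM response: a JSON array of per-comment codings (two entries
# copied from the response shown above, for illustration).
raw_response = """
[
  {"id": "ytc_Ugza10P-HWB5qEG2UUt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw4f2sFCRHiQmihpzV4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw response and map each comment ID to its coding."""
    return {entry["id"]: entry for entry in json.loads(response_text)}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_Ugza10P-HWB5qEG2UUt4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself indifference
```

A dict keyed by ID keeps the lookup O(1) per comment, which matters when scanning codings across many batched responses.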