Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgxITIL0G…`: This should scare people. So many dont know what we just witnessed. AI will cont…
- `ytc_UgwqgDudS…`: Key word => " At this time ". That quantum chip will be steroids for ML and AI. …
- `ytc_UgyuD1_IU…`: It won’t take long before call Centers replace humans with AI and that’s 2 milli…
- `ytc_UgyWCKSP6…`: You can still do it if you run a private local llm. It wont be able to access li…
- `ytc_Ugwhk8An3…`: I’m all for AI robots doing things from us but people shouldn’t suffer financial…
- `ytc_UgzCFbxMh…`: problem with this is accountability, the replacement of CEOs and Managers means …
- `ytc_UgzGR-qOp…`: For such a brilliant person, it’s difficult for me to think he couldn’t foresee …
- `ytc_UgyQIhw3n…`: Can all of us breakers commit to no ChatGPT or sora or anything of the sort? The…
Comment
what kind of expert is this, you guys are missing the point!
Superintelligence even if its ultra or whatever will be modeled around human will, ever heard? "God made man in his image," well we made AI in our image, it will do what we predict because it has learned from us.
I think that we will colonize planets, we should engineer superintelligence to take us to the moon first and then to mars, robots just build homes for us and we terraform planets and live and repopulate, just keep expanding. Nobody knowing shit is good, just forget how we made intersetallar travel possible, make it look like speaking or breathing. We get the existence privelege, I mean what the fk would superinteligence even do without us, kill us and then kill itself?
Dont panic, its going to be good, we'll just watch robots do stuff we want, build spaceships that travel with light speed.
youtube · AI Governance · 2025-09-29T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx6k78L2xakgXCXXcJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwtSk-qNSww4QXYwxd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxUCd9PIx7P1GXy-v54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx8WtVEsXBHKNC-_aZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgycjWTfNt0wWTGgBDl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxy9YbivjscvEe6vfV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyQ0qlMh3V38cwFhRd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxLnHrQ8Gk_hCUbfCl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw27x6MHmtVRqp0zoJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyV21cfOdj4OKbjqSB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
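The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and validated before storing it. The allowed-value sets below are inferred from the codes visible on this page, not from a published codebook, and `validate_batch` is a hypothetical helper name:

```python
import json

# Allowed values per coding dimension, inferred from the codes visible on
# this page (assumption: the real codebook may include additional labels).
SCHEMA = {
    "responsibility": {"none", "developer", "company", "distributed", "ai_itself"},
    "reasoning": {"unclear", "mixed", "deontological", "consequentialist"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs on this page all carry the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present and drawn from its allowed set.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_Ugw27x6MHmtVRqp0zoJ4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"approval"}]')
print(len(validate_batch(raw)))  # 1 valid record
```

Filtering rather than raising keeps a single malformed record from discarding the whole batch, which matters when one LLM call codes ten comments at a time.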