Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugz-vAMtu…`: "Who would have thought that people would want a nice looking woman robot that is…"
- `ytc_UgwD_8ldP…`: "AI or anything is not dangerous the humans who made this is what dangerous if ge…"
- `ytc_UgzW9q8Dr…`: "The code thing is when the robot knocked the guy out he still tried to get on hi…"
- `ytc_UgyFXolo4…`: "Valid points well made... I would like to be an AI and opt out please.…"
- `ytc_Ugy08TjUs…`: "There seems to be a suggestion that all companies are going to implement these t…"
- `ytc_UgwWbbuiU…`: "I knew nothing about AI and this presentation gives confidence and f*** blew my …"
- `ytc_UgwT0LRsa…`: "I personally have used AI art purely for entertainment purposes, save exactly on…"
- `ytr_UgxD4j2AG…`: "Creating a fully functional and sentient robot like C-3PO from Star Wars is a fa…"
Comment
There are few people I trust to be successful while also be wary of something at the same time... When he clearly could run crazy with AI and any number things... I've ALWAYS been a fan going back to the early PayPal days, but he is the most (BY FAR) altruistic billionaire alive.... He actually cares about other humans and this planet... Seriously, he may end up making mad money on Twitter, but I doubt he was looking much at potential profits vs leveling the playing field... Greed and power are obviously intoxicating... It corrupts too much.... Elon IMO is sorta like a wise "oracle"... Larry Page is a greedy billionaire that doesn't care about the future outside of profits... It speaks for itself unless you are super slow...
Source: youtube · AI Governance · 2023-04-27T07:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyD0kjcSazMyZFF-ed4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxeufd7gMXKY7PQcE54AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxLkTsA6HU4qGBoxBt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxzkdo1EMWzh5Mdu7J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxepxPFKY02MAc3Vsl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw9PE4CZRyV8gRIzch4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz2Uy23V-Z_KtnkA2J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzcQpMfzXcVtRv_iL94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwBgjmftahMH1MuMmV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz_pcU0hUDed1GOkvR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
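The raw response above is a JSON array of per-comment codings keyed by comment ID; the coded-result table is just one of those entries rendered by dimension. A minimal sketch of parsing such a batch and looking a coding up by ID (the `index_codes` helper and `DIMENSIONS` tuple are illustrative, not part of the tool; the sample data is abridged to two entries from the response above):

```python
import json

# Raw batch response as returned by the model (abridged to two entries).
raw = """
[
  {"id": "ytc_UgyD0kjcSazMyZFF-ed4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz_pcU0hUDed1GOkvR4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]
"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Parse a batch response and index each coding by comment ID."""
    rows = json.loads(raw_json)
    out = {}
    for row in rows:
        # Keep only rows that carry all four dimensions; skip malformed ones.
        if all(dim in row for dim in DIMENSIONS):
            out[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return out

codes = index_codes(raw)
print(codes["ytc_Ugz_pcU0hUDed1GOkvR4AaABAg"]["reasoning"])  # virtue
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: one parse of the batch, then constant-time lookups per comment.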