Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugy-I1I05…: "AI is a weapon against sapience and everyone who uses it should be publicly sham…"
- ytc_Ugy9CqRy7…: "Robot : i want to see if it works on humans or not, gun 🔫 😂…"
- ytc_UgzEsVP21…: "umm.. hank.. humanity just figured out how to plate silicon wafers with GaAs.. w…"
- ytr_UgwT9uT3G…: "@prettysureimhere Not because of AI, moreso because these people have been posti…"
- ytc_Ugx73JSCO…: "ai is a tool to speed up development not to replace it manager bozos will find o…"
- ytc_UgydUXoO3…: "I don't refer to AI "art" as such (coming from someone who is both a programmer …"
- ytc_UgylmxYA0…: "Just a thought, it's far cheaper to put a pair of AR glasses on a 'meat robot' a…"
- ytc_UgwEzniI7…: "AI is crap. Narrow use product trying to pretend it’s something more. But it wi…"
Comment
The letter calls for a six-month pause on building new AI tools that are 'more powerful' than OpenAI’s GPT4. Elon's FSD has virtually nothing to do with GPT4 or large language models (what GPT4 is built on). You cannot ask FSD a question or do anything with it other than navigate obstacles. The optimus bot would ultimately use features like GPT4 but is rather far still from that point. It would likely yet be a couple of years before Tesla would be pushing optimus 'beyond' GPT4.
youtube · AI Governance · 2023-03-30T13:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugz8_D49DNulyYkez-x4AaABAg.9ns0FKOfwbE9nsJ45K7HRT","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugz8_D49DNulyYkez-x4AaABAg.9ns0FKOfwbE9nsL4RLsDqH","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgzNYd6SczurtgcRORN4AaABAg.9nuWABWeys-9nuZ4JwI2vF","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugz3B6LWk0kUPYxTNWZ4AaABAg.9nuRzIKwM719nuUTuwh3J3","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgyRAcP3-osLT2MLvS54AaABAg.9ntPv_a4bDr9nu76qjALQI","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_Ugw1NPXw4KKjsr3zRhJ4AaABAg.9nt4oSz9gfP9nu7UsF-0nI","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgxxjLKIq9nF6j2inTV4AaABAg.9nt43c8Xhsv9nu23sTrnOH","responsibility":"unclear","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwRba43u_waxtwIdBp4AaABAg.9nszddaaR_z9nt3lJnecEm","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
{"id":"ytr_UgwRba43u_waxtwIdBp4AaABAg.9nszddaaR_z9nt6G4N3kff","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgwRba43u_waxtwIdBp4AaABAg.9nszddaaR_z9ntQAsCpiZV","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
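The lookup-by-comment-ID flow above can be sketched in a few lines: the raw LLM response is a JSON array of coded records, one object per comment, carrying the same dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion) keyed by comment `id`. This is a minimal sketch, not the tool's actual implementation; the sample IDs and the `lookup_by_id` helper are hypothetical.

```python
import json

# Hypothetical raw LLM response in the same shape as the array above:
# a JSON list of coded comments, keyed by comment id.
raw_response = """[
  {"id": "ytr_example.1", "responsibility": "developer",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytr_example.2", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

def lookup_by_id(raw, comment_id):
    """Return the coded record for a comment id, or None if absent."""
    records = json.loads(raw)
    # Build an id -> record index so repeated lookups are O(1).
    index = {rec["id"]: rec for rec in records}
    return index.get(comment_id)

record = lookup_by_id(raw_response, "ytr_example.1")
print(record["emotion"])  # resignation
```

In this sketch a missing ID simply returns `None` rather than raising, which matches an inspection page that may be asked for IDs the model never coded.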