

The base M4 is a very small chip with a modest memory config. Don’t get me wrong, it’s fantastic, but it’s more Steam Deck/laptop than beefy APU (the M4 Pro is the closer analogue there).
$1200 is pricey for what it is, partly because Apple spends so much die area and process budget on keeping it power efficient, rather than (for example) using a smaller die or an older process and clocking it higher.
Grok is the laughingstock of the ML field. It’s horribly inefficient, its performance is not good for its size/compute, and its “leverage” (Twitter’s userbase, Tesla cars) is objectively at risk. They’re even more closed than OpenAI, and much more so than Google. They only exist because Elon burned billions on a shit ton of H100s and seemingly copied what others are doing.
xAI (so far) is a joke. That could change, but unless they do something interesting (like actually publishing a neat paper or model), I’d short them if they were even public.