Yes, companies shouldn’t have to endlessly seek growth, but it’s often what shareholders/the market expect and demand.
Oh, but I definitely want more multicore performance out of my machine. Single-core power is seldom needed beyond 3.0GHz nowadays.
I’ve read benchmarks and I know the Xeon W will be surpassed in a few years. But the big L1/L2/L3 caches do make a ton of difference in certain workflows.
But as mentioned above, I don’t disagree there are other very solid offers. It’s just that I feel that the iMac Pro is something that can last you a long time.
I think @hubertlepicki and others underestimate the insanely fast and powerful components Apple often picks for their higher-end offerings. It’s no secret that 2017’s iPad Pros are more powerful than most laptops out there (save for the dedicated gaming ones).
I’ll say to him and the others again: I agree it’s expensive. Most of us can’t just shell out north of $13,000 without our savings taking a hit. But it’s not as much more expensive than everything else as they claim. If you compare a Samsung 840 EVO SSD with a modern NVMe drive that needs 4x PCIe lanes to reach full speed, then of course the price difference will be noticeable.
But I grow tired of arguing. I simply disagree with people when they conflate “alternative is good enough for me” with “Apple is overpriced”.
L-level caches, man. They do make a difference.
Possible but unlikely. Both computing and tech prices are plateauing no matter what price tags Apple or anybody else wants to put on their tech. Keep increasing the price, people will just stop buying.
My point is: I don’t intend to simulate VR universes. Whatever I do nowadays I can keep doing it on the Xeon W 18-core / 128GB RAM / 4TB SSD some 15 years from now.
But I won’t argue this point because truth has often been stranger than fiction. We don’t know what is coming.
I agree. They and Samsung keep pushing the envelope by making their flagship phones a little bit more expensive every year. This will stop pretty soon (it might even have already started, look at the iPhone XR for example). First, saturation as you said. My mom tells me to not bother buying her a new phone until that one breaks (iPhone 7). Many others are like her. Second, the added value you get from the flagship is dwindling and has been for a few years now. For example the only thing that the XS Max is tempting me with is the bigger screen. But if I turn a blind eye to that then the iPhone X can and will serve me just fine for years still.
Not exactly the same with computers though. For example, the normal iMac 27" 5K still ships with i5 CPUs in all but one configuration. This is ridiculous and a scummy way to eventually upsell better tech. But people like you, me, and probably everyone on this forum are technically educated enough to pick the right option.
Apple’s tactic is to hold on to older tech and sell it at today’s prices for as long as possible. I dislike the tactic, but I can use my knowledge and Google-fu to separate the wheat from the chaff.
To me, the iMac Pro is the wheat and the iMac 5K with i5 CPU is the chaff.
No corporation will ever stop until the market corrects it. Even governments can’t stop them: if they devise a tax law or a limitation of any kind, the company’s lawyers and accountants will find, within a few short weeks, a loophole in a regulation the government might have spent years crafting. It’s always been like that. So until people stop buying – or the amount of refurbished tech, refunds, and returns increases – Apple and every other corp won’t ever stop.
Still doesn’t mean they don’t have good products.
The iMac Pro for example is not designed to extract a huge premium from you – people have compared the prices and whether Apple opponents like it or not, the price is reasonable for the tech you get because you get server-grade components, extended warranty and world-class professional display. That particular machine is designed to lure you in and vendor-lock you.
Not all profits are financial.
But yes, I agree that Apple is not doing things in a customer-friendly way. If you are not technically educated you will likely end up paying a lot for underperforming tech in an awkward setup (like the iMac 5K with 8GB RAM, what the hell?).
Noticeable, yes. But OP was claiming much more than that: that there are 4TB SSDs “way below” the $1,000 price range, which appears to be completely false, even for the slow ones. I find this is typical of people who bash Apple’s pricing: not only do they price non-comparable components, they also use imaginary prices for those non-comparable components.
That’s what I am observing and it’s not only on this forum. It’s not constructive.
Building straw-men so your emotional (or not well-informed-yet) argument looks more legit is very normal human behaviour. But it doesn’t help anybody to get to the cold facts. I am ashamed to admit I’ve done it many times in my life. The struggle to never do it again is still ongoing…
Basically the response I got earlier was: “the components might be similar prices to the competition but I don’t need a professional-grade 5K monitor so you know, the iMac Pro is overpriced” – which I keep repeating is not a good argument. And I keep saying that if the alternatives work well for you then okay, nobody is arguing what you should like or use, of course. But it’s not a reason to claim that which you don’t use is overpriced.
I don’t need a private jet. But I don’t claim the private jets are overpriced.
I quite agree; that is why the AMD chips are so nice: they have SO many cores. Slightly slower, sure (still well above 3GHz), but there are so many of them!
The new chip coming out next year looks especially interesting. AMD is enhancing the IO bus: there’s a dedicated 14nm IO die on the package, with eight 8-core 7nm chiplets surrounding it and lower latency than their prior chips. That’s 64 cores (128 ‘threads’), and it looks very nice (though this is enterprise-grade, of course).
I think they just leaked info that the prototypes for these exceed expected performance, and that they will raise the original performance goals to take full advantage of the 7nm architecture. Of course, this may be just intentionally spreading gossip for PR.
It certainly could be, but if you look at the past couple years’ history of what AMD claims in advance vs what they deliver, they’ve been really solid–while during the same time Intel has missed over and over and over…
I’m moving away from Apple equipment in general. Mainly because I expect my machines to last more than a year or two. Every release of macOS or iOS seems to require more resources to run, until the machine becomes useless.
I have an old 17" MacBook Pro that I loved that is unusable with MacOS Mojave. It’s running El Capitan because anything later runs like crap. My iPad 2 refuses to upgrade to anything after iOS 10, which is okay since it doesn’t run well under iOS 10 anyways.
My work system is a mid-2015 MacBook Pro, and the upgrade to Mojave was okay, but after it spontaneously rebooted a couple of times while I was working I’d had enough and installed elementary OS. Everything feels faster, and elementary OS is enough like macOS that, other than retraining my fingers to use Ctrl instead of Cmd, it’s comfortable to use. The only thing that didn’t work was the camera.
Oh, and if you play games, the Mac is just a bad choice. (I foolishly tried to play Firewatch on my old Mac mini with its 0.2 fps refresh rate.)
So, no Apple for me. The value for the price just isn’t there.
You should also take into consideration that with the new Zen architecture they really do deliver on their promises. Since the first Zen chips there has really been no point in buying Intel CPUs for desktops, unless you really need those few extra frames in a game; but then again, if you want to do something besides gaming (like streaming to Twitch), Zen chips are still better.
From late 2011 until late 2017 I was using a hexa-core AMD chip, the 1100T; I’m still using it as a desktop in my office. It was AMD’s top-of-the-line chip, but certainly not the best chip at the time (Sandy Bridge was much better overall). Nevertheless, it’s still powerful enough that I can do all my day-to-day work without any performance problems, and I can’t say that about the Macs my employer has given me.
Since late 2017 I’ve had a 1700X, an 8C/16T chip. I don’t expect to change it in the foreseeable future, because I don’t really know what I would need to do for it not to be enough.
I also have a really, really fast NVMe drive, and you know what… it’s wasted money. I literally don’t see any difference between this NVMe drive and a several-times-slower SATA SSD. So yeah, Macs are overpriced, because even if they have “premium” components, it’s premium for the sake of being premium; it’s not noticeable in day-to-day work.
Threadripper 2970WX is half the price of the Xeon W-2195, has a higher base clock, more cache, and more cores. I’m not sure you need a Vega card; if you do ML, there are dedicated chips for that. Not sure about video editing, but if you want to game on a 5K monitor at native resolution… With Threadripper you can use 2933 DDR4 ECC RAM. I’m really not sure anyone would notice a difference between that and the 2666 you get in iMacs, but if we’re talking premium, I believe 2933 is “more premium” than 2666. Not to mention the motherboard would probably be cheaper for a Threadripper than that Apple custom one.
But in the end, all of the above really doesn’t matter. Apple is a closed ecosystem when it comes to software and especially when it comes to hardware. There is almost no way to add or change your hardware. If, for example, you need a hardware RAID, sorry, you’re out of luck; Apple users apparently don’t need that. If you need to add more drives, you can only do it through USB. Another Ethernet card? A proper sound card? Sure, you can, but only through USB/Thunderbolt, and the list goes on and on.
I think it actually boils down to: this is overpriced for us programmers. And one can draw the conclusion that this is simply not a computer for programmers.
I would like to hear from someone who is doing heavy video processing or, say, 3D scene rendering that results in gigabytes of I/O very quickly; they may in fact have noticed the difference.
I have a friend who does video editing as his day-to-day job, and for him it’s always a CPU-bound task, not an I/O-bound one. Most of the time he keeps videos on external drives and only needs the internal SSD when there are many combined streams he has to edit, but he does 1080p, not 4K stuff.
Still, 500 MB/s sequential read/write is normal for an SSD. Random read/write is a bit trickier to estimate, because it depends on many factors, but even at, say, 20K IOPS with 4KB blocks that’s about 80 MB/s. And 4K video is rarely more than 30–40 Mbit/s, with some high-quality material going up to maybe 80 Mbit/s.
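As a quick sanity check on the arithmetic above, here’s a back-of-the-envelope sketch. The specific figures (20K IOPS, 4KB blocks, the bitrates) are illustrative assumptions, not measurements:

```python
def iops_to_mb_s(iops, block_size_kb=4):
    """Random-access throughput: IOPS times block size, in MB/s."""
    return iops * block_size_kb / 1024

def mbit_to_mb_s(mbit_per_s):
    """Convert a video bitrate in Mbit/s to MB/s (8 bits per byte)."""
    return mbit_per_s / 8

# ~20K random 4KB IOPS is a plausible SATA SSD figure:
# works out to roughly 80 MB/s, matching the number quoted above.
print(iops_to_mb_s(20_000))   # 78.125 MB/s

# A 40 Mbit/s 4K delivery stream only needs a few MB/s of throughput...
print(mbit_to_mb_s(40))       # 5.0 MB/s

# ...whereas a multi-Gbit/s pro-camera source stream (2 Gbit/s here,
# assumed) needs hundreds of MB/s, which is where NVMe starts to matter.
print(mbit_to_mb_s(2000))     # 250.0 MB/s
```

So even a plain SATA SSD has an order of magnitude of headroom over compressed 4K delivery bitrates; the gap only closes with uncompressed or lightly compressed source footage.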
3D rendering is most definitely CPU/GPU bound.
Simple transcoding from one format to another, well, it depends; in that scenario NVMe could probably make things faster when you have 32 cores. But then again, it’s something that requires no user input at all and is mostly done when you’re not working on your PC. You can’t really transcode one video and edit another on a 4-core Intel CPU; maybe with a 32-core one you could, I really don’t know, but it’s an unlikely scenario.
So my conclusion is that having a very fast NVMe drive is like having a Ferrari for doing groceries.
That’s compressed and ready to stream. What comes off a current pro-level 4k camera, and what video folks work with as source, is 1-3Gbps. Even older, lower-end 4k cameras are way higher than 30-40 Mbps.
Hey, don’t worry, Apple wasn’t picking on you because you were running an older machine; it happens to me on my brand-new one. Oh, except maybe I should note one little thing: it never did it until I finally had to install Docker.
I thought about that but the fact that Apple could render my install useless with an update is something I wouldn’t want to deal with.
Yes and no. Yes, because it’s not cheap for sure, and we can get the same results in our normal daily work with machines that are several times cheaper – that is true, but only if we are talking about our daily programmer work and nothing else. No, because if you get the iMac Pro, you don’t have to think about upgrading your machine for 10 years, if not 15. And unless every app or thread in some future OS requires one physical core, I don’t see how a 16- or 32-core machine will be obsolete or underpowered anytime soon. (And if such an OS ever comes into existence, it will probably require at least 4096 cores anyway; have you seen a freshly booted Ubuntu’s…)
Be objective and speak for yourself. I have future ambitions in simulation areas that involve having 100 to 1,000 minimalistic VMs active at the same time. For such loads, a CPU with a medium number of cores and huge L-level caches, 128GB of RAM, and NVMe drives – again, the iMac Pro with its Xeon W-2195 – is perfect. (Furthermore, my wife does 3D modelling and her GPU is currently holding on for dear life. Dual Vega 64 will shoot her into open space.)
I concede that AMD is well on their way to making the Xeon W obsolete. If they release a 32- or 64-core CPU with 64MB+ of L-level caches, I’ll definitely pause my ambitions for the iMac Pro and observe what machines get unleashed onto the market using those components.
So, AMD’s new chips? 64 cores (128 ‘hyperthreads’). The cache hasn’t been revealed yet, but I’d entirely expect 64MB or more based on the current-gen chips.
I think they advise (/warn!? ) you not to update the OS when you’ve got a Hackintosh - it’s the thing that put me off too.
Re AMD’s multicore CPUs (and talking about them for production servers) we had a thread about it here: AMD’s new multicore CPUs (Ryzen, Naples) and I can’t remember whether it was that thread or another now, but someone (think it may have even been ODL) said they weren’t a good fit for the BEAM (I can’t remember why now - I might have even dreamt it! )
Re whether Macs are specifically built for programming or not, I don’t think any computers or OSes really are – programming itself doesn’t need anything special (apart from maybe some degree of compatibility with the production environment). However, programmers need certain things, and need them to be nice: being able to browse the web easily and safely, make notes, edit graphics, produce video or screencasts, email, and even seamless integration with our phones, etc. i.e. we use them for more than just programming.
I like how intuitive and generally reliable Macs are, and how they make for a pleasant overall user experience.
Having said that, I wish a viable competitor would surface - one with all of Apple’s positives but none of their (increasingly unpalatable) negatives.
Even with that I will wait because they just announced the Zen2 platform. It’ll be a while before the next-gen AMD machines surface.
Now they have a wider bus and thus their RAM latency will drop.