APIs, GPUs, and drivers: CAD graphical conspiracy?

Graphics performance is no doubt key to CAD productivity. Common recommendations involve running up the bill with professional-level GPUs in certified hardware configurations. But is such a setup a wise investment? Hardware and software vendors tout how certified professional graphics cards are all that, and throw down the benchmarks to prove it. Many CAD hardware enthusiasts, however, contend that pro cards are perhaps an alien conspiracy designed to empty your pockets, and that consumer-grade gaming GPUs are up to the task at a fraction of the cost. The truth in the bewildering world of CAD graphics is complicated. But it’s out there. You see a pattern emerging here, Scully?


Supernatural hardware shenanigans

The first point of contention in our little investigation: are professional GPUs running superior hardware? The answer is mostly no. Modern pro-level graphics cards share the exact same core hardware as their consumer counterparts and, as far as the silicon is concerned, have equal graphics compute potential. We touched on this particularly irritating fact when making graphics card recommendations for our kick-ass CAD workstation.

There’s minor variation with regard to binning, a common practice with mass-produced silicon, where specific chips are sorted based on their performance on internal quality tests during manufacture. Chips that well exceed the quality threshold are “binned” as more expensive professional parts on an assertion of reliability, while those that don’t do as well, but nonetheless still pass, are destined for lower-cost consumer markets. But the GPU cores are still the same.
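The sorting step above can be caricatured in a few lines of code. Everything here is invented for illustration (the thresholds, the tier names, the `bin_chip` helper); real binning criteria are proprietary and far more involved:

```python
# Toy sketch of GPU die "binning": identical dies are sorted into product
# tiers by their factory quality-test scores. Thresholds and tier names
# are made up for illustration.

def bin_chip(score, pass_threshold=0.90, pro_threshold=0.97):
    """Assign a die to a product tier based on its quality-test score."""
    if score < pass_threshold:
        return "reject"        # fails QA outright
    if score >= pro_threshold:
        return "professional"  # well above threshold: sold as a pro part
    return "consumer"          # passes, but heads to the consumer line

tiers = [bin_chip(s) for s in (0.99, 0.93, 0.85, 0.97)]
print(tiers)  # ['professional', 'consumer', 'reject', 'professional']
```

The key point the sketch makes: every die went through the same fab and runs the same design; only the sort at the end differs.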

Furthermore, the improved reliability of modern chip foundries makes such distinctions practically irrelevant. In some cases, ECC memory is used onboard pro-level cards, but that’s not going to matter all that much either. Selling the same thing at two different price points sure does sound like something the cigarette-smoking man might use to pull your chain.

It gets worse

Believe it or not, in past years some professional-level cards actually carried GPUs that were a generation behind their consumer equivalents. While that smells like the worst kind of conspiracy, most of this had to do with the longer lifecycle of CAD workstation hardware, where machines are typically leased or purchased on 3-5 year cycles. During that same time, hardcore gamers on the consumer side were pining after new and exciting hardware every 6-12 months.

Despite the fact that the silicon is the same, there’s a deeper mystery. Take a pro card and a consumer card on the same workstation, fire up SPECviewperf, run some tests (say for SolidWorks, NX, or CATIA) and you’ll find that consumer cards get destroyed in the benchmarks.

Every. Single. Time.

If there’s no magical flux capacitor in there to grant everlasting CAD goodness, then how, save for intervention by the supernatural, is such a thing possible? The truth, just like that Fight the Future movie, may just make you shake your head in disapproval. The secret sauce is in the drivers.

Graphics driver secret police

Level two of the conspiracy is the graphics drivers, which contain optimizations that specifically accelerate a variety of professional graphics applications, including CAD. The drivers are engineered to run only on the professional cards. Not because the hardware is more capable, but because they just won’t let you.

Consumer graphics hardware is excluded from the inner circle.
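To make the point concrete, here’s a deliberately simplified caricature of how a driver could gate its optimized path by product SKU rather than by actual hardware capability. The device IDs and tables below are entirely hypothetical; real drivers identify hardware differently:

```python
# Caricature of SKU-based driver gating. Device IDs are made up.

PRO_DEVICE_IDS = {0x1BB0, 0x1BB1}       # stand-ins for pro-line SKUs
CONSUMER_DEVICE_IDS = {0x1B80, 0x1B81}  # stand-ins for consumer SKUs

def enable_cad_optimizations(device_id):
    # The check keys off the product line, not the silicon's capability.
    # A consumer card carrying the same GPU core is simply turned away.
    return device_id in PRO_DEVICE_IDS

print(enable_cad_optimizations(0x1BB0))  # True  (pro card)
print(enable_cad_optimizations(0x1B80))  # False (same chip, no dice)
```

Same silicon on both sides of the check; the only thing being tested is which product line you paid for.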

Understandably, when confronted with such an unsettling truth, some enterprising CAD enthusiasts revolted, finding ways around such artificial limitations by soft or hard modding consumer cards to run an optimized driver. You can bet it wasn’t long before the graphics card secret police closed the loophole and disposed of the evidence.

How can such shadowy evil be justified? Truth is, quite a bit of time and effort goes into creating and maintaining those driver optimizations. The process of creating certifications for each driver version, across the plethora of available hardware configurations and CAD platforms, is onerous. So the cost of all that is passed to those who directly benefit (the professional users), while gaming cards are denied that privilege because their owners just didn’t pay to play.

But that’s still not all of the truth; the web of graphical intrigue goes deeper still, into very old battles over technology standards. Take, for example, benchmarks with AutoCAD, where the gaming cards actually prevail. How does something like AutoCAD manage to perform without optimized drivers? It’s all about an old war over graphics application programming interfaces (APIs).

The API enigma

As we peel back the onion layer by layer, we find that most CAD platforms depend on a graphics API called OpenGL. It’s a fitting joke that the API sounds like a bad 90s band, considering that the OpenGL implementation for CAD hasn’t changed significantly for about as long. OpenGL’s strength is its extensibility, which was key to introducing CAD-specific rendering optimizations at a time when consumer-facing 3D graphics were in their infancy.
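That extensibility works through a simple mechanism: the driver advertises a space-separated list of extension names, and an application checks the list before enabling an optional rendering path. Here’s a minimal sketch in Python; the extension string is hard-coded as a stand-in, where a real OpenGL program would fetch it with glGetString(GL_EXTENSIONS) (or glGetStringi per index in modern GL):

```python
# Sketch of OpenGL-style extension detection. The reported string is a
# hard-coded stand-in for what a driver would actually advertise.

reported = "GL_ARB_vertex_buffer_object GL_EXT_framebuffer_object GL_NV_depth_clamp"

def has_extension(ext_string, name):
    """True if the driver advertises the named extension."""
    return name in ext_string.split()

if has_extension(reported, "GL_EXT_framebuffer_object"):
    print("off-screen rendering path enabled")
```

Vendors could (and did) ship CAD-oriented features this way without waiting for a new version of the core API.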

OpenGL also featured prominently in the early evolution of 3D gaming, getting a notable boost from Quake courtesy of John Carmack: full-time genius, the man behind Doom, failed rocket engineer, and now fully indentured Facebook employee. But that was chiefly a time when games looked like some kind of neo-cubist artistic movement, with rough polygons big enough to land a plane on.

Gaming and consumer 3D has evolved significantly since. In the intervening years, Microsoft waged war on OpenGL by creating, promoting, and evolving their own 3D API for Windows called DirectX. It’s a war they handily won, and that consequently many have forgotten. That’s why most consumer cards have crap OpenGL drivers: it’s something no one cares about anymore, not even Carmack. It’s a great topic to argue about at the local bar, after you’ve already flipped the table over arguing about Linux as a viable consumer OS and the host has not-so-politely asked you to shut the hell up and talk about something else (like that’s ever happened).

In 2010, Autodesk abandoned OpenGL and embraced DirectX. And now you know why AutoCAD works well with consumer-level cards.

With a DirectX rendering pipeline, consumer-level graphics are more than enough for all but the most complex CAD applications. Expect more CAD platforms to follow in the future. Not to mention that for any browser-based CAD platform, the whole question is moot.

So will the very foundations of the professional graphics card market come crumbling down? Here comes the part that may make you happy then make you sad. Like when Scully and Mulder stare at each other for entirely too long.

The hard truth

CAD models just aren’t complex enough to justify professional graphics anymore. Respectable 3D capability is becoming a commodity, as even common handheld devices start pushing enough pixels to handle the average design.

CAD is no longer near the tip of the spear for graphics rendering.

That may sound like an insult, or worse, grounds for a fight. While some of the most complex CAD assemblies still justify the current market segmentation, the average model bears no comparison to the mind-boggling developments in other corners of the professional graphics market. CAD is competing with the people who are busy rendering procedural fire or cranking out the next Star Wars. CAD is overshadowed by James Cameron raising the bar on giant cat people, or Pixar generating photorealistic landscapes for The Good Dinosaur.

OpenGL extensions for professional graphics functionality still matter to a point, but they won’t matter for CAD much longer. The frustration will lie with the CAD software providers, many of whom may be slow to adapt to this reality. But with every tablet that’s used to run CAD, and every CAD user who just runs a consumer card and doesn’t much miss the professional option, that reality gets ever closer.

A Part Number Anthology

Part numbering. For most engineers, this two-word phrase is all it takes to conjure up especially strong feelings about what it means to be “right”, and what it means to be very, very “wrong.” We've assembled a handful of our part number greatest hits in this eBook anthology.

  • Evan Sullivan

    Hi Ed

    This is a refreshing article. Many of your points are spot-on, especially regarding the need for massive graphics compute power. However, I take issue with one point you’ve made in comparing Direct3D to OpenGL:

    “Not to mention that for any browser-based CAD platform, the whole question is moot.”

    At least with GrabCAD’s own viewer and OnShape, the question is not moot. Both use WebGL for 3D rendering, which is based on the OpenGL API. It seems that as browser-based applications become more popular, the demand for high-quality OpenGL drivers will increase for consumers and professionals alike. Direct3D is propped up by the gaming industry alone.

    • Evan,
      You’re absolutely right about WebGL.

      The thing is, some browsers like Chrome or Firefox on Windows use ANGLE to convert WebGL into Direct3D calls. Why do they do this? Because DirectX has better driver support on the consumer cards.
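One way to see this for yourself: the WEBGL_debug_renderer_info extension exposes an “unmasked renderer” string, and when ANGLE is translating to Direct3D, that string says so. Here’s a small Python sketch of parsing such a string; the sample value is a typical ANGLE-on-Windows renderer string, used as a stand-in for what the browser’s gl.getParameter call would return:

```python
# Sketch: inferring the WebGL backend from an unmasked renderer string.
# The sample strings below are stand-ins for browser-reported values.

def angle_backend(renderer):
    """Return 'direct3d' or 'other' if ANGLE is in use, else None."""
    if not renderer.startswith("ANGLE"):
        return None  # WebGL is talking to a native GL driver
    return "direct3d" if "Direct3D" in renderer else "other"

sample = "ANGLE (NVIDIA GeForce GTX 970 Direct3D11 vs_5_0 ps_5_0)"
print(angle_backend(sample))  # direct3d
print(angle_backend("NVIDIA GeForce GTX 970/PCIe/SSE2"))  # None
```

So even “OpenGL in the browser” can end up riding the consumer cards’ better-supported Direct3D path on Windows.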

      Now granted, there’s no reason better OpenGL drivers couldn’t be available, but that might undermine a purposefully bifurcated graphics market. Chances are things will get interesting in the future, with the repurposing of things like AMD/ATI’s Mantle and perhaps a new API Nvidia may be cooking up. The push will be toward democratization, at which point hardware with the same chips should perform similarly, don’t you think?

      • Evan Sullivan

        You’re right about ANGLE, Ed. I didn’t realize Chrome and Firefox were converting WebGL calls to equivalent Direct3D calls. I’d be happy to see the split between OpenGL and Direct3D disappear, but if that happens by way of AMD and Nvidia both releasing their own low-level APIs, I’m not sure we’d be better off. Application developers would have to choose which brand to optimize for, causing consumer confusion about which brand to get, rather than confusion about which class of card to get.

        It looks like AMD is trying to democratize the low-level API space by donating Mantle to the Khronos Group, which is now branding it as Vulkan. It would be nice to see Nvidia join that effort with their own low-level API ambitions.

        Thank you for the food for thought.

  • cadman777


    Great review!
    Interesting facts!

    Please allow me to add my experience:

    I do CAD work for a living (20+ years), and can say beyond any doubt that the newer GeForce 660Ti (3 gigs of ram) can NOT handle what the older Quadro FX4600 (768 megs of ram) can. I switched back to a legacy Quadro and will use Quadro EVERY TIME over GeForce (or gaming cards) b/c of performance issues. I found this to be true of other ‘gaming’ cards vs. CAD cards.

    I use Autodesk Inventor daily. I use Autocad occasionally. My Autocad installation can’t use the GeForce cards, due to too many compatibility issues. Video card drivers MUST be certified to work w/Autocad. I don’t like it any more than anybody else, but ‘that’s life’!

    Cheers … Chris

  • Chris Leamon

    I use a Radeon R9 290X at home and a Quadro 4000 on my workstation at work. By all rights my home computer is far superior to my workstation in CPU, RAM, and supposedly graphics, but while my home computer runs Creo 2.0, it doesn’t do it nearly as well as my workstation.
    There must be more to it than that, although I wish there wasn’t; those professional cards are outrageous.

  • Dr. Walter Black

    I like the comments so far, many of which do not support your approach. Do you have any studies/tests to back up these assertions?

    Also, what about Revit, MAX, Maya? Do these need professional graphics cards? Again, do you have testing data?

    • Walter,

      In case you missed the link within the article, here are some interesting AutoCAD benchmarks: http://www.tomshardware.co.uk/best-workstation-graphics-card,review-32728-6.html

      You’ll find from application to application your mileage may vary, depending on which applications are designed to use OpenGL versus DirectX, and how each API is specifically implemented. Take Maya, for instance, which actually gives you a choice of renderer in Viewport 2.0: https://knowledge.autodesk.com/support/maya-lt/learn-explore/caas/CloudHelp/cloudhelp/2016/ENU/MayaLT/files/GUID-BF017019-B89A-47F0-8AB5-106C058AB854-htm.html The default setting is OpenGL, by the way. As you can see, results can be perplexing: http://www.tomshardware.com/reviews/geforce-gtx-780-ti-review-benchmarks,3663-12.html

      Most CAD software depends on OpenGL. The driver optimizations for OpenGL only support the professional graphics line, not because your consumer hardware is incapable, but because the market is purposefully bifurcated (because reasons). What’s interesting (and the core of my message) is that some users run consumer cards anyway, despite suboptimal performance, simply because such a setup might be good enough for what they’re doing. Again, it depends on what you’re working on; your mileage will certainly vary, which is why everyone is so confused about graphics choices on most days.

      I’d love to put together something to provide more definitive guidance, but with the hardware and software I have access to at the moment, it’s going to be apples and oranges. However, if that’s the sort of thing you and others would be interested in, let the folks at GrabCAD know.