Nearly every enterprise is experimenting with artificial intelligence and deep learning. It seems like every week there’s a new survey out detailing the ever-increasing amount of focus that IT shops of all sizes put on the technology. If it’s true that data is the new currency, then it’s artificial intelligence that mines that data for value. Your C-suite understands that, and it’s why they continually push to build AI and machine learning capabilities.
Nowhere is AI/ML more impactful than in the world of government and government contractors. It’s not just the usual suspects of defense and intelligence who demand these capabilities—AI/ML is fast becoming a fact of life across the spectrum of government agencies. If you’re a government contractor, then you’re already seeing AI/ML in an increasing number of RFPs and RFQs.
AI impacts everything
I’m a storage analyst. I don’t like to think about AI. I like to think about data. I advise my clients on how storage systems and data architecture must evolve to meet the needs of emerging and disruptive technologies. These days, those technologies all seem to be some variation of containerized deployments, hybrid-cloud infrastructure and enterprise AI. There’s no question that artificial intelligence is the most disruptive.
High-power GPUs dominate machine learning. Depending on the problem you’re trying to solve, that may be one GPU in a data scientist’s workstation, or it may be a cluster of hundreds of GPUs. It’s also a certainty that your deployment will scale over time in ways that you can’t predict today.
That uncertainty forces you to architect your data center to support the unknown. That could mean deploying storage systems that have scalable multi-dimensional performance that can keep the GPUs fed, or simply ensuring that your data lakes are designed to reduce redundancies and serve the needs of all that data’s consumers.
These aren’t problems of implementing AI, but rather of designing an infrastructure that can support it. Most of us aren’t AI experts. We manage storage, servers, software or networking. These are all things that will be disrupted by AI in the data center.
The single best way to prepare for the impacts of AI in the data center is to become educated on what it is and how it’s used. The dominant force in machine learning and GPU technology for AI is NVIDIA. Thankfully, NVIDIA has a conference to help us all out.
NVIDIA’s GPU technology conference for AI
Every spring NVIDIA hosts its massive GPU Technology Conference (GTC) near its headquarters in Silicon Valley. It’s there where 6,000+ attendees gather to hear about all aspects of what NVIDIA’s GPUs can do. This ranges from graphics for gaming and visualization, to inference at the edge, to deep learning in the enterprise. It’s one of my favorite events each year (read my recap of the most recent GTC here, if interested).
Next week NVIDIA brings a more focused version of its GTC conference to Washington, DC. Gone are the sessions and talks about gaming, with the focus instead on the business of deploying artificial intelligence with a purpose. NVIDIA’s GTC DC focuses on autonomous machines, cybersecurity, computer vision, HPC, robotics and augmented reality. These are the topics that dominate discussions of AI around the DC beltway.
As much as I enjoy watching NVIDIA’s CEO Jensen Huang give a keynote, I’m much more excited to hear Ian Buck’s keynote at GTC DC. Ian Buck is NVIDIA’s vice president of accelerated computing. He’s also the man who invented CUDA, the parallel computing platform that opened GPUs to general-purpose programming and made them the engine of modern machine learning.
Ian won’t be talking about CUDA much. I expect he’ll focus on the real problems that can be solved by the technology. According to NVIDIA, he’ll talk extensively about how organizations of all types can best utilize the power of artificial intelligence to boost their competitiveness.
Beyond NVIDIA, there are speakers from the US OMB, the White House Office of Science and Technology Policy, the NIH and more. Exhibitors will include over fifty companies showing off AI, robotics, and high-performance computing. Key exhibitors include powerhouse government contracting players Booz Allen Hamilton, Lockheed Martin and Dell Technologies.
Artificial intelligence has a rapidly growing footprint in the data center, and it’s already impacting every area of IT architecture. IT practitioners, whether directly involved in AI or not, need to understand how it affects their areas of expertise.
I don’t work in AI, but I’ll still be in Washington, DC, next week for NVIDIA’s GTC DC event. It’s critical that I understand how these technologies impact the world that I live in. The best way to prepare for the future is to understand it.
NVIDIA is inventing the future of machine learning right in front of our eyes. You should come to DC next week and take a look for yourself. You’ll be in good company.