Press Coverage

AI in HPC on Wall Street: Getting Going Is the Hard Part (EnterpriseTech)

AI and Deep Learning Come to Wall Street (HPCwire)


EnterpriseTech

AI in HPC on Wall Street: Getting Going Is the Hard Part
by Doug Black

IT managers in the financial services industry are fully aware that high performance data analytics, AI and deep learning technologies are plentiful and available, and they know that some FSI companies are already deriving advantage from them. But they hesitate to take the plunge into AI/DL themselves because of sticker shock ($200,000 for an initial implementation?), technology unfamiliarity, a dearth of trained staff and a lack of appropriate server infrastructure.

This might not be the case at the high end of FSI, organizations with 100-plus data scientists on hand and deep IT budgets to buy the infrastructure and consulting services to execute optimized AI strategies. But at many mid-sized and smaller FSI companies, more people are talking about AI than are actually implementing it, more are looking for a way to initiate AI than are actually taking the first steps.

Helping organizations begin their AI/DL journeys is a theme of the keynote presentation to be delivered at next week's HPC for Wall Street – Cloud and Data Centers conference in New York by Lacee McGee, HPE's worldwide senior product manager for financial services industries. Her address also will highlight the growing need for code modernization: updating custom FSI software and algorithms to take advantage of new processor architectures, along with cloud implementation strategies.

"There hasn't emerged a front-runner AI/DL solution," McGee told EnterpriseTech. "Customers are hungry for solutions, but they don't know what they entail. What we see is that as they start to tinker with it, they're not ready to invest the significant amount of money needed for an optimal AI/DL setup. They want to play around on something familiar to them – a 2U standard rack form factor box that they already have for the email server or some other purpose. They hope if they start on something comfortable and general-purpose then maybe they can begin down that path."

Her advice on an AI starter strategy: Go with what you know.

"So the opportunity is how can we give them little bits of AI and deep learning, packaging that starts on hardware they're familiar with?" McGee said. "Then as they grow and get more comfortable they can make smarter decisions and transition to those optimal solutions. They're looking for comfort with both technology and the price point."

This approach contrasts with most AI and DL systems currently available, she said. "The systems I've seen are very specific, so it's not something that if you tried and it didn't work you could repurpose for a standard workload, like enterprise SharePoint or your email server. So the more we can deliver flavors of AI and DL and guidance on the journey, starting at a more cost-effective price with a more uniform architecture, that should ramp up adoption more quickly."

McGee also will discuss appropriate workloads for initial AI implementations, beginning with anomaly detection, which is inherently well-suited for FSI because of the enormous volumes of data generated that can be used to train DL systems. "Due to regulations and making smarter market trades in the HFT sector, financial services generate so much data that they're required to store, they have an advantage over other industries that have data but don't hold onto it. Our customers are holding onto it, so they might as well leverage it and make smarter decisions in the anomaly detection space."
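The anomaly-detection starting point McGee describes can be sketched in miniature. The rolling z-score detector below is purely illustrative (it is not an HPE product, and the window and threshold values are arbitrary assumptions): it flags observations, such as trade volumes or tick prices, that deviate sharply from their recent history.

```python
import statistics

# A minimal sketch of anomaly detection on stored market data, assuming a
# simple rolling z-score approach. Window size and threshold are
# illustrative defaults, not values from the article.
def rolling_zscore_anomalies(values, window=20, threshold=3.0):
    """Return indices where a value deviates more than `threshold`
    standard deviations from the mean of the preceding `window` points."""
    anomalies = []
    for i in range(window, len(values)):
        history = values[i - window:i]
        mean = statistics.fmean(history)
        stdev = statistics.stdev(history)
        if stdev > 0 and abs(values[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies
```

A real deep-learning detector would replace the z-score with a trained model, but the shape of the workload is the same: large volumes of retained historical data scored point by point, which is why firms that already store this data have a head start.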

Another main element of her address to the FSI crowd: at-scale cloud implementation.

"It's involved with everything in the financial services industry, whether HPC or not," McGee said, "because of fear of cybersecurity problems, or not being comfortable with new regulations – not knowing whether cloud providers are tied in with new regulations will undermine adoption of cloud, whether it's public, on-premises or hybrid cloud."

McGee said cybersecurity concerns are the biggest barrier to cloud computing in FSI. "There's so much going on with cybersecurity, there's slower adoption than in other industries," she said. "It's a very small percentage of workloads customers are actively transitioning to cloud platforms. They see the benefit (from cloud computing) in saving money and accessing data. But the cost of something going wrong is so great that we're farther behind than other industry verticals."

McGee will also discuss incorporation of alternative processor architectures to speed up workflows.

"It's always been there, say with low latency high frequency trading," she said, "how do I reduce my time, how do I take advantage of emerging accelerated technologies?" She explained that while simply dropping a state-of-the-art GPU into the processor mix will improve performance, the improvement will probably fall short of expectations. This is because algorithms and codes need to be optimized to fully utilize new architectures, McGee said. Large FSIs with big IT staffs can take on those challenges in-house, but smaller companies in the industry increasingly are engaging code modernization consulting firms to help them update their proprietary software.


AI and Deep Learning Come to Wall Street
By Vineeth Ram – VP, HPC Segment and DCIG Experience Marketing, Data Center Infrastructure Group Marketing, Hewlett Packard Enterprise

April 10, 2017

Today’s financial services companies must continually strive to gain a competitive edge in a highly data-intensive industry. With the emergence of big data, firms are struggling to manage the onslaught of complex data from many sources, stay on top of evolving regulations, and boost data security. High performance computing (HPC) technologies are not only helping financial firms ease the pains associated with explosive data growth, but they are also becoming absolutely essential to survival.

Emergent technologies and new HPC innovations to hit the financial sector were the focus of the 14th annual HPC on Wall Street event held earlier this week in New York City. The one-day show and conference brings together key FinTech players, financial industry experts, and capital markets IT providers to promote a discussion of the latest technological advances in the realm of financial HPC.

Each year, the show features a variety of networking opportunities, exhibits, expert panel discussions, and keynote speeches meant to help financial CIOs learn about new technologies, discuss industry trends and developments, and collaborate with experts that can help them address their biggest IT challenges.

This year, Hewlett Packard Enterprise (HPE) teamed up with SUSE and Intel® to host a keynote session entitled “HPC Innovation for the FSI Market.” The keynote, which was delivered by Lacee McGee, Senior Product Manager for FSI Vertical Solutions at HPE and Natalia Vassilieva, a Senior Research Manager at HP Labs, as well as representatives from SUSE and Intel®, concentrated on ways that financial organizations can begin their journeys into game-changing technologies like artificial intelligence (AI) and deep learning. Adoption of these emerging technologies is beginning to proliferate among companies in this sector, a fact which served as a persistent theme throughout the conference.

As McGee recently explained to EnterpriseTech, simply getting started can be the most difficult part for financial companies when it comes to implementing AI and deep learning technologies. While larger enterprises may have broader capital budgets and IT expertise, small and mid-sized businesses (SMBs) sometimes hesitate to take the plunge for a variety of reasons, including cost, lack of IT expertise, shortage of trained staff, or insufficient server infrastructure. McGee’s advice for SMBs was to start slow, and begin by having a trusted IT vendor package small amounts of AI and deep learning on familiar hardware. This gradual approach, meant to allow SMBs to grow with the technology and progressively increase their familiarity with these technologies, can help them ease the transition to more optimal solutions in the future.

The presentation also touched on the broadening demand for code modernization tools that can help quantitative analysts circumvent the steep learning curve associated with today’s multi-core processor architectures. To address these challenges, HPE recently introduced the HPE Quantitative Finance Library, a solution that modernizes application software using a productivity tool that generates highly parallelized source code for multi-core, multi-thread platforms. This helps quants exploit the advantages of the latest Intel® Xeon Phi™ processors more quickly and with less time spent away from critical business activities.
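To make the idea of multi-core parallelization concrete, here is a hand-rolled sketch of a classic quant workload, Monte Carlo pricing of a European call, split across worker processes. This is an assumption-laden illustration of the general technique, not the HPE Quantitative Finance Library or its generated code.

```python
import math
import random
from multiprocessing import Pool

def _simulate_payoffs(args):
    """Average discounted call payoff over n independent price paths."""
    n, spot, strike, rate, vol, maturity, seed = args
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # Terminal price under geometric Brownian motion.
        z = rng.gauss(0.0, 1.0)
        s_t = spot * math.exp((rate - 0.5 * vol ** 2) * maturity
                              + vol * math.sqrt(maturity) * z)
        total += max(s_t - strike, 0.0)
    return math.exp(-rate * maturity) * total / n

def price_call(spot, strike, rate, vol, maturity,
               paths=400_000, workers=4):
    """Split the simulation across processes and average the results."""
    chunk = paths // workers
    jobs = [(chunk, spot, strike, rate, vol, maturity, seed)
            for seed in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(_simulate_payoffs, jobs)) / workers
```

The serial inner loop is untouched; the modernization step is the decomposition into independent chunks that can run on separate cores, which is the same pattern, at much larger scale and finer granularity, that generated code for many-core processors like Xeon Phi™ exploits.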

There’s no denying that HPC has become essential to survival in the financial services industry. As data growth continues to bring new challenges and increased complexity to the financial services industry, HPC solutions are offering unprecedented levels of speed and efficiency, as well as whole new ways of doing business. Visit the HPC for Wall Street website for additional conference information and press coverage. You can also follow me on Twitter at @VineethRam to stay informed about the latest technologies and HPC finance innovations impacting the industry. We hope to see you at next year’s event!