
GPU Technology Conference in Munich

The GPU Technology Conference (GTC), organized by NVIDIA, is one of the most important events in GPU computing for developers, data scientists and decision makers. This year’s GTC Europe in Munich attracted more than 3,000 visitors!

No doubt, 2017 has been the year of artificial intelligence, given all the advances in the field. The diverse set of talks and sessions at GTC made this very clear. From autonomous vehicles, healthcare and finance to virtual & augmented reality and even art, deep learning applications are transforming industries.

Some personal highlights of the conference were:

Deep Learning in Healthcare (Bram van Ginneken, Radboudumc)

Bram provided an overview of recent developments in radiology, pathology and ophthalmology. Challenges such as the Kaggle Data Science Bowl, CAMELYON16 and CAMELYON17 have shown that deep neural networks can match or even outperform human experts in the early detection of lung and breast cancer. It’s no surprise that he predicts these developments will have a major impact on healthcare.

PyTorch: A framework for fast, dynamic deep learning and scientific computing (Soumith Chintala, Facebook)

PyTorch is a relatively new Python framework aimed at scientific computing and dynamic deep learning applications. It provides tensor computation similar to NumPy, but with GPU acceleration, and adds an automatic differentiation engine, a package for gradient-based optimization and efficient parallel data loaders. The PyTorch team is currently working on a JIT compiler to support kernel fusion, out-of-order execution and automatic workload placement.
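To give a flavour of the programming model, here is a minimal sketch (not code from the talk) that uses PyTorch tensors, the autograd engine and the optim package to fit a tiny linear model; it assumes a CUDA device is available and falls back to the CPU otherwise:

```python
# Minimal sketch: fit y = w*x + b with PyTorch tensors, autograd and optim.
# Names like `device`, `w`, `b` are illustrative, not from the talk.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# Synthetic data: y = 2x + 1 plus noise, as NumPy-like tensors on the GPU.
x = torch.linspace(0, 1, 100, device=device).unsqueeze(1)
y = 2 * x + 1 + 0.05 * torch.randn_like(x)

# Parameters tracked by the automatic differentiation engine.
w = torch.zeros(1, 1, device=device, requires_grad=True)
b = torch.zeros(1, device=device, requires_grad=True)

optimizer = torch.optim.SGD([w, b], lr=0.1)

for step in range(500):
    optimizer.zero_grad()
    loss = ((x @ w + b - y) ** 2).mean()  # mean squared error
    loss.backward()                       # gradients via autograd
    optimizer.step()                      # gradient-based update

print(w.item(), b.item())  # should approach 2.0 and 1.0
```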

The role of GPUs in Geovisualization (Todd Mostak, MapD) / How GPUs enable XVA pricing and risk calculations for risk aggregation (James Mesney, Kinetica)

GPUs have also arrived in the database world. Todd from MapD showed how they use GPUs to power real-time visual analytics on massive, multi-billion-row datasets. James from Kinetica provided insight into how their financial services clients are using in-database analytics to run custom XVA algorithms with massive GPU parallelization.
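XVA-style calculations are a natural fit for GPUs because they boil down to large, embarrassingly parallel Monte Carlo simulations. Below is a hedged sketch of that idea (not Kinetica’s implementation, and with made-up parameters): a crude CVA estimate for a single swap, where all simulated paths evolve in parallel on the GPU.

```python
# Hedged sketch: massively parallel Monte Carlo of the kind XVA relies on.
# All model parameters and the swap valuation proxy are illustrative only.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
n_paths, n_steps, dt = 100_000, 120, 1 / 12  # 10 years, monthly steps

# Simulate short-rate paths as a simple Gaussian random walk
# (a real model would be calibrated); every path runs in parallel.
shocks = torch.randn(n_paths, n_steps, device=device)
rates = 0.02 + 0.01 * (dt ** 0.5) * shocks.cumsum(dim=1)

# Crude mark-to-market proxy of a payer swap per path and time step.
notional, fixed_rate = 1e6, 0.02
remaining = torch.arange(n_steps, 0, -1, device=device, dtype=torch.float32)
mtm = notional * (rates - fixed_rate) * dt * remaining

# Expected positive exposure and a flat-hazard CVA approximation
# (discounting ignored for brevity).
epe = mtm.clamp(min=0).mean(dim=0)
hazard, recovery = 0.02, 0.4
times = dt * torch.arange(1, n_steps + 1, device=device, dtype=torch.float32)
cva = (1 - recovery) * (epe * hazard * dt * torch.exp(-hazard * times)).sum()
print(f"CVA estimate: {cva.item():.2f}")
```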


These are exciting times for deep learning and scientific computing in general!