Best Practices for Unit Testing PySpark


This talk covers best practices for unit testing PySpark code. Unit tests help you reduce production bugs and make your codebase easier to refactor. You will learn how to create PySpark unit tests that run locally and in CI via GitHub Actions, and best practices for structuring PySpark code so it's easy to unit test. You'll also see how to run integration tests against staging datasets on a cluster; integration tests provide an additional level of safety.

Talk By: Matthew Powers, Staff Developer Advocate, Databricks

Here’s more to explore:
Big Book of Data Engineering: 2nd Edition: https://dbricks.co/3XpPgNV
The Data Team's Guide to the Databricks Lakehouse Platform: https://dbricks.co/46nuDpI

Connect with us:
Website: https://databricks.com
Twitter: /databricks
LinkedIn: /data…
Instagram: /databricksinc
Facebook: /databricksinc
