5. KPMG PySpark interview questions & answers | Databricks scenario-based interview questions & answers

#Databricks #PysparkInterviewQuestions #deltalake
Azure Databricks #spark #pyspark #azuredatabricks #azure
In this video, I discuss KPMG PySpark scenario-based interview questions and answers.

PySpark advanced interview questions and answers
Databricks interview questions and answers
KPMG PySpark interview questions and answers

Create dataframe:
======================================================
# Employee salary info
data1 = [(100, "Raj", None, 1, "01-04-23", 50000),
         (200, "Joanne", 100, 1, "01-04-23", 4000),
         (200, "Joanne", 100, 1, "13-04-23", 4500),
         (200, "Joanne", 100, 1, "14-04-23", 4020)]
schema1 = ["EmpId", "EmpName", "Mgrid", "deptid", "salarydt", "salary"]
df_salary = spark.createDataFrame(data1, schema1)
display(df_salary)

# Department dataframe
data2 = [(1, "IT"),
         (2, "HR")]
schema2 = ["deptid", "deptname"]
df_dept = spark.createDataFrame(data2, schema2)
display(df_dept)
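
Passing a plain list of column names as the schema makes Spark infer the column types from the data. A quick check I would add here (not in the original video) to confirm that salarydt is read as a string and salary as a long:

# My addition: confirm the inferred types before transforming
df_salary.printSchema()
df_dept.printSchema()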

-----------------------------------------------------------------------------------------------------------------------
from pyspark.sql.functions import to_date

# Convert the dd-MM-yy salary date string into a proper DateType column
df = df_salary.withColumn('Newsaldt', to_date('salarydt', 'dd-MM-yy'))
display(df)
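
Given the dd-MM-yy pattern, to_date() turns "01-04-23" into the date 2023-04-01. A quick check I would add (not in the video) to confirm Newsaldt is now a real date column rather than a string:

# My addition: Newsaldt should show up as 'date' in the schema
df.printSchema()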
---------------------------------------------------------------------------------------------------------------------
from pyspark.sql.functions import col

# Attach the department name to every salary row
df1 = df.join(df_dept, ['deptid'])
# display(df1)

# Self-join to look up each employee's manager name; a left join keeps
# employees without a manager (Raj, whose Mgrid is null)
df2 = df1.alias('a').join(df1.alias('b'), col('a.Mgrid') == col('b.EmpId'), 'left').select(
    col('a.deptname'),
    col('b.EmpName').alias('ManagerName'),
    col('a.EmpName'),
    col('a.Newsaldt'),
    col('a.salary')
)
display(df2)
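
Interviewers often ask for the same logic in SQL as a follow-up. Here is a sketch of the equivalent self-join in Spark SQL, using the df1 dataframe built above (my addition, not from the video):

# Register the joined dataframe as a temp view and self-join it in SQL
df1.createOrReplaceTempView('emp')
df2_sql = spark.sql("""
    SELECT a.deptname,
           b.EmpName AS ManagerName,
           a.EmpName,
           a.Newsaldt,
           a.salary
    FROM emp a
    LEFT JOIN emp b
      ON a.Mgrid = b.EmpId
""")
display(df2_sql)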
-------------------------------------------------------------------------------------------------------------------
from pyspark.sql.functions import year, date_format

# Total salary per department / manager / employee, broken down by year and month name
df3 = df2.groupBy('deptname', 'ManagerName', 'EmpName',
                  year('Newsaldt').alias('Year'),
                  date_format('Newsaldt', 'MMMM').alias('Month')).sum('salary')

display(df3)
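
The same monthly total can also be written with agg(), which lets you name the aggregated column and sort the output; a variant worth mentioning in the interview (my addition, not from the video):

from pyspark.sql.functions import year, date_format, sum as _sum

# My addition: same aggregation with an aliased total, sorted for readability
df3_alt = (df2.groupBy('deptname', 'ManagerName', 'EmpName',
                       year('Newsaldt').alias('Year'),
                       date_format('Newsaldt', 'MMMM').alias('Month'))
              .agg(_sum('salary').alias('TotalSalary'))
              .orderBy('EmpName', 'Year', 'Month'))
display(df3_alt)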
============================================================

Learn PySpark, an interface for Apache Spark in Python. PySpark is often used for large-scale data processing and machine learning.
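
In a Databricks notebook the spark session and display() are already provided. To try the snippets above outside Databricks, you would create the session yourself; a minimal sketch (assumed local setup, not from the video):

from pyspark.sql import SparkSession

# Local SparkSession for running the snippets outside Databricks
spark = SparkSession.builder.appName('pyspark-interview-demo').getOrCreate()
# display() is Databricks-only; locally use .show(), e.g. df_salary.show()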

Azure data factory tutorial playlist:
   • Azure Data factory (adf)  

ADF interview question & answer:
   • adf interview questions and answers f...  

1. pyspark introduction | pyspark tutorial for beginners | pyspark tutorial for data engineers:
   • 1. pyspark introduction | pyspark tut...  

2. what is dataframe in pyspark | dataframe in azure databricks | pyspark tutorial for data engineer:
   • 2. what is dataframe in pyspark | dat...  

3. How to read write csv file in PySpark | Databricks Tutorial | pyspark tutorial for data engineer:
   • 3. How to read write csv file in PySp...  

4. Different types of write modes in Dataframe using PySpark | pyspark tutorial for data engineers:
   • 4. Different types of write modes in ...  

5. read data from parquet file in pyspark | write data to parquet file in pyspark:
   • 5. read data from parquet file in pys...  

6. datatypes in PySpark | pyspark data types | pyspark tutorial for beginners:
   • 6. datatypes in PySpark | pyspark dat...  

7. how to define the schema in pyspark | structtype & structfield in pyspark | Pyspark tutorial:
   • 7. how to define the schema in pyspar...  

8. how to read CSV file using PySpark | How to read csv file with schema option in pyspark:
   • 8. how to read CSV file using PySpark...  

9. read json file in pyspark | read nested json file in pyspark | read multiline json file:
   • 9. read json file in pyspark | read n...  

10. add, modify, rename and drop columns in dataframe | withcolumn and withcolumnrename in pyspark:
   • 10. add, modify, rename and drop colu...  

11. filter in pyspark | how to filter dataframe using like operator | like in pyspark:
   • 11. filter in pyspark | how to filter...  

12. startswith in pyspark | endswith in pyspark | contains in pyspark | pyspark tutorial:
   • 12. startswith in pyspark | endswith ...  

13. isin in pyspark and not isin in pyspark | in and not in in pyspark | pyspark tutorial:
   • 13. isin in pyspark and not isin in p...  

14. select in PySpark | alias in pyspark | azure Databricks #spark #pyspark #azuredatabricks #azure
   • 14. select in PySpark | alias in pysp...  

15. when in pyspark | otherwise in pyspark | alias in pyspark | case statement in pyspark:
   • 15. when in pyspark | otherwise in py...  

16. Null handling in pySpark DataFrame | isNull function in pyspark | isNotNull function in pyspark:
   • 16. Null handling in pySpark DataFram...  

17. fill() & fillna() functions in PySpark | how to replace null values in pyspark | Azure Databrick:
   • 17. fill() & fillna() functions in Py...  

18. GroupBy function in PySpark | agg function in pyspark | aggregate function in pyspark:
   • 18. GroupBy function in PySpark | agg...  

19. count function in pyspark | countDistinct function in pyspark | pyspark tutorial for beginners:
   • 19. count function in pyspark | count...  

20. orderBy in pyspark | sort in pyspark | difference between orderby and sort in pyspark:
   • 20. orderBy in pyspark | sort in pysp...  

21. distinct and dropduplicates in pyspark | how to remove duplicate in pyspark | pyspark tutorial:
   • 21. distinct and dropduplicates in py...  
