Careers

PySpark

Job Requirement:

  • At least 2 years of Spark programming experience, including data transformations using RDDs, DataFrames, and Spark SQL.
  • Proficient in Python and Spark programming.
  • Strong understanding of object-oriented programming, RDBMS, and PySpark cluster management.
  • Knowledge of Microsoft SQL Server and other data storage and file systems.
  • Proficient with code versioning tools such as Git.
  • Design, build, and maintain efficient, reusable, and reliable code.
  • Ensure the best performance and quality.

Hours: 

M-F, 9:30 a.m. – 6:30 p.m.

Job Location:

Hyderabad (India)

Apply Now:

Email your resume, referencing job code ANBI-015, to Maruthi Technologies LLC DBA ANBLICKS at hr@anblicks.com