Job Description:
Overall Purpose
This position will manage a team of 10+ data engineers and interact on a consistent basis with other developers, architects, data product owners, and source-system teams. It requires multifaceted candidates with experience in data engineering, data analysis, and visualization, plus strong hands-on experience with PySpark, Databricks, and the Azure Cloud Platform & Services.
Key Roles and Responsibilities
Build end-to-end data and business intelligence solutions, including data extraction, ETL processes that derive useful business insights, and dashboards that best represent that data.
Write complex SQL queries to transform data, supported by Python or Unix shell scripting.
Understand business requirements and create visual reports and dashboards using Power BI or Tableau.
Upskill in new technologies and understand the existing products and programs in place.
Work with other development and operations teams.
Flexible with shifts and occasional weekend support.
Key Competencies
Full life-cycle experience on enterprise software development projects.
Experience in relational databases, data marts, data warehouses, and complex SQL programming.
Extensive experience in ETL, shell or Python scripting, data modelling, analysis, and preparation.
Experience in Unix/Linux systems, file systems, and shell scripting.
Extensive knowledge of Python, PySpark, Databricks, and the Azure Cloud Platform & Services.
Experience in BI reporting tools (Power BI or Tableau) is good to have.
Good problem-solving and analytical skills used to resolve technical problems.
Must possess a good understanding of business requirements and IT strategies.
Experience in presentation design, development, delivery, and good communication skills to present analytical results and recommendations for action-oriented data driven decisions and associated operational and financial impacts.
Experience managing a team of 10 or more, including involvement in the appraisal and rating process.
Required/Desired Skills
Cloud Platforms - Azure, Databricks, Delta Lake (Required 5-6 years)
SQL Programming, Python/PySpark ETL (Required 8+ Years)
Unix/Linux shell scripting (Required 6-8 years)
RDBMS and Data Warehousing (Required 8+ Years)
Iceberg enablement (Desired 2-3 years)
Managed a team for 2+ years
Snowflake & Power BI / Tableau (Good to have)
Weekly Hours: 40
Time Type: Regular
Location: IND:AP:Hyderabad / Argus Bldg 4f & 5f, Sattva, Knowledge City - Adm: Argus Building, Sattva, Knowledge City

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
Job ID R-91908 Date posted 12/23/2025