Deloitte – Technology & Transformation Hiring (Multiple Roles) Job Referral By FLM


Deloitte – Technology & Transformation Hiring (Multiple Roles) Job Referral By FLM. The details of the jobs, requirements, and other information are given below:



1. Job Title: Technology & Transformation – Engineering – Consultant / Senior Consultant – Azure Data Engineer

Desired Qualifications

  • 3–12 years of hands-on experience implementing Azure Cloud data warehouses, Azure and NoSQL databases, and hybrid data scenarios

  • Experience developing Azure Data Factory pipelines, including Azure Functions, Logic Apps, Triggers, and Integration Runtime (IR)

  • Experience working with Databricks (PySpark, Scala), Stream Analytics, Event Hub, and HDInsight components

  • Experience in working on Data Lake and Data Warehouse solutions on Azure

  • Experience managing Azure DevOps pipelines (CI/CD)

  • Experience managing source data access security using Vault, configuring authentication and authorization, and enforcing data policies and standards

  • UG: B.Tech / B.E in any specialization

  • Base location: Bengaluru / Hyderabad / Pune / Delhi / Coimbatore / Bhubaneswar

2. Job Title: T&T – EAD-ADMM – Senior Consultant | GCP Data Engineer

Desired Qualifications

  • Good hands-on experience in GCP services including BigQuery, Cloud Storage, Dataflow, Cloud Dataproc, Cloud Composer / Airflow, and IAM

  • Proficiency with GCP databases: Bigtable, Spanner, Cloud SQL, and AlloyDB

  • Proficiency in SQL, Python, Java, or Scala for data processing and scripting

  • Experience in development and test automation processes through CI/CD pipelines using Git, Jenkins, SonarQube, Artifactory, and Docker containers

  • Experience orchestrating data processing tasks using Cloud Composer or Apache Airflow

  • Strong understanding of data modeling, data warehousing, and big data processing concepts

  • Solid experience with relational databases such as MySQL, PostgreSQL, or Oracle

  • Ability to design and implement data migration strategies for PostgreSQL, Oracle, AlloyDB, etc.

  • Deep understanding of at least one database with the ability to write complex SQL queries

  • Experience with NoSQL databases like MongoDB, Scylla, Cassandra, or DynamoDB is a plus

  • Ability to optimize data pipelines for performance and cost-efficiency following GCP best practices

  • Experience implementing data quality checks, validation, and monitoring mechanisms

  • Ability to collaborate with data scientists, analysts, and business stakeholders

  • Ability to work independently and manage multiple priorities

  • Expertise in end-to-end Data Warehouse implementation is preferred

  • Base location: Bangalore, Mumbai, Pune, Hyderabad

3. Job Title: T&T | Engineering | Power BI

Work You’ll Do

  • Build and maintain positive working relationships with teams and clients

  • Experience with Power BI Desktop, Data Modeling, Power Query, advanced data connectivity, and monitoring

  • Develop visual reports, KPI scorecards, and dashboards using Power BI Desktop

  • Write DAX queries

  • Build analytics tools that provide actionable insights into operational efficiency and business performance metrics

  • Work with stakeholders to resolve data-related technical issues and support data infrastructure needs

Your Role as a Leader

  • Develop high-performing people and teams through meaningful opportunities

  • Deliver exceptional client service and drive high performance while fostering collaboration

  • Influence clients, teams, and individuals positively by leading through example

  • Understand key objectives for clients and Deloitte and align people accordingly

  • Act as a role model by embracing Deloitte’s purpose and values

4. Job Title: Technology and Transformation – EAD-MS – Consultant – AWS Data Engineer

Responsibilities

  • Design, build, and maintain scalable data pipelines on AWS

  • Develop ETL/ELT workflows using AWS services like Glue, Lambda, EMR, and Step Functions

  • Work with data ingestion frameworks using Kinesis, Kafka, or SQS

  • Build and optimize data lakes using S3, Glue Catalog, and Lake Formation

  • Work with Redshift and Snowflake for data warehousing solutions

  • Implement data quality checks, monitoring, logging, and CI/CD deployments

  • Collaborate with analytics and business teams to deliver high-quality datasets

Skills Required

  • Strong hands-on experience with AWS Glue, S3, Lambda, IAM, and Step Functions

  • Proficiency in Python (PySpark), SQL, and data transformation logic

  • Experience with Redshift, Athena, and Glue Data Catalog

  • Knowledge of Spark and distributed data processing

  • Familiarity with Terraform or CloudFormation is a plus

  • Strong understanding of data modeling, data lake principles, and best practices

  • Base location: Pune, Bangalore, Hyderabad, Bhubaneswar, Chennai, Coimbatore

5. Job Title: T&T | Engineering | Databricks + PySpark

Mandatory Skills

  • Databricks

  • Spark

  • Python / SQL

Responsibilities

  • Design, develop, and optimize data workflows using Databricks to ingest, transform, and load data

  • Build scalable and efficient data processing workflows using Spark (PySpark or Spark SQL)

  • Collaborate with technical and business stakeholders to understand data requirements

  • Develop data models and schemas for reporting and analytics

  • Ensure data quality, integrity, and security through appropriate controls

  • Monitor and optimize data processing performance and resolve bottlenecks

  • Stay updated with advancements in data engineering and Databricks technologies

Qualifications

  • Bachelor’s or Master’s degree in any field

  • Experience designing and maintaining data solutions on Databricks

  • Experience with at least one cloud platform: Azure, AWS, or GCP

  • Experience with ETL and ELT processes

  • Knowledge of data warehousing and data modeling concepts

  • Experience with Python or SQL

  • Experience with Delta Lake

  • Understanding of DevOps principles and practices

  • Strong problem-solving, troubleshooting, communication, and teamwork skills

NOTE: We will share your details with a Deloitte employee, who will refer you for the opportunity.

  If you fit any of these roles, don’t miss this opportunity!

Cheers 🥂

Frontlines Media (FLM)

Join Our Telegram Group (1.9 lakh+ members): Click Here To Join

For Experienced Job Updates Follow – FLM Pro Network – Instagram Page

For All Types of Job Updates (B.Tech, Degree, Walk-in, Internships, Govt Jobs & Core Jobs) Follow – Frontlinesmedia JobUpdates – Instagram Page

For Healthcare Domain-Related Jobs Follow – Frontlines Healthcare – Instagram Page

For Major Job Updates & Other Info Follow – Frontlinesmedia – Instagram Page
