PwC Hiring for Data Engineer
Data Engineer
Bangalore, Hyderabad, and Kolkata, India
Posted 3 weeks ago
Company Name: PwC
Job Title: Data Engineer
Experience: 0+ years of experience
Location: Hyderabad, Bangalore, Kolkata
Requirements
- 0+ years of experience in data engineering, working with Python and SQL.
- Exposure to cloud platforms such as AWS, Azure, or Google Cloud is preferred.
- Familiarity with data visualization tools (e.g., Power BI, Tableau, QuickSight) is a plus.
- Basic understanding of data modeling, ETL processes, and data warehousing concepts.
- Strong analytical and problem-solving skills, with attention to detail.
- Bachelor's degree in Computer Science, Data Science, Information Technology, or related fields.
- Basic knowledge of cloud platforms and services is advantageous.
- Strong communication skills and the ability to work in a team-oriented environment.
Roles and Responsibilities
- Develop, optimize, and maintain data pipelines and workflows to ensure efficient data integration from multiple sources.
- Use Python and SQL to design and implement scalable data processing solutions.
- Ensure data quality and consistency throughout data transformation and storage processes.
- Collaborate with data architects and senior engineers to build data solutions that meet business and technical requirements.
- Work with cloud platforms (e.g., AWS, Azure, or Google Cloud) to deploy and maintain data solutions.
- Support the migration of on-premise data infrastructure to the cloud environment when needed.
- Assist in implementing cloud-based data storage solutions, such as data lakes and data warehouses.
- Data Visualization: Provide data to business stakeholders for visualizations using tools such as Power BI, Tableau, or QuickSight. Collaborate with analysts to understand their data needs and optimize data structures for reporting.
- Collaboration and Support: Work closely with cross-functional teams, including data scientists and business analysts, to support data-driven decision-making. Troubleshoot and resolve issues in the data pipeline and ensure timely data delivery. Document processes, data flows, and infrastructure for team knowledge sharing.
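
For candidates unfamiliar with the pipeline work described above, a minimal, illustrative Python sketch of an extract-transform-load step is shown below. The file names, database path, and table name are hypothetical examples, not part of the role or PwC's stack.

```python
# Minimal, illustrative ETL sketch (hypothetical file names and table).
# Extract records from a CSV, apply simple cleaning, load into a SQL table.
import sqlite3

import pandas as pd


def run_pipeline(source_csv: str, db_path: str, table: str) -> int:
    # Extract: read raw records from the source file.
    df = pd.read_csv(source_csv)

    # Transform: drop duplicates and standardize column names.
    df = df.drop_duplicates()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

    # Load: write the cleaned data to a SQL table for downstream reporting.
    with sqlite3.connect(db_path) as conn:
        df.to_sql(table, conn, if_exists="replace", index=False)

    return len(df)


if __name__ == "__main__":
    rows = run_pipeline("sales_raw.csv", "warehouse.db", "sales_clean")
    print(f"Loaded {rows} rows")
```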