Data Engineering Solution Architect, Hyderabad
AALUCKS Talent Pro
Full-time
Hyderabad, Telangana, India | USD 4,000,000 - 4,800,000/month
Position: Data Engineering Solution Architect, Hyderabad
Department: Information Technology | Role: Full-time | Experience: 8 to 12 Years | Number of Positions: 1 | Location: Hyderabad
Skillset:
Data Architect, Snowflake, DBT, Matillion, Databricks, Python, Control-M, Airflow, AWS/Azure/GCP, Medallion, DevOps, Excellent English communication skills
Job Description:
About Us:
We provide companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations where fact-based decision-making is embedded into daily operations, leading to better processes and outcomes. Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively. With deep Big Four consulting experience in business transformation and efficient processes, we are a game-changer in any operations strategy.
We are looking for an experienced Data Engineering Solution Architect to join our growing Data Practice. The ideal candidate will have 8 to 12 years of hands-on experience designing, architecting, and delivering large-scale data warehousing, data lake, ETL, and reporting solutions across modern and traditional data platforms. You will play a key role in defining scalable, secure, and cost-effective architectures that enable advanced analytics and AI-driven insights for our clients.
This role demands a balance of technical depth, solution leadership, and a consulting mindset: helping customers solve complex data engineering challenges while also building internal capability and best practices within the organization.
Key Responsibilities:
• Design and architect end-to-end data solutions using technologies like Snowflake, Databricks, dbt, Matillion, Python, Airflow, Control-M, and cloud-native services on AWS/Azure/GCP.
• Define and implement data ingestion, transformation, integration, and orchestration frameworks for structured and semi-structured data.
• Architect data lakes and data warehouses with an emphasis on scalability, cost optimization, performance, and governance.
• Support real-time and API-based data integration scenarios; design solutions for streaming, micro-batch, and event-driven ingestion.
• Lead design and delivery of data visualization and reporting solutions using tools such as Power BI, Tableau, and Streamlit.
• Collaborate with business and technical stakeholders to define requirements, design architecture blueprints, and ensure alignment with business objectives.
• Establish and enforce engineering standards, frameworks, and reusable assets to improve delivery efficiency and solution quality.
• Mentor data engineers and help build internal capability on emerging technologies.
• Provide thought leadership around modern data platforms, AI/ML integration, and data modernization strategies.
Required Qualifications:
• 8 to 12 years of experience in data engineering and architecture, including hands-on solution delivery.
• Deep expertise with Snowflake or Databricks, with strong working knowledge of tools like dbt, Matillion, SQL, and Python or PySpark.
• Experience designing and implementing data pipelines and orchestration using tools like Airflow, Control-M, or equivalent.
• Familiarity with cloud-native data engineering services (such as AWS Glue, Redshift, Athena, GCP BigQuery, Dataflow, Pub/Sub) or similar.
• Strong understanding of data modelling, ELT/ETL design, and modern architecture frameworks (medallion, layered, or modular architectures).
• Experience integrating and troubleshooting APIs and real-time data ingestion technologies (Kafka, Kinesis, Pub/Sub, REST APIs).
• Familiarity with traditional ETL and data integration tools (Informatica, SSIS, Oracle Data Integrator, etc.).
• Excellent understanding of data governance, performance tuning, and DevOps for data (CI/CD, version control, monitoring).
• Strong communication, problem-solving, and stakeholder management skills.
Preferred Qualifications:
• Certifications such as Snowflake SnowPro, Databricks Certified Architect, AWS Data Analytics Specialty, or Google Professional Data Engineer.
• Prior consulting or client-facing experience.
• Exposure to AI/ML, data quality, or metadata management frameworks.
• Experience leading solution design across multi-cloud or hybrid environments.
Why Join Us?
• Work on cutting-edge data and AI engagements with global clients.
• Collaborate with a team of passionate, high-performing data professionals.
• Opportunity to influence architecture strategy and build internal capability.
• Culture that values innovation, continuous learning, and career growth.
• Competitive compensation and benefits package.
Additional Information:
This is a 5-days-a-week, work-from-office role based in Hyderabad.
Required Educational Qualification:
Bachelor of Engineering or Bachelor of Technology (B.E./B.Tech.)
This role is with a fast-growing analytics, business intelligence, IT products, and automation company.