Title: Senior Data Engineer (GCP)
Location: Lisbon, Portugal
Job Type: Full-Time
About Us: Logic Hire Software Solutions is an innovative, data-centric organization committed to leveraging advanced analytics and data engineering practices to inform strategic business decisions and improve client engagement. We are seeking a seasoned Senior Data Engineer to join our team and contribute to the optimization and expansion of our data infrastructure.
Job Description
We are seeking a highly proficient Senior Data Engineer with 12+ years of hands-on experience architecting and maintaining data infrastructure. The ideal candidate will have extensive expertise with Google Cloud Platform (GCP) and the BigQuery ecosystem, alongside a strong command of SQL, SSIS (SQL Server Integration Services), SSRS (SQL Server Reporting Services), and Python. This role requires a combination of technical acumen and strong interpersonal skills to engage effectively with business units while supporting the overall project lifecycle.
Key Responsibilities
- Architect, implement, and maintain high-performance data pipelines utilizing GCP services, particularly BigQuery, Cloud Storage, Cloud Functions, and Dataflow, ensuring optimal data flow and accessibility.
- Design and write highly efficient, scalable SQL queries, including complex joins, CTEs, and aggregations, to enable robust data analysis and reporting across multiple operational facets.
- Develop ETL (Extract, Transform, Load) processes using SSIS for operational data integration, and leverage SSRS to generate executive-level reports and analytics dashboards.
- Employ Python to create production-quality scripts and applications for data ingestion, transformation, and visualization, utilizing libraries such as Pandas and NumPy, with Apache Airflow for workflow orchestration.
- Engage with cross-functional teams to elicit, document, and analyze business requirements, subsequently translating these into comprehensive technical specifications, data models, and workflows.
- Implement and uphold data governance frameworks to ensure data integrity, quality control, and security protocols across all data engineering processes.
- Monitor data pipelines and system performance metrics, identifying bottlenecks and implementing solutions to optimize throughput and minimize downtime.
- Provide analytical insights and recommendations to project and client management, facilitating data-driven decision-making.
- Mentor junior data engineering staff, cultivating an environment of knowledge sharing and professional development.
- Stay abreast of the latest trends in data engineering technologies, tools, and methodologies to continually refine our data practices.
Qualifications
- Bachelor’s degree in Computer Science, Engineering, Data Science, or a related discipline; a Master’s degree is highly desirable.
- A minimum of 8 years of experience in data engineering, particularly within GCP and the BigQuery ecosystem.
- Profound experience in formulating and executing complex SQL queries and a solid understanding of relational database design principles.
- Advanced proficiency with SSIS for ETL processes and SSRS for business intelligence reporting.
- Strong programming skills in Python, with a focus on data manipulation and the development of scalable ETL solutions.
- Demonstrated ability in constructing, deploying, and maintaining data engineering pipelines utilizing modern best practices.
- Strong verbal and written communication skills, complemented by an ability to liaise effectively between technical teams and business stakeholders.
- Exceptional analytical and problem-solving capabilities, with a proactive approach towards diagnosing and resolving issues.
- Working knowledge of data governance principles, compliance with data privacy regulations, and industry best practices.
Preferred Skills
- Familiarity with additional GCP services such as Cloud Dataflow for stream/batch processing, Dataproc for managed Hadoop/Spark clusters, or Pub/Sub for messaging.
- Understanding of machine learning concepts and frameworks (e.g., TensorFlow, scikit-learn) to integrate predictive analytics within data solutions.
- Experience working within Agile environments and proficiency with project management tools (e.g., JIRA, Trello).
What We Offer
- A competitive salary and comprehensive benefits package.
- Opportunities for continued professional development and advancement within a cutting-edge environment.
- A collaborative workspace that encourages innovation and creativity.
- Flexible working options to support work-life balance.
If you possess the expertise and are eager to advance your career by driving impactful data initiatives at Logic Hire, we invite you to apply. Please submit your resume and a cover letter detailing your relevant qualifications and accomplishments.
Required Experience
- Proficient written and verbal communication in English.
- Data Engineering & Architecture: data modeling, ETL processes, data pipeline development, data integration, and cloud data solutions (GCP).
- Cloud Platforms: Google Cloud Platform (GCP), particularly BigQuery, Cloud Storage, and Cloud Functions.
- Big Data Tools: Hadoop, Spark, MapReduce, Pig, Hive, NoSQL, Apache Airflow.
- Data Governance: data governance frameworks ensuring data integrity, quality control, and security protocols.
- Data Visualization & Reporting: Power BI, Tableau, SSIS, SSRS, Superset, Plotly.
- Programming Languages: Python, SQL, R, Scala, C, C++, Java.
- Database Technologies: Teradata, Oracle, SQL Server.
Note: This position requires relocation to Lisbon, Portugal (visa will be provided).