
Global Data Engineer

Interface Americas, Inc.
United States, Georgia, Atlanta
Aug 07, 2025

Interface is a global flooring solutions company and sustainability leader, offering an integrated portfolio of carpet tile and resilient flooring products that includes Interface carpet tile and LVT, nora rubber flooring, and FLOR premium area rugs for commercial and residential spaces. Made with purpose and without compromise, Interface flooring brings more sophisticated design, more performance, more innovation, and more climate progress to interior spaces. A decades-long pioneer in sustainability, Interface remains "all in" on becoming a restorative business. Today, the company is focusing on carbon reductions, not offsets, as it works toward achieving its verified science-based targets by 2030 and its goal to become a carbon negative enterprise by 2040.

The Global Data Engineer will play a key role in designing, developing, and deploying high-performance data pipelines and infrastructure that power the enterprise data platform under the guidance of senior and lead engineers.

Collaborating with Business Intelligence, Infrastructure, Business Analytics, and global IT and business teams, this role will help deliver scalable, production-ready solutions that support advanced analytics, reporting, and data-driven decision-making across the organization. The Data Engineer will contribute to the modernization of our data architecture, ensure high-quality, reliable data delivery, and support the implementation of governed self-service analytics using tools such as Power BI and Microsoft Fabric. This role is ideal for a data professional ready to contribute hands-on to enterprise projects and grow within a technically collaborative environment.

At Interface, we are deeply committed to the professional development of our employees. We believe in nurturing talent, encouraging personal growth, and fostering a culture of continuous learning. To facilitate this, we provide access to LinkedIn Learning, where you can expand your skill set, keep abreast of industry trends, and even explore new areas of interest.

Our work structure is a hybrid model, combining the best of remote and in-office work. You will have the flexibility to work from home, while also benefiting from in-person collaboration three days each week in our Atlanta office. This balance fosters both efficiency and camaraderie, contributing to an empowering and dynamic work environment.

Essential Functions:

  • Design and develop scalable data models, pipelines, and infrastructure in Azure, driving insights, reporting, mobile/web applications, and machine learning

  • Support in data engineering and data science projects for global Big Data initiatives

  • Develop and automate high-volume batch and real-time ETL pipelines using Azure Data Factory, Azure SQL Databases, Databricks, and Python

  • Use Microsoft Fabric and Power BI to create impactful semantic data models, dashboards, advanced DAX measures, calculated columns, data transformations, and data visualizations

  • Apply technical expertise in performance tuning, Power Query, SQL scripting, and row-level security (RLS)

  • Deploy backend production services with an emphasis on high availability, robustness, and monitoring

  • Troubleshoot data processing performance issues and data quality problems with guidance from senior engineers

  • Design and execute test plans to validate the accuracy and completeness of data flowing through ETL pipelines and reports

  • Follow the continuous integration process by committing all code to version control repositories

  • Collaborate with cross-functional teams to understand business requirements, translate them into technical solutions, reports, and dashboards, and create comprehensive test plans

  • Collaborate with development teams and solution architects to define infrastructure and deployment requirements for data warehousing and data modeling

  • Implement machine learning models in collaboration with data scientists

  • Work closely with senior data engineers and analysts to learn best practices in data modeling, performance tuning, and stakeholder delivery

  • Continuously increase knowledge of Business Intelligence applications and tools

  • Learn and understand Interface's commitment to sustainability

  • Perform other duties as assigned

Preferred Skills and Experience:

  • Bachelor's degree in Computer Science, Engineering, Data Science, or a related discipline

  • 4+ years of experience in Data Engineering, Software Engineering, Data Science, Machine Learning, and Artificial Intelligence using Snowflake, Azure or AWS cloud technologies

  • 4+ years of experience in Python programming, machine learning, artificial intelligence, system design, data structures, and algorithms in software development and high volume, distributed systems

  • 4+ years of experience in processing and modeling data in Python, SQL, Azure Analysis Services, Azure Data Factory, Databricks, SSAS, Qlik, Power BI, Microsoft Fabric, or Tableau, with a strong understanding of star and snowflake schemas, OLAP/OLTP, and software engineering

  • 3+ years of experience working on an engineering team building out QA practices

  • Strong understanding of SQL and experience with relational databases

  • Strong understanding of data structures, data types, data transformation, and data performance tuning

  • Experience with Python and data transformation and quality check libraries such as PySpark, pandas, and Great Expectations

  • Strong Excel knowledge for validating data (VBA, macros, pivot tables, formulas, etc.)

  • Strong analytical, problem-solving, and debugging skills, with the ability to learn and comprehend business processes quickly

  • Experience with data integration and management tools

  • Knowledge of Power BI or other data visualization tools such as Tableau

  • Hands-on experience with Azure DevOps, Git, and other CI/CD tools

  • Knowledge of Infrastructure as Code (IaC) and provisioning tools like Terraform, Ansible, Jenkins, or ARM in Azure

  • Experience with scripting languages such as PowerShell, and with JSON or YAML file formats

  • Experience with machine learning is a plus

  • Experience working on cross-functional teams and projects, and effectively communicating with stakeholders at multiple levels

  • Exceptional verbal and written communications skills, with an ability to express complex technical concepts in business terms

  • Solid organizational skills for working on multiple projects and the ability to meet deadlines

This is a hybrid position based at our headquarters in Atlanta, GA. Please note that relocation assistance is not available for this role.

#LI-Hybrid

3 - Associate / Professional / Individual Contributor / Team Lead

We are a VEVRAA Federal Contractor. We desire priority referrals of Protected Veterans for job openings at all locations within the State of Georgia. An Equal Opportunity Employer, including Veterans and Disabled.
