Data Test Engineer (Middle+, Senior)
Leading free streaming TV service from the USA
№9971436, 4 March 2021
City:
Chișinău
Education:
Any
Work experience:
From 2 to 5 years
Salary:
Not specified
Work schedule:
Full-time
Project industry: free online television.
This is a leading free streaming TV service from the USA, delivering 250+ live and original channels and thousands of on-demand movies in partnership with major TV networks, movie studios, publishers, and digital media companies.
End Customer Timezone: PT
Our Customer Timezone: Belarus (the team will work in the Belarusian timezone)
Our current team: 48 people across many departments: mobile, frontend, backend, various services, testers, etc.
Responsibilities:
● Create data integration and engineering workflows on Big Data platforms;
● Design optimal solutions for data integration and streaming;
● Develop manual and automated test cases in Java to validate data integration solutions against data pipeline business requirements (see the sketch after this list);
● Test applications built on data warehousing solutions such as AWS Redshift, Snowflake, or other columnar databases;
● Verify that analytics events are captured in the relevant file systems or databases;
● Work with BI and PM teams to develop test strategies, plans, and cases covering the flow from event creation to reporting;
● Cooperate with project development teams implementing analytics features in client applications;
● Follow the Agile Software Delivery methodology;
● Perform risk assessment.
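For illustration only, a minimal sketch of the kind of automated data-validation test mentioned above: a JUnit 5 test in Java that compares row counts between a source table and a warehouse table after an ETL load. All connection URLs, credentials, and table names are hypothetical placeholders, not part of this vacancy.

// Minimal sketch of an automated data integration check (JUnit 5 + JDBC).
// All endpoints, credentials, and table names below are hypothetical.
import org.junit.jupiter.api.Test;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import static org.junit.jupiter.api.Assertions.assertEquals;

class PlaybackEventsLoadTest {

    // Hypothetical JDBC endpoints; in a real suite these would come from configuration.
    private static final String SOURCE_URL = "jdbc:postgresql://source-db:5432/events";
    private static final String WAREHOUSE_URL = "jdbc:redshift://warehouse:5439/analytics";

    // Count rows in a given table through a plain JDBC connection.
    private long countRows(String jdbcUrl, String table) throws Exception {
        try (Connection conn = DriverManager.getConnection(jdbcUrl, "qa_user", "qa_password");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    @Test
    void rowCountsMatchAfterDailyLoad() throws Exception {
        long sourceCount = countRows(SOURCE_URL, "playback_events");
        long warehouseCount = countRows(WAREHOUSE_URL, "fact_playback_events");
        // The pipeline is expected to load every source event exactly once.
        assertEquals(sourceCount, warehouseCount,
                "Warehouse row count should match the source after the load");
    }
}

In practice the same approach extends to checksum, schema, and freshness checks against Redshift, Snowflake, or other columnar stores.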
Requirements:
● 5+ years of QA experience;
● 3+ years of:
o Data Quality/SDET experience with a focus on data, data warehousing, reporting, etc.;
o Agile experience;
● Experience with:
o Java (test automation experience);
o Python, Java, or similar languages (general programming experience);
o SQL;
o Amazon Web Services;
o Performance Test Design, Development and load testing execution;
o Reporting or analytics tools such as Tableau or Mode;
o Analytics implementations in web or mobile apps;
● Apache JMeter, LoadRunner, or similar tools;
● Knowledge of:
o JVM, Spring Boot, data warehousing, data integration, SQL Server, Apache Kafka, data streaming, Big Data, MongoDB, SQL, Web Services, microservices, ETL, change data capture (CDC), DevOps;
o AWS Redshift, Snowflake, or columnar databases;
● Spoken English: B2 or higher.
Project duration: 1+ year.