*Please note, this role is not able to offer visa transfer or sponsorship now or in the future*
Role Title: Senior AWS Data Engineer
About the role
Design, build, and operate scalable, cloud‑native data platforms supporting batch and streaming use cases, with a strong focus on governance, performance, and reliability.
Responsibilities
- Programming
· Python: Strong hands-on experience with Python for data engineering tasks, including scripting, automation, and ETL logic development
· PySpark: Proficiency in writing and optimizing PySpark jobs for large-scale data transformations
· SQL: Advanced SQL skills for data querying, transformation logic, and stored procedure conversion from SQL Server
- Big Data Processing Frameworks
· Apache Spark: Strong experience with Spark core concepts — RDDs, DataFrames, Datasets, partitioning, and performance tuning
· Data partitioning and optimization: Experience with data skew handling, broadcast joins, caching strategies, and Spark tuning
- AWS Services (Hands-On Experience Required)
· AWS Glue ETL: Developing and deploying Glue jobs (Python Shell and Spark), job bookmarks, dynamic frames, and custom connectors
· AWS Glue Data Catalog: Managing databases, tables, crawlers, classifiers, and schema versioning
· AWS Lake Formation: Configuring data lake permissions, fine-grained access control, and data filtering
· AWS Step Functions: Designing and implementing state machines for ETL workflow orchestration, error handling, and retry logic
· AWS Lambda: Writing serverless functions for event-driven triggers, lightweight transformations, and pipeline utilities
· Amazon Aurora: Working with Aurora (PostgreSQL-compatible edition) for relational data storage and query optimization
· Amazon DynamoDB: Designing and querying NoSQL tables
· Amazon S3: Proficiency in S3 data lake design — partitioning strategies, storage classes, lifecycle policies, and S3 event notifications
· AWS IAM: Understanding of roles, policies, and least-privilege access patterns relevant to data pipeline security
- ETL Development & Migration
· Informatica PowerCenter (working knowledge): Ability to read and interpret Informatica workflows, sessions, mappings, and transformations to support conversion to AWS Glue
· ETL framework development: Experience building reusable, configurable ETL frameworks with logging, error handling, retry mechanisms, and metadata-driven execution
· Data pipeline design patterns: Familiarity with incremental loads, CDC (Change Data Capture), full loads, and SCD (Slowly Changing Dimensions)
· SQL Server (working knowledge): Ability to understand SQL Server schemas, stored procedures, and SSIS packages for migration analysis
- Data Engineering Best Practices
· Data quality and validation: Implementing data quality checks, reconciliation logic, and exception handling within pipelines
· Metadata-driven frameworks: Building configurable pipelines driven by metadata stored in Aurora or DynamoDB
· Logging and observability: Integrating CloudWatch logging, custom metrics, and alerting into data pipelines
· Unit and integration testing: Writing test cases for ETL logic using frameworks such as pytest
· Version control: Proficiency with Git for source code management, branching strategies, and code reviews
What you need to have to be considered
· 8+ years in an IT-related role
· Strong hands-on experience with AWS Cloud, SQL, and Python
· Good experience with Kafka/Flink, AWS Glue, and Airflow
Work model: On Site
At Cognizant, we strive to provide flexibility wherever possible, and we are here to support a healthy work-life balance through our various wellbeing programs. Based on this role’s business requirements, this is an onsite position requiring 5 days a week in a client or Cognizant office in Charlotte, NC.
The Cognizant Community:
We are a team that appreciates and supports one another. Our associates work in a collaborative, inclusive environment where everyone can thrive.
- Cognizant is a global community with more than 300,000 associates around the world.
- We don’t just dream of a better world, we work to make it a reality.
- We take care of our associates, clients, partners, communities, and environment by doing the right thing.
- We foster an innovative environment where you can build your career.
About us
Cognizant (NASDAQ: CTSH) is an AI solutions architect and technology services provider that turns investment in artificial intelligence into real business value through full-stack AI solutions tailored to each client. Its deep industry, process, and engineering expertise allows it to embed each organization’s unique context into technology systems that amplify human potential, deliver real business impact, and keep large global enterprises a step ahead in a constantly changing world. For more information, visit www.cognizant.es or follow us at @cognizant.
Cognizant is an equal opportunity employer. Your application and candidacy will not be considered on the basis of race, color, sex, religion, creed, sexual orientation, gender identity, national origin, disability, genetic information, pregnancy, veteran status, or any other characteristic protected by federal, state, or local laws.
Disclaimer:
Compensation information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law. Candidates may be required to attend interviews in person or by video conference, and may also be required to present current, valid identification during each interview.