If you are a current DSV employee interested in a position in another country, please contact your Human Resources representative to discuss the application process and requirements.
Solution Architect
Job Req Number: 73540
PLEASE REMEMBER TO CLICK THE "APPLY" BUTTON AFTER SAVING YOUR PROFILE TO COMPLETE YOUR APPLICATION PROCESS.
Responsibilities
Define architectural principles, design patterns and coding standards to ensure consistency, maintainability, and reusability of IT solutions.
Assess and evaluate various technologies, platforms, frameworks, and tools to determine their suitability for the organization's IT landscape.
Consider factors such as performance, scalability, security, compatibility, and cost-effectiveness. Make recommendations and guide the selection of appropriate technologies.
Establish and enforce technology standards, guidelines, and best practices across the organization.
Design end-to-end solutions that align with business requirements and leverage Kafka, Kafka Streams, KSQL, MongoDB, PostgreSQL, and Kubernetes.
Collaborate with various stakeholders, including product owners, platform product owners, developers, and DevOps teams.
Apply expertise in software design patterns to develop scalable and maintainable solutions.
Demonstrate a comprehensive understanding of peripheral subjects such as data governance, ensuring data quality, availability, and security.
Integrate security practices seamlessly into the development and operations lifecycle, fostering a DevSecOps culture.
Key Experience
Most important
Fluent in Java and Python.
At least 7 years of production experience with Kafka in complex event processing systems (telecom, NW, banking transactions, NFT).
Demonstrated proficiency in Kafka and Kafka Streams for building real-time data processing applications.
Experience developing stream-processing analytics with the Kafka Streams API: working with the Processor API, transformers, and mappers; using record headers; developing custom SerDes; controlling offsets; customizing stream-thread configuration in KStream applications; and handling exceptions (see the illustrative sketch after this list).
At least 7 years of experience developing JSON and Avro schemas, and building producer, consumer, and Kafka Streams applications that use them.
Hands-on experience with Kubernetes.
Knowledge of MongoDB and PostgreSQL.
Experience running and developing Big Data solutions (e.g., Hadoop, Flink, Spark, AWS Kinesis).
Ability to integrate different event-streaming platforms, aligning synchronous and asynchronous processes, with experience in event latching and queue control.
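The Kafka Streams requirements above can be illustrated with a short sketch. The example below is a minimal, hypothetical illustration, not DSV's production pipeline: the "orders" topic, the OrderEvent record, and the topology are assumptions made for this example. It shows three of the techniques named in the list: a custom JSON SerDe, stream-thread configuration, and a deserialization exception handler, using the standard Kafka Streams API and Jackson.

```java
// Hypothetical Kafka Streams sketch: custom SerDe, thread control, exception handling.
// Topic names and the OrderEvent type are assumptions for illustration only.
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.Serializer;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.errors.LogAndContinueExceptionHandler;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Properties;

public class OrderStreamSketch {

    // Hypothetical event type carried on the input topic.
    public record OrderEvent(String orderId, double amount) {}

    // Custom JSON SerDe built from serializer/deserializer lambdas.
    static Serde<OrderEvent> orderSerde() {
        ObjectMapper mapper = new ObjectMapper();
        Serializer<OrderEvent> serializer = (topic, event) -> {
            try { return mapper.writeValueAsBytes(event); }
            catch (Exception e) { throw new RuntimeException(e); }
        };
        Deserializer<OrderEvent> deserializer = (topic, bytes) -> {
            try { return mapper.readValue(bytes, OrderEvent.class); }
            catch (Exception e) { throw new RuntimeException(e); }
        };
        return Serdes.serdeFrom(serializer, deserializer);
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-analytics-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Thread control: run two stream threads in this instance.
        props.put(StreamsConfig.NUM_STREAM_THREADS_CONFIG, 2);
        // Exception handling: skip records that fail to deserialize instead of crashing.
        props.put(StreamsConfig.DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG,
                LogAndContinueExceptionHandler.class);

        StreamsBuilder builder = new StreamsBuilder();
        builder.stream("orders", Consumed.with(Serdes.String(), orderSerde()))
               .filter((key, order) -> order.amount() > 1000.0)   // simple analytics step
               .to("large-orders", Produced.with(Serdes.String(), orderSerde()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

LogAndContinueExceptionHandler is only one reasonable choice for malformed records; in practice the handler, thread count, and serialization format (for example, Avro with a schema registry instead of hand-rolled JSON) would be chosen per use case.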
Fairly important
Implement and optimize observability solutions for monitoring and troubleshooting.
Ability to write high-quality, maintainable code in Java for proof of concept and prototype development.
Collaborate with development teams to provide guidance on implementation and ensure adherence to architectural standards.
Excellent problem-solving skills and the ability to navigate complex technical challenges.
Knowledge of data governance principles, ensuring data quality, availability, and integrity.
Least important
Experience with Azure Cloud services and understanding of cloud-native application development and deployment.
Background in designing, developing, and maintaining data platforms.
What we offer:
Employment contract
Private medical care
Comprehensive onboarding program
Buddy
Work-life Harmony
Modern eco-office
Canteen
Comfortable ergonomic office
Scandinavian work culture
Internal training catalogue
Culture of feedback
Internal transition program
Holiday gifts
Sport groups
Bike parking
DSV – Global transport and logistics
DSV is a dynamic workplace that fosters inclusivity and diversity. We conduct our business with integrity, respecting different cultures and the dignity and rights of individuals. When you join DSV, you are working for one of the best-performing companies in the transport and logistics industry. You’ll join a talented team of more than 75,000 employees in over 80 countries, working passionately to deliver great customer experiences and high-quality services. DSV aspires to lead the way towards a more sustainable future for our industry and is committed to trading on nature’s terms.
We promote collaboration and transparency and strive to attract, motivate, and retain talented people in a culture of respect. If you are driven, talented, and wish to be part of a progressive and versatile organisation, we’ll support you in achieving your potential and advancing your career.