Senior Software Engineer
1 year ago
  • Domain: IT Software
  • Availability: Full-time
  • Experience: Senior Level
  • Type of contract: Indeterminate term
  • Location: Budapest
  • Accommodation: No
  • Salary: To be determined
  • Verified company: Yes

Genesys is building the data platform of the future: a small team with a startup feel, backed by the financial stability of an industry leader. The real-time analytics team uses the latest Flink and Druid versions to process event streams with hundreds of millions of events per day. Our workload is constantly evolving as we keep growing at an exponential rate and new features are added. Analytics is a key part of our platform, powering both our own services and customer analytics.

Couple that with the flexibility to work remotely from anywhere in Hungary, and this is a great opportunity.


In this role, you'll partner with software engineers, product managers, and data scientists to build and support a variety of analytical big data products. The ideal candidate will have a strong engineering background, won't shy away from the unknown, and will be able to turn vague requirements into something real. Our team's focus is to operationalize big data products and curate high-value datasets for the wider organization, as well as to build the tools and services that expand the scope and improve the reliability of the data platform as our usage continues to grow daily.


You will:

  • Develop and deploy highly available, fault-tolerant software that will help drive improvements towards the features, reliability, performance, and efficiency of the Genesys Cloud Analytics platform. 
  • Actively review code, mentor, and provide peer feedback. 
  • Collaborate with engineering teams to identify and resolve pain points as well as evangelize best practices. 
  • Partner with various teams to transform concepts into requirements and requirements into services and tools. 
  • Engineer efficient, adaptable and scalable architecture for all stages of data lifecycle (ingest, streaming, structured and unstructured storage, search, aggregation) in support of a variety of data applications. 
  • Build abstractions and re-usable developer tooling to allow other engineers to quickly build streaming/batch self-service pipelines. 
  • Build, deploy, maintain, and automate large global deployments in AWS. 
  • Troubleshoot production issues and come up with solutions as required. 


This may be the perfect job for you if: 

  • You have a strong engineering background with the ability to design software systems from the ground up. 
  • You have expertise in Java.
  • You have experience in web-scale data and large-scale distributed systems, ideally on cloud infrastructure. 
  • You have a product mindset. You are energized by building things that will be heavily used. 
  • You have engineered scalable software using big data technologies (e.g., Hadoop, Spark, Hive, Presto, Flink, Samza, Storm, Elasticsearch, Druid, Cassandra). 
  • You have experience building data pipelines (real-time or batch) on large complex datasets. (huge plus)
  • You have worked on and understand messaging/queueing/stream processing systems. (huge plus)
  • You design not just with a mind for solving a problem, but also with maintainability, testability, monitorability, and automation as top concerns. 


Technologies we use and practices we hold dear: 

  • Right tool for the right job over we-always-did-it-this-way. 
  • We pick the language and frameworks best suited for specific problems. This usually translates to Java for developing services and applications and Python for tooling. 
  • Packer and Ansible for immutable machine images. 
  • AWS for cloud infrastructure. 
  • Infrastructure (and everything, really) as code. 
  • Automation for everything. CI/CD, testing, scaling, healing, etc. 
  • Flink and Kafka for stream processing. 
  • Hadoop, Hive, and Spark for batch. 
  • Airflow for orchestration. 
  • Druid, Dynamo, Elasticsearch, Presto, and S3 for query and storage. 

