Senior Cloud Engineer


Office: North America Region

Job Functions: Engineering

Corelight is an early-stage cybersecurity startup based in downtown San Francisco, with offices in Columbus, OH and Santa Clara, CA. We secure some of the most sensitive and mission-critical networks in the world, and our rapidly growing customer base includes eight of the Fortune 50. We help our customers know their networks better than anyone else, delivering insight that dramatically shifts the way that SOCs react to, and defend against, attacks. Corelight provides vital assistance at every phase, from alert to response and beyond, through rich, structured, pivotable data.

Our culture is open and accepting. We incorporate data, ideas, and evidence from anywhere, and use them to drive success. We take thoughtful risks and build on each other’s work to uncover the best ideas. We serve our customers, the open-source community, and each other. 

Be on the leading edge of applying machine learning and data modeling to the world of cybersecurity. Corelight is looking for Senior Data Engineers with Java, Scala, or Python experience building large-scale data pipelines with Apache Spark.

Specific responsibilities include:

  • Design and develop data infrastructure systems that reliably ingest and transform billions of events per day from different sources and customers at scale.
  • Provide real-time streaming and querying capability over large volumes of data in a multi-tenant SaaS environment.
  • Guide our data scientists and engineers in building and optimizing data pipelines, developing prototypes, and incorporating cutting-edge machine learning algorithms to detect security threats.
  • Work with Product and Engineering teams to design new solutions, develop prototypes, and deploy them to production.
  • Mentor junior members of the development team and grow into technical leadership roles.

Required experience:

  • 5 years' experience building large-scale data pipelines with big data processing and messaging tools such as Apache Spark and Kafka, and NoSQL databases such as Cassandra or DynamoDB.
  • 3 years' software development experience building applications on public cloud services. Working knowledge of the services offered by AWS, Azure, and GCP, with expertise in at least one of them.
  • Experience building and deploying containerized applications with Docker and orchestrating them with Kubernetes.
  • Understanding of machine learning techniques and the ML model lifecycle, including open-source libraries such as scikit-learn and TensorFlow.
  • Experience with open-source tools such as Elastic, Grafana, Prometheus, and Airflow, and monitoring services such as New Relic and Datadog, is a plus.
  • Passion and curiosity to learn, prototype, and deploy new technologies and solve hard engineering problems.
  • Team player with excellent communication skills.

A note on experience

We understand that no candidate is perfectly qualified for any job. Experience comes in different forms; many skills are transferable; and passion goes a long way. Even more important than your resume is a clear demonstration of dedication, impact, and the ability to thrive in a fluid and collaborative environment. We want you to learn new things in this role, and we encourage you to apply if your experience is close to what we’re looking for.

We also know that diversity of background and thought makes for better problem solving and more creative thinking, which is why we're dedicated to adding new perspectives to the team.

Working at Corelight

In addition to helping make networks safer around the world, Corelight is a great place to work. We provide competitive salaries, equity, and benefits, but those are just table stakes. No matter where you're based, we aspire to make working here one of the best experiences of your career.