Minimum qualifications:
Bachelor’s degree or equivalent practical experience
5 years of experience with software development in one or more programming languages, and with data structures/algorithms
3 years of experience testing, maintaining and/or launching software products, and 1 year of experience with software design and architecture
Experience with distributed systems
Preferred qualifications:
Master’s degree or PhD in Computer Science or related technical field
3 years of experience building and developing large-scale infrastructure, distributed systems or networks, and/or experience with compute technologies, storage, and/or hardware architecture
1 year of experience in a technical leadership role
Experience developing accessible technologies
Experience with C++, Python, Java, or Go
Experience with open source data analytics technologies and building large scale services
About the job
Google Cloud Dataflow is a fully managed service for transforming and enriching data in stream (real-time) and batch (historical) modes with equal reliability and expressiveness, with no need for complex workarounds or compromises. With its serverless approach to resource provisioning and management, you have access to virtually limitless capacity to solve your biggest data processing challenges, while paying only for what you use.
Apache Beam is an open-source, portable programming model for writing both batch and streaming data processing pipelines.
FlumeJava is a set of libraries and a language runtime system that you can use to create parallel data-processing pipelines.
If you enjoy building distributed systems, data processing systems, and customer-facing products, this role is for you.
In this role, you will play a significant part in defining the future of Google Cloud Dataflow. The Dataflow Platform space is relatively young and is expected to drive the next wave of Dataflow’s growth. You will also get the opportunity to help both internal and external users run some of the largest data processing jobs in the world.
Google Cloud accelerates organizations’ ability to digitally transform their business with the best infrastructure, platform, industry solutions and expertise. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology – all on the cleanest cloud in the industry. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.
Responsibilities
Lead projects to build customer-facing data processing products (e.g., Data Pipelines)
Work with engineers and experts in different groups in Google Cloud Platform and Google3
Work closely with the Dataflow Site Reliability Engineering team to ensure that Dataflow continues to run as an exceptional service
Mentor junior engineers in the team
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google’s EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.