Databricks

Solutions Architect

To accelerate innovation for its customers by unifying Data Science, Engineering and Business

  • Software engineering

  • Full-time

  • Office | Sydney, NSW, Australia

  • Visa sponsorship · No

  • Mid Level · A role for someone with some well-developed knowledge and skills they can bring to the role and team. Typically within 2-5 years of experience.


Why Databricks

Today, more than 7,000 organizations worldwide — including ABN AMRO, Condé Nast, H&M Group, Regeneron and Shell — rely on Databricks to enable massive-scale data engineering, collaborative data science, full-lifecycle machine learning and business analytics.

Headquartered in San Francisco, with offices around the world and hundreds of global partners, including Microsoft, Amazon, Tableau, Informatica, Cap Gemini and Booz Allen Hamilton, Databricks is on a mission to simplify and democratize data and AI, helping data teams solve the world’s toughest problems.

About the role

As a Solutions Architect, you will work with customers on the Databricks Data Intelligence Platform, helping data teams complete projects and integrate our platform into their enterprise ecosystem. You'll grow as a leader in your field while finding solutions to our customers' biggest challenges in big data, analytics, data engineering and data science. You will report to the Field Engineering Manager of your assigned territory and segment.

The impact you will have:

  • You will be a Big Data expert on public cloud architecture and design
  • Lead your prospects through evaluating and adopting Databricks
  • Support your customers by authoring reference architectures, how-tos, and demo applications
  • Integrate Databricks with 3rd-party applications to support customer architectures
  • Engage with the technical community by leading workshops, seminars and meet-ups
  • Together with your Account Executive, you will form successful relationships with clients throughout your assigned territory to provide technical and business value

What we look for:

  • Pre-sales or post-sales experience working with external clients across a variety of industry markets
  • Experience in a customer-facing pre-sales or consulting role; a core strength in Data Engineering or Cloud Architecture is advantageous
  • Experience demonstrating technical concepts, including presenting and whiteboarding
  • Experience designing and implementing architectures within public clouds (AWS, Azure or GCP)
  • Experience with Big Data technologies, including Apache Spark™, AI, Data Science, Data Engineering, Hadoop, Cassandra and others
  • Fluent coding experience implementing Apache Spark™ in Python, Scala or Java
  • Experience with Data Science and related technologies (R and various algorithms) would be a plus

What you'll be responsible for

  • 🔀

    Technology Solution Design and Development

    Design and develop customized technology solutions, such as software applications, databases, networks, and platforms

  • 🖥

    Technology Implementation

    Manage technology implementation projects, including budget, timeline, and resources

  • 💬

    Client Communication and Relationship Building

    Communicate with clients and stakeholders to build relationships, gather feedback, and ensure satisfaction with services

Skills you'll need

  • 🤔

    Decision Making

    Considers the costs and benefits of potential actions and determines the most appropriate one

  • 💭

    Critical thinking

    Identifies and synthesizes patterns and trends amongst various sources of information to reach a meaningful conclusion, perspective or insight

  • 💡

    Problem solving

    Identifies problems and develops logical solutions that address the problems

Meet the team

Engineering

Databricks

Our global engineering team develops and operates one of the largest-scale software platforms. With global customers in every vertical — from healthcare to software — our team collaborates and innovates together to solve multicloud, distributed systems, data visualization and infrastructure challenges. Join our team to build and evolve our product — together, we can revolutionize the future of data and AI.