• Data Platform Architect

    Job Locations US-AZ-Gilbert
    Information Technology
  • Position Summary

    The Data Platform Architect role is responsible for solution design focusing on the strategic direction of the organization’s data tier. This individual will work closely with IT leadership, peer Architects and Developers to provide world class solutions supporting a global enterprise infrastructure ensuring scalability, performance and availability of data.

    Essential Duties & Responsibilities (Other duties may be assigned)

    • Provide input on the long-term strategic vision and direction for the data delivery infrastructure, including SSAS implementations, NoSQL, and cloud-based computing strategies.
    • Evaluate current data platform and new technology trends and implementations to support future organizational growth and global expansion.
    • Develop proof of concept solutions around data services and storage to support complex challenges specific to organizational structures and requirements.
    • Support ALM/DLM strategies for world-class enterprise systems, meeting performance, scalability, and availability targets for data systems worldwide with limited maintenance and downtime per SLAs.
    • Provide ongoing technical guidance and direction for the development teams.
    • Work closely with Business Intelligence and Analytics teams to solidify architecture roadmap and implementations including Big Data and advanced analytics.

    Minimum Qualifications (These are the requirements that all applicants MUST HAVE to be considered for this position)

    • 10+ years of database development and data systems architecture experience.
    • Experience with Continuous Integration or Continuous Delivery.
    • Experience working in large-scale, agile development environments.
    • Strong analytical skills and ability to solve complex problems.
    • Ability to identify opportunities and recommend improvements.
    • Ability to deliver in the face of obstacles and challenges, including fast turnaround times.
    • Ability to work effectively both independently and in a team.
    • Excellent verbal and written communication skills.
    • Strong understanding of scalable cloud architectures.
    • Familiarity with NoSQL and distributed-storage implementations such as Cassandra, MongoDB, and Hadoop/HDFS.
    • Willingness to work outside normal business hours as required to complete projects, work on deployments, and resolve system outages.

    Preferred Qualifications

    • Strong knowledge of modern programming languages and frameworks.
    • Strong understanding of Azure or AWS technologies.
    • Hands-on experience with Big Data components/frameworks such as Hadoop, Spark, Storm, HBase, HDFS, Pig, Hive, Sqoop, Flume, Oozie, Avro, etc.
    • Proficiency in Java/C++, SQL/RDBMS, and data warehousing.
