Sr. Big Data Architect

Seeking a Data Architect who is a visionary in defining and managing data architecture, with expertise in data modelling, data marting, ETL, performance tuning, data governance, and data security, leveraging Big Data technologies and columnar and time-series data stores alongside traditional RDBMS.

RESPONSIBILITIES:
- Define system-level architecture and conduct dimensional modelling and data marting
- Mentor team members through conceptual and logical modelling, and drive physical modelling of the data marts
- Define data security protocols and enable access controls
- Conduct database performance tuning and architect low-latency data systems
- Build a Master Data Management strategy for the organization
- Build highly scalable data marts that can be used by teams globally
- Maintain data integrity across multiple data marts
- Build the overall data mart architecture and design, and document the data systems ecosystem
- Map data from sources to the data marts, and work with peer data engineering teams to pipeline the data
- Design and code highly scalable solutions for data extraction from the data lake and transformation jobs that apply business rules
- Define and parallelize ETL jobs for low-latency, highly scalable systems
- Architect, design in detail, and code data quality frameworks that measure and maintain data completeness, data integrity, and data validity between interfacing systems
- Document data mappings and maintain a data dictionary across all enterprise data
- Own the KPIs that measure the performance of the data marts, and provide visibility to senior management
- Design self-serve BI platforms and drive higher adoption rates

QUALIFICATIONS:
- Master's degree in Computer Science, Data Science, or a related major
- Minimum 10 years of industry experience overall
- 10+ years of data warehousing and data architecture, with 8+ years of data modelling and data processing for large-scale, near-real-time Big Data platforms such as Redshift, HBase, Druid, and Snowflake
- 8+ years architecting end-to-end self-serve BI platforms using BI tools such as Tableau, Qlik Sense, Looker, or similar preferred
- 8+ years of ETL knowledge and parallel-processing technologies such as Spark and Kafka streaming
- 5+ years of programming experience with Java, Python, or C/C++ in a Linux/Unix environment
- Minimum 2 years of working knowledge of cloud-based solutions hosted on AWS
- Experience with tools such as Confluence, GitHub, and JIRA preferred

Don't Be Fooled

The fraudster sends a check to a victim who has accepted a job. The check may be described as covering any of several things, such as a signing bonus or supplies. The victim is instructed to deposit the check, use the money for those purposes, and then send the remaining funds back to the fraudster. The check eventually bounces, and the victim is left responsible for the full amount.