JOB | Data Architect | Remote OK | Remote, USA
Email: [email protected]
From: Manoj Rathee, Sunray Enterprise, Inc. <[email protected]>
Reply-To: [email protected]

Hello,

I hope this email finds you well! We currently have the job position below available. Please review the job description and share your latest resume, a copy of your visa, and a photo ID so that I can submit your profile to the client.

Job Position: Data Architect
Location: PA, USA (Remote OK)
Duration: Long term
Required Skills: PySpark, Azure Data Factory (ADF), Power BI, Azure

Job Summary:
We are seeking an experienced Data Architect with expertise in PySpark, Azure Data Factory (ADF), and Power BI to design and implement scalable data solutions that meet business requirements. The ideal candidate will have a strong background in data engineering, data integration, and data visualization, with a focus on creating efficient, reliable, and secure data pipelines and reporting solutions.

Key Responsibilities:
- Data Architecture Design: Design and develop scalable, high-performance data architecture solutions using PySpark, ADF, and Power BI to support business intelligence, analytics, and reporting needs.
- Data Pipeline Development: Build and manage robust data pipelines using PySpark and Azure Data Factory, ensuring efficient extract, transform, and load (ETL) processes across various data sources.
- Data Modeling: Develop and maintain data models that optimize query performance and support the needs of analytics and reporting teams.
- Integration and Automation: Design and implement integration strategies to automate data flows between systems and ensure data consistency and accuracy.
- Collaboration: Work closely with data engineers, data analysts, business intelligence teams, and other stakeholders to understand data requirements and deliver effective solutions.
- Data Governance and Security: Ensure data solutions adhere to best practices in data governance, security, and compliance, including data privacy regulations and policies.
- Performance Optimization: Continuously monitor and optimize data processes and architectures for performance, scalability, and cost efficiency.
- Reporting and Visualization: Use Power BI to design and develop interactive dashboards and reports that provide actionable insights for business stakeholders.
- Documentation: Create comprehensive documentation for data architecture, data flows, ETL processes, and reporting solutions.
- Troubleshooting and Support: Provide technical support and troubleshooting for data-related issues, ensuring timely resolution and minimal impact on business operations.

Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- Experience: 15+ years of experience in data architecture and engineering, with a focus on PySpark, ADF, and Power BI; proven experience designing and implementing data pipelines, ETL processes, and data integration solutions; strong experience in data modeling and data warehouse design.
- Technical Skills: Proficiency in PySpark for big data processing and transformation; extensive experience with Azure Data Factory (ADF) for data orchestration and ETL workflows; strong expertise in Power BI for data visualization, dashboard creation, and reporting; knowledge of Azure services (e.g., Azure Data Lake, Azure Synapse) and other relevant cloud-based data technologies; strong SQL skills and experience with relational databases.
- Soft Skills: Excellent problem-solving and analytical skills; strong communication and collaboration skills, with the ability to work effectively with cross-functional teams; ability to manage multiple priorities in a fast-paced environment.
Preferred Qualifications:
- Certifications: Microsoft certifications related to Azure, Power BI, or data engineering are a plus.
- Experience: Experience in a similar role within a large enterprise environment is preferred.
- Leadership: Responsible for providing technical guidance to a team of developers, enhancing their technical capabilities and increasing productivity.

Hope to hear from you soon!
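For candidates unfamiliar with the ETL work this role centers on, below is a minimal, dependency-free Python sketch of the extract-transform-load pattern that a Data Architect would implement at scale with PySpark (there, the same steps map to spark.read, DataFrame filter/groupBy/agg, and DataFrame.write). The region/amount schema, sample records, and transform_sales helper are hypothetical, for illustration only:

```python
from collections import defaultdict

def transform_sales(rows):
    """Drop rows with a missing amount, then total amounts per region.

    In PySpark this roughly corresponds to:
    df.filter(col("amount").isNotNull())
      .groupBy("region").agg(sum("amount"))
    """
    totals = defaultdict(float)
    for row in rows:
        amount = row.get("amount")
        if amount is None:
            continue  # analogous to a null filter in PySpark
        totals[row["region"]] += float(amount)
    return dict(totals)

# Hypothetical extracted records (e.g., parsed from a CSV in a data lake).
raw = [
    {"region": "east", "amount": 100.0},
    {"region": "west", "amount": 50.5},
    {"region": "east", "amount": None},  # bad record, filtered out
    {"region": "west", "amount": 10.0},
]

print(transform_sales(raw))  # {'east': 100.0, 'west': 60.5}
```

In a production pipeline the aggregated result would then be loaded to a curated store (e.g., Parquet in a lake or a warehouse table) for Power BI to report on.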
Posted: Wed Oct 16 20:08:00 UTC 2024