Immediate Requirements - C2C (ADPMN) (ONLY USC & GC) - Remote, USA
Email: [email protected]
Hi, I hope you are doing well. This is Lakshman from ADPMN. We have multiple job openings for the roles below; please read the job descriptions and kindly share matching resumes.

ROLE 1:
Job Title: Data Pipeline & ETL Migration Specialist (Legacy Platform to Snowflake) (ONLY USC & GC)
Location: Remote
Duration: 6+ months

JOB DESCRIPTION:
Lead the migration of legacy data warehouse services (Oracle, SQL Server) into Snowflake for client-facing solutions, ensuring smooth data extraction, transformation, and loading (ETL).
Implement Snowflake Time Travel and Snapshot DBs to replace outdated legacy services, enhancing data backup and recovery capabilities for clients (see the brief sketch after this role's requirements).
Design and build automated pipelines to extract, transform, and load (ETL) data from various systems, including manual systems and systems of record (SoR).
Ensure encryption protocols are implemented where necessary to protect PII and sensitive data during migration and post-migration processes.
Support manual data collection processes that involve moving data to S3 buckets, automating them where possible to reduce reliance on manual intervention.
Collaborate with the network and security teams to address potential security issues, particularly with data grabber solutions used for pulling data from client systems.
Optimize data flow architectures and monitoring to support efficient data extraction from legacy systems to Snowflake, ensuring seamless integration and performance.
Develop and deploy solutions to automate manual tasks and improve the scalability of data extraction processes for 40+ channels.
Troubleshoot issues related to data migration, such as system integration and data compatibility, while working closely with clients.
Oversee the integration of security protocols into the network architecture, ensuring encryption standards (SSL/TLS, PGP) are adhered to at every stage of the data flow.

Skills & Qualifications:
Strong experience in SQL Server to Snowflake migrations, with specific knowledge of Snowflake Time Travel and Snapshot DBs for data recovery.
Expertise in building and optimizing ETL pipelines using S3 and Snowflake Tasks (or other DAG tools) for efficient data processing and transformation.
Knowledge of data encryption techniques and secure handling of PII within cloud data platforms like Snowflake.
Familiarity with legacy systems, particularly Oracle and client-facing data services, and experience automating manual data collection processes.
Ability to troubleshoot complex data issues related to legacy dependencies, such as ETL tools and reporting systems.
Hands-on experience with data security, particularly in addressing potential security issues related to external data pull processes (e.g., data grabbers).
Excellent collaboration skills to work with both internal teams and clients, ensuring seamless migration and minimal disruption to ongoing operations.
Strong problem-solving skills to resolve data migration challenges, such as system incompatibilities and data flow optimization.
In-depth knowledge of data encryption, key management, and secure transmission methods (SFTP, FTPS) in cloud environments.
Experience with cloud-native data extraction tools (AWS Glue, Snowflake Tasks) for building and optimizing ETL pipelines.
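A minimal illustrative sketch of the kind of work this role describes, assuming the snowflake-connector-python package, an existing S3 external stage, and placeholder credentials and object names (none of these are specified in the posting):

```python
# Sketch only: load staged S3 files into Snowflake, then use Time Travel to
# read the table as it existed before the load, for backup/recovery checks.
# Account, credentials, stage, and table names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="***",         # use a secrets manager or key-pair auth in practice
    warehouse="ETL_WH",
    database="CLIENT_DB",
    schema="RAW",
)
cur = conn.cursor()

# Load the latest extracts dropped into the S3 bucket by the collection process.
cur.execute("""
    COPY INTO RAW.CLIENT_ORDERS
    FROM @CLIENT_S3_STAGE/orders/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    ON_ERROR = 'ABORT_STATEMENT'
""")

# Time Travel: query the table as it looked one hour ago, e.g. to verify or
# recover data if a load went wrong (replacing legacy snapshot databases).
cur.execute("""
    SELECT COUNT(*)
    FROM RAW.CLIENT_ORDERS AT(OFFSET => -3600)
""")
print("row count one hour ago:", cur.fetchone()[0])

cur.close()
conn.close()
```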
ROLE 2:
Job Title: Reporting & Business Intelligence Specialist (Cognos to Power BI) (ONLY USC & GC)
Location: Remote
Duration: 6+ months

JOB DESCRIPTION:
Lead the migration of legacy Cognos reports to modern BI platforms like Power BI, ensuring continuity and minimal disruption during the transition.
Analyze and optimize Cognos reports for clients, ensuring they meet business requirements and can be effectively migrated to Power BI.
Design and develop Power BI dashboards and reports that utilize the capabilities of Snowflake for real-time data access and reporting.
Work with clients to modernize their ETL and reporting stacks, ensuring seamless integration with Snowflake and improved performance.
Provide ongoing support for migrated reports, ensuring they remain compliant with data governance and security requirements.
Optimize reporting processes by using Snowflake's PL layer to streamline data queries and improve reporting performance.
Collaborate with clients to address any dependencies on legacy systems, ensuring that data and reporting flows are not disrupted.
Develop and implement best practices for BI reporting, focusing on data accuracy, usability, and the ability to generate insights quickly.
Lead initiatives to modernize reporting infrastructure, moving from static Cognos reports to dynamic, interactive Power BI dashboards.
Troubleshoot reporting issues, especially related to data integration, and ensure that Snowflake-based reporting solutions are optimized for real-time analytics (see the sketch after this role's requirements).

Skills & Qualifications:
Deep knowledge of Cognos and Power BI, with experience migrating legacy reports to modern BI platforms.
Proficiency in using Snowflake for reporting and data visualization, with experience optimizing data flows from legacy systems to cloud BI tools.
Strong experience in designing scalable BI solutions that improve reporting performance and user experience.
Knowledge of ETL processes and their integration with reporting tools, particularly in environments reliant on Snowflake.
Familiarity with data governance and compliance standards to ensure secure, accurate reporting.
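A minimal illustrative sketch of the kind of ongoing support this role describes, assuming a Power BI dataset already connected to Snowflake, placeholder workspace/dataset IDs, and an Azure AD access token obtained separately; it triggers and checks a dataset refresh through the Power BI REST API:

```python
# Sketch only: kick off a Power BI dataset refresh so a migrated dashboard
# picks up the latest Snowflake data, then inspect recent refresh attempts.
# The token and IDs below are placeholders.
import requests

ACCESS_TOKEN = "eyJ..."                                   # placeholder Azure AD token
GROUP_ID = "00000000-0000-0000-0000-000000000000"         # placeholder workspace id
DATASET_ID = "11111111-1111-1111-1111-111111111111"       # placeholder dataset id

BASE = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Trigger a refresh; email on failure so support can follow up.
resp = requests.post(
    f"{BASE}/refreshes",
    headers=HEADERS,
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()

# Review the five most recent refresh attempts and their status.
history = requests.get(f"{BASE}/refreshes?$top=5", headers=HEADERS)
history.raise_for_status()
for refresh in history.json().get("value", []):
    print(refresh.get("status"), refresh.get("startTime"))
```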
ROLE 3:
Job Title: Cloud Data Architect (Snowflake Implementation) (ONLY USC & GC)
Location: Remote
Duration: 6+ months

JOB DESCRIPTION:
Architect and design Snowflake-based data warehousing solutions, ensuring the successful migration of legacy Oracle and SQL Server databases to a modern cloud infrastructure.
Develop and maintain data flow architectures for multi-tenant systems, including centralized data management solutions for 40+ client channels.
Collaborate with internal teams to implement data grabber technology for automating data extraction from client systems, ensuring compliance with security protocols.
Build data models and optimize Snowflake environments to enable real-time data sharing, query optimization, and data governance.
Lead the integration of legacy ETL processes and reporting tools into Snowflake, ensuring secure, efficient, and compliant data workflows.
Implement encryption protocols for PII protection, key management, and secure data sharing across the Snowflake platform.
Provide guidance on centralizing client data within a single Snowflake organization, using account partitioning to improve administrative efficiency and simplify data management.
Collaborate with security teams to ensure that data governance frameworks are implemented, including monitoring, logging, and auditing requirements.
Work with clients to address their data dependencies, transitioning them from manual systems to automated Snowflake-based workflows.
Stay up to date with the latest Snowflake features and advancements, recommending architecture improvements and new tools to optimize the data infrastructure.

Skills & Qualifications:
Proven experience in Snowflake architecture, with expertise in Oracle and SQL Server migrations to cloud data platforms.
Strong understanding of multi-tenant architectures, including the use of partitioned accounts to centralize and manage client data securely.
Knowledge of ETL tools, data grabbers, and legacy data workflows, with experience optimizing data flows to Snowflake.
Expertise in data security, particularly in financial environments where PII encryption, key management, and secure data sharing are critical.
Ability to collaborate with security, data engineering, and client teams to ensure the successful implementation of Snowflake-based solutions.

ROLE 4:
Job Title: UX Designer
Duration: 6 Months Contract
Location: Mountain View, CA (Need Locals)
Description: Software Engineer (Contractor), Front-End Web Applications

Responsibilities:
Work as part of a team to design, develop, test, deploy, maintain, and improve software.
Aid in code reviews for fellow team members, as required.
Create unit tests to help ensure code quality throughout the application's life cycle.
Analyze and improve the efficiency, scalability, and stability of various system resources once deployed.
Continue to improve code quality by tracking, reducing, and avoiding technical debt.

Required Knowledge and Skills:
3+ years of experience with Angular 9 or above.
1+ years of experience working with UX/Figma.
Significant experience building web-based applications and RESTful APIs.
Experience shipping new features in a SPA environment.
Thoughtful about creating the right architecture while recognizing the realities of having customers and the need to ship software.
Understand agile and enjoy working in 2-week release cycles.
A can-do attitude and the ability to make a positive impact on our culture.
Ability to always put the customer first.
Enjoy helping mentor junior engineers.

Preferred Knowledge and Skills:
1+ years of experience building a shared component library or contributing to a design system.
1+ years of experience with Nx Monorepo.
1+ years of experience deploying applications in the public cloud using technologies like Azure, AWS, Docker, Kubernetes.
1+ years of experience building event-driven architectures using messaging systems/service bus, such as Kafka or RabbitMQ.
1+ years of experience with a microservices architecture.
1+ years of public cloud experience deploying service-oriented and microservices architectures.

Best Regards,
Lakshman PS | US IT & Bench Sales Recruiter | Adpmn Inc
Contact: +1 678-395-0799; Ext: 110
E-mail: [email protected]
Address: 1740 Grassland Parkway, Unit #305, Alpharetta, GA 30004
www.adpmn.com
LinkedIn: https://www.linkedin.com/in/lakshman-ps-84648128b