Immediate Requirements - C2C & W2 (ADPMN) (ONLY USC & GC) at Chicago, Illinois, USA
Email: [email protected]
Hi, hope you are doing well. This is Lakshman from ADPMN. We have multiple job openings for the roles below; please read the job descriptions and kindly share matching resumes.

ROLE 1
Job Title: IT Project Manager
Location: Chicago, IL (Hybrid, 3 days a week)
Duration: 6 months
Job Description:
A functional background is an absolute must; the preference is also for someone with technical proficiency in DevOps and/or testing deployment.
- Minimum 7 to 10 years of excellent project management experience
- Strong organizational and communication skills, as well as the ability to multitask
- Communicate with stakeholders to keep them informed and resolve issues
- Identify and eliminate potential risks and blockers
- Create long- and short-term plans, including setting targets for milestones and adhering to deadlines
- Tracking progress: monitor projects to ensure they are on time, within budget, and using the right resources
- Providing support: offer administrative and operational support to project teams; manage documentation and status reporting at multiple levels, including executive level
- Tracking KPIs: track key performance indicators (KPIs) and forecast metrics to help with decision making; provide prompt, regular updates on multiple moving targets to the leader
- Hands-on experience with ADO, SharePoint, Office tools, and other PM tools

ROLE 2
Job Title: Data Pipeline & ETL Migration Specialist (Legacy Platform to Snowflake) (ONLY USC & GC)
Location: Remote
Duration: 6+ months
Job Description:
- Lead the migration of legacy data warehouse services (Oracle, SQL Server) into Snowflake for client-facing solutions, ensuring smooth data extraction, transformation, and loading (ETL)
- Implement Snowflake Time Travel and Snapshot DBs to replace outdated legacy services, enhancing data backup and recovery capabilities for clients
- Design and build automated pipelines to extract, transform, and load (ETL) data from various systems, including manual systems and systems of record (SoR)
- Ensure encryption protocols are implemented where necessary to protect PII and sensitive data during migration and post-migration processes
- Support manual data collection processes that involve moving data to S3 buckets, automating them where possible to reduce reliance on manual intervention
- Collaborate with the network and security teams to address potential security issues, particularly with data grabber solutions used for pulling data from client systems
- Optimize data flow architectures and monitoring to support efficient data extraction from legacy systems to Snowflake, ensuring seamless integration and performance
- Develop and deploy solutions to automate manual tasks and improve the scalability of data extraction processes for 40+ channels
- Troubleshoot issues related to data migration, such as system integration and data compatibility, while working closely with clients
- Oversee the integration of security protocols into the network architecture, ensuring encryption standards (SSL/TLS, PGP) are adhered to at every stage of the data flow

Skills & Qualifications:
- Strong experience in SQL Server to Snowflake migrations, with specific knowledge of Snowflake Time Travel and Snapshot DBs for data recovery
- Expertise in building and optimizing ETL pipelines using S3 and Snowflake Tasks (or other DAG tools) for efficient data processing and transformation
- Knowledge of data encryption techniques and secure handling of PII within cloud data platforms like Snowflake
- Familiarity with legacy systems, particularly Oracle and client-facing data services, and experience in automating manual data collection processes
- Ability to troubleshoot complex data issues related to legacy dependencies, such as ETL tools and reporting systems
- Hands-on experience with data security, particularly in addressing potential security issues related to external data pull processes (e.g., data grabber)
- Excellent collaboration skills to work with both internal teams and clients, ensuring seamless migration and minimal disruption to ongoing operations
- Strong problem-solving skills to resolve data migration challenges, such as system incompatibilities and data flow optimization
- In-depth knowledge of data encryption, key management, and secure transmission methods (SFTP, FTPS) in cloud environments
- Experience with cloud-native data extraction tools (AWS Glue, Snowflake Tasks) for building and optimizing ETL pipelines

ROLE 3
Job Title: Reporting & Business Intelligence Specialist (Cognos to Power BI) (ONLY USC & GC)
Location: Remote
Duration: 6+ months
Job Description:
- Lead the migration of legacy Cognos reports to modern BI platforms like Power BI, ensuring continuity and minimal disruption during the transition
- Analyze and optimize Cognos reports for clients, ensuring they meet business requirements and can be effectively migrated to Power BI
- Design and develop Power BI dashboards and reports that utilize the capabilities of Snowflake for real-time data access and reporting
- Work with clients to modernize their ETL and reporting stacks, ensuring seamless integration with Snowflake and improved performance
- Provide ongoing support for migrated reports, ensuring they remain compliant with data governance and security requirements
- Optimize reporting processes by using Snowflake's PL layer to streamline data queries and improve reporting performance
- Collaborate with clients to address any dependencies on legacy systems, ensuring that data and reporting flows are not disrupted
- Develop and implement best practices for BI reporting, focusing on data accuracy, usability, and the ability to generate insights quickly
- Lead initiatives to modernize reporting infrastructure, moving from static Cognos reports to dynamic, interactive Power BI dashboards
- Troubleshoot reporting issues, especially those related to data integration, and ensure that Snowflake-based reporting solutions are optimized for real-time analytics

Skills & Qualifications:
- Deep knowledge of Cognos and Power BI, with experience migrating legacy reports to modern BI platforms
- Proficiency in using Snowflake for reporting and data visualization, with experience optimizing data flows from legacy systems to cloud BI tools
- Strong experience in designing scalable BI solutions that improve reporting performance and user experience
- Knowledge of ETL processes and their integration with reporting tools, particularly in environments reliant on Snowflake
- Familiarity with data governance and compliance standards to ensure secure, accurate reporting

ROLE 4
Job Title: Cloud Data Architect (Snowflake Implementation) (ONLY USC & GC)
Location: Remote
Duration: 6+ months
Job Description:
- Architect and design Snowflake-based data warehousing solutions, ensuring the successful migration of legacy Oracle and SQL Server databases to a modern cloud infrastructure
- Develop and maintain data flow architectures for multi-tenant systems, including centralized data management solutions for 40+ client channels
- Collaborate with internal teams to implement data grabber technology for automating data extraction from client systems, ensuring compliance with security protocols
- Build data models and optimize Snowflake environments to enable real-time data sharing, query optimization, and data governance
- Lead the integration of legacy ETL processes and reporting tools into Snowflake, ensuring secure, efficient, and compliant data workflows
- Implement encryption protocols for PII protection, key management, and secure data sharing across the Snowflake platform
- Provide guidance on centralizing client data within a single Snowflake organization, using account partitioning to improve administrative efficiency and simplify data management
- Collaborate with security teams to ensure that data governance frameworks are implemented, including monitoring, logging, and auditing requirements
- Work with clients to address their data dependencies, transitioning them from manual systems to automated Snowflake-based workflows
- Stay current with the latest Snowflake features and advancements, recommending architecture improvements and new tools to optimize the data infrastructure

Skills & Qualifications:
- Proven experience in Snowflake architecture, with expertise in Oracle and SQL Server migrations to cloud data platforms
- Strong understanding of multi-tenant architectures, including the use of partitioned accounts to centralize and manage client data securely
- Knowledge of ETL tools, data grabbers, and legacy data workflows, with experience optimizing data flows to Snowflake
- Expertise in data security, particularly in financial environments where PII encryption, key management, and secure data sharing are critical
- Ability to collaborate with security, data engineering, and client teams to ensure the successful implementation of Snowflake-based solutions

ROLE 5 (ONLY W2)
Job Title: Scaled Agile Framework (SAFe) Product Owner (Certified)
Location: Washington, District of Columbia
Duration: 12 Months
Required: SAFe 6.0 Product Owner/Product Manager certification
Job Description:
As a Scaled Agile Framework (SAFe) Product Owner for a client in Washington, DC, you will be an essential part of an Agile Release Train (ART) team focused on creating a robust self-service and advanced analytics platform. Working closely with Product Management, Architecture, and other stakeholders, you'll define, prioritize, and validate solutions that drive effective data governance, data quality, and self-service analytics. Your role will ensure the platform empowers business users to access, manage, and trust data across the organization.

Responsibilities:
- Product Ownership: Serve as the primary customer advocate, managing the product backlog to reflect business priorities and maximize value.
- Vision and Strategy: Develop and communicate a clear product vision aligned with business goals and user needs.
- Prioritization: Rank features and user stories based on business value, user feedback, and technical feasibility to focus the team on high-impact items.
- Requirements Gathering & Feature Development: Collaborate with product management and team members to define features, user stories, and acceptance criteria for a high-quality data management platform.
- Stakeholder Engagement: Engage business leaders, customers, and development teams to ensure alignment with the product vision.
- Cross-Team Collaboration: Work with other Product Owners to integrate data governance, metadata management, data quality, and stewardship into platform initiatives.
- Agile Framework: Apply and promote SAFe principles, ensuring adherence to agile methodologies.
- Roadmap Development: Contribute to a roadmap outlining product delivery schedules and key milestones.
- Metrics & Analysis: Track KPIs to assess product success and drive continuous improvement.
- Leadership & Communication: Provide leadership and clear communication, bridging business and technical teams.

Qualifications:
- Experience in agile methodologies, the SAFe framework, and data-driven decision-making
- Skilled in Jira, Confluence, FigJam, and other agile tools
- Knowledge of data management, governance, and cloud-based solutions is preferred

Best Regards,
Lakshman PS | US IT & Bench Sales Recruiter | Adpmn Inc
Contact: +1 678-395-0799, Ext: 110
E-mail: [email protected]
Address: 1740 Grassland Parkway, Unit #305, Alpharetta, GA 30004
www.adpmn.com
LinkedIn: https://www.linkedin.com/in/lakshman-ps-84648128b

This email is generated using CONREP software.
Fri Nov 01 01:10:00 UTC 2024