Hiring - Windchill Developer || Certified GCP Data Engineer || Data Architect at Remote, USA
Email: [email protected]
From:

Ashwini Sithar,

Propelsys

[email protected]

Reply to: [email protected]

Hi,

Hope you are doing well.

We have an urgent requirement with our implementation partner for the position below.

Job Title - Sr. Windchill Developer

Location - Remote (PST Time zone)

Rate - Open (DOE)

Visa - USC/GC/EAD preferred for FTE

RESPONSIBILITIES INCLUDE:

 
Executing all phases of PLM projects (implementation, upgrade, and migration), including assessment of the current state, development, and other facets of the PLM project lifecycle.

Developing and customizing the PTC Windchill application, alongside offshore developers, to support implementation, upgrade, and other projects. Experience must cover the following modules and functionality:

1. PDMLink, ProjectLink, QMS, MPMLink

2. ThingWorx customization

3. Windchill Business Administration

4. Windchill Functional Knowledge

Strong Windchill customization and development skills, including at least 1-2 years of architecture work.

Actively participating in all stages of the project, from development through testing, by supporting the rest of the project team.

Developing relationships with clients and supporting them during the project lifecycle.

Meeting with teammates and utilizing their expertise to maximize project efficiency.

Attention to detail; documenting methods and lessons learned.

EXPERIENCE AND REQUIRED SKILLS:

Prior experience in the semiconductor industry is preferred.

Ability to code to industry standards for the PTC Windchill application stack, work in agile environments, manage offshore teams, and drive toward milestone goals.

Must work in the US PST time zone.

Good communication and interpersonal skills are preferred

Some travel required

Bachelor's degree, preferably in Computer Science, or equivalent experience in the development industry

Strong experience with Java/J2EE programming, specifically with the Windchill API

Experience with ERP integrations preferred

Web-based development experience (JSP, JavaScript, MVC, Spring) preferred

In-depth knowledge of object-oriented methodologies and API development preferred

Database experience (Oracle, SQL Server) is a plus

Hi,

Hope you are doing well.

We have an urgent requirement with our implementation partner for the position below.

Local candidates only from MA, NH, or RI; must have recent client experience in these locations.

An in-person interview is required for the second round.

12+ years of experience

Title: Data Architect

Location: Everett, MA (Hybrid)

Duration: 6 Months with a possible extension

Client Job Description                    

1. Overall experience in Data Integration, Data Architecture, Data Modeling, and implementation.

2. Full lifecycle data warehousing experience.

3. In-depth, hands-on experience designing, implementing, and troubleshooting ETL/ELT processes.

4. Knowledge of relational database designs and concepts including normalization and dimensional models.

5. Knowledge of state longitudinal data systems and of preparing them for technological advancements and new features.

6. Experience with Apache Airflow and creating DAGs to move data from source to target (see the sketch after this list).

7. Experience with Python coding.

8. Develop system/integration test plans and scenarios for data loads and extracts.

9. Manage and monitor production and non-production ETL sessions.

10. Ability to coordinate and work with network and other system administration teams to isolate resource bottlenecks (bandwidth/network, CPU, memory, disk/storage).

11. Good knowledge of applicable data privacy practices and laws (FERPA).

12. Knowledge of standards in the education domain: CEDS, Ed-Fi, SIF.

13. Familiarity with student academic performance data and education data concepts.

14. BI knowledge: 3-5 years in BI from an ETL perspective.

15. Ability to monitor pipeline steps, ascertain the success or failure of each step, and determine the root cause of failures.

16. Direct experience in implementing enterprise data management processes, procedures, and decision support.

17. Experience in Dimensional Modeling.

18. Experience with Snowflake is a plus.

19. Experience with modern data management in the cloud, preferably AWS (e.g., RDS, DynamoDB, S3).
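
As a rough illustration of items 6 and 7, a minimal Airflow DAG that moves data from a source to a target could look like the sketch below. This is only a sketch under stated assumptions: the task logic is stubbed, and the DAG id, schedule, and sample rows are hypothetical placeholders rather than anything from the client environment.

    # Minimal Airflow DAG sketch: extract from a source, then load to a target.
    # The DAG id, schedule, and stubbed rows are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        # Pull rows from the source system (stubbed with sample data here).
        rows = [("student_1", 92), ("student_2", 85)]
        context["ti"].xcom_push(key="rows", value=rows)

    def load(**context):
        # Read the extracted rows and write them to the target (stubbed).
        rows = context["ti"].xcom_pull(task_ids="extract", key="rows")
        print(f"Loading {len(rows)} rows into the target table")

    with DAG(
        dag_id="source_to_target_example",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # extract must finish before load runs

In practice each task would call the real source and target systems; XCom is used here only to keep the sketch self-contained.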

The client needs a candidate to migrate the largest on-premise Teradata data warehouse to GCP BigQuery.

GCP Data Engineer

Dallas, TX

   Ensures seamless integration of data from different sources, such as databases, application programming interfaces (APIs), or streaming platforms.

   Optimizes data processing and query performance by fine-tuning data pipelines, database configurations, and data partitioning strategies (a partitioning sketch follows this list).

   Establishes data quality checks and validations to identify and resolve data issues, ensuring high-quality and reliable data for downstream applications and analytics.

   Implements security measures to protect sensitive data throughout the data lifecycle by working closely with security teams to ensure data encryption, access controls, and compliance with data protection regulations.

   Collaborates with cross-functional teams, including data scientists, analysts, software engineers, and business stakeholders.

   Designs and develops data infrastructure, including data warehouses, data lakes, and data pipelines.

   Establishes auditing and monitoring mechanisms to track data access and maintain data governance standards.

   Establishes monitoring and alerting mechanisms to identify bottlenecks, latency, or failures, and troubleshoots data-related problems, investigates root causes, and implements remediation measures.

   Explores new frameworks, platforms, or cloud services that can enhance data processing capabilities.

   Leverages DevSecOps practices.
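
To make the partitioning bullet concrete, the sketch below creates a date-partitioned BigQuery table so that date-filtered queries prune non-matching partitions instead of scanning the whole table. The project, dataset, table, and column names are hypothetical placeholders.

    # Sketch: create a date-partitioned BigQuery table to limit query scans.
    # Project, dataset, table, and column names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    table = bigquery.Table(
        "my-project.analytics.events",
        schema=[
            bigquery.SchemaField("event_id", "STRING"),
            bigquery.SchemaField("event_ts", "TIMESTAMP"),
            bigquery.SchemaField("payload", "STRING"),
        ],
    )
    # Partition on the event timestamp so queries filtered by date
    # scan only the matching daily partitions.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        field="event_ts",
    )
    client.create_table(table, exists_ok=True)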

Position Summary:

   Understands the enterprise data systems and acquires knowledge of the processes needed for project delivery.

   Participates in the project estimation process and provides input to the Tech Lead.

   Participates in Agile scrum activities and project status meetings on a regular basis.

   Participates in user story grooming and design discussions with the technical lead.

   Analyzes complex data structures from disparate data sources and designs large-scale data engineering pipelines.

   Uses strong programming skills to build robust data pipelines for ETL (Extract/Transform/Load) processes, designs database systems, and develops tools for data processing.

   Performs all data engineering activities across EDW/ETL project development, testing, and deployment.

   Works closely with developers on ETL job and pipeline development.

   Creates project process automation by integrating the components involved.

   Documents data engineering processes, workflows, and systems for reference and knowledge-sharing purposes.

   Implements data quality checks and validation processes to ensure the accuracy, completeness, and consistency of the data (a sketch follows this list).

   A team player who works with team members on business solutions and implementation.
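
One way to implement the data quality bullet above is to run a set of SQL probes after each load, where each probe counts offending rows and a non-zero count fails the load. The sketch below assumes BigQuery as the target; the table, columns, and checks are hypothetical placeholders.

    # Sketch: post-load data quality checks against a warehouse table.
    # The table, columns, and checks are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()
    TABLE = "my-project.edw.fact_orders"

    # Each check returns a count of offending rows; zero means it passes.
    checks = {
        "null primary keys":
            f"SELECT COUNT(*) FROM `{TABLE}` WHERE order_id IS NULL",
        "duplicate primary keys":
            f"SELECT COUNT(*) FROM ("
            f" SELECT order_id FROM `{TABLE}`"
            f" GROUP BY order_id HAVING COUNT(*) > 1)",
    }

    for name, sql in checks.items():
        bad_rows = list(client.query(sql).result())[0][0]
        if bad_rows:
            raise ValueError(f"Quality check failed: {name} ({bad_rows} rows)")
        print(f"Quality check passed: {name}")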

Preferred Qualifications:

  
   GCP experience: BigQuery, Cloud SQL, Python, Cloud Composer/Airflow, Cloud Storage, and Dataflow/Data Fusion

   Hands-on experience building and deploying data transformation and processing solutions using Teradata utilities (BTEQ, TPT, FastLoad, and SQL queries); see the Teradata-to-BigQuery sketch after this list.

   GCP Data Engineer certification strongly preferred.

   Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources.

   Strong problem-solving skills and critical thinking ability

   Strong collaboration and communication skills within and across teams

   Knowledge of Flask, JavaScript, HTML, CSS, and Django

   Knowledge of BI tools such as MicroStrategy and Tableau

   Must understand software development methodologies including waterfall and agile.

   Healthcare/PBM domain experience

   Excellent communication and presentation skills.
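
Tying the Teradata items here to the BigQuery migration noted above, the sketch below copies one table from an on-premise Teradata system into BigQuery in chunks. Hosts, credentials, and table names are hypothetical placeholders, and a production migration would more likely use bulk exports (e.g., TPT or FastExport) with GCS staging rather than row-by-row reads.

    # Sketch: copy one Teradata table into BigQuery in chunks.
    # Hosts, credentials, and table names are hypothetical placeholders.
    import pandas as pd
    import teradatasql
    from google.cloud import bigquery

    bq = bigquery.Client()
    TARGET = "my-project.migration.sales"  # BigQuery destination table

    with teradatasql.connect(host="td-prod.example.com",
                             user="etl_user", password="...") as con:
        # Stream the source table in chunks so large tables fit in memory;
        # each load job appends its chunk to the destination table.
        for chunk in pd.read_sql("SELECT * FROM sales_db.sales", con,
                                 chunksize=100_000):
            job = bq.load_table_from_dataframe(chunk, TARGET)
            job.result()  # wait for each load job before the next chunk

    print("Table copied to BigQuery")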

Required Qualifications:

   7+ years of experience building and executing data engineering pipelines.

   6+ years of experience with Python.

   7+ years of experience with SQL.

   7+ years of hands-on experience with bash shell scripts, UNIX utilities, and UNIX commands.

   5+ years of hands-on experience with a major cloud platform: GCP, BigQuery, Cloud SQL, etc.

   5+ years of experience with various databases: Teradata, DB2, Oracle, SQL Server, etc.

   Healthcare experience and PBM systems knowledge preferred
