StaffAttract

Fabric Developer

SysMind Tech - Minneapolis, MN

Job Description

Role: MS Fabric Developer
Location: Minneapolis, MN (Onsite)
Job Type: Contract
Skills: Microsoft Fabric, Python, PySpark, ETL Processes, Data Integration, Azure

Job Responsibilities
• Understand requirements and perform data analysis.
• Set up Microsoft Fabric and its components.
• Build secure, scalable solutions across the Microsoft Fabric platform.
• Create and manage Lakehouses.
• Implement Data Factory processes for data ingestion, scalable ETL, and data integration.
• Design, implement, and manage comprehensive warehousing solutions for analytics using Fabric.
• Create and schedule data pipelines using Azure Data Factory.
• Build robust data solutions using Microsoft data engineering tools such as Notebooks, Lakehouses, and Spark applications.
• Build and automate deployment pipelines using CI/CD tools to release Fabric content from lower to higher environments.
• Set up and use Git for repository management and versioning of Fabric components.
• Create and manage Power BI reports and semantic models.
• Write and optimize complex SQL queries to extract and analyze data, ensuring efficient data processing and accurate reporting.

Job Qualifications (Mandatory)
• Bachelor's degree in computer science or a similar field, or equivalent work experience.
• 3+ years of experience working with Microsoft Fabric.
• Expertise working with OneLake, Lakehouse, Warehouse, and Notebooks.
• Strong understanding of Power BI reports and semantic models using Fabric.
• Proven record of building ETL and data solutions using Azure Data Factory.
• Strong understanding of data warehousing concepts and ETL processes.
• Hands-on experience building data warehouses in Fabric.
• Strong skills in Python and PySpark.
• Practical experience implementing Spark in Fabric, scheduling Spark jobs, and writing Spark SQL queries.
• Experience using Data Activator for effective data asset management and analytics.
• Ability to flex and adapt to different tools and technologies.
• Strong learning attitude.
• Good written and verbal communication skills.
• Demonstrated experience working in a team spread across multiple locations.
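As an illustration of the extract-transform-load and SQL-analysis work the responsibilities above describe, here is a minimal sketch in plain Python. It uses the standard library's sqlite3 as a stand-in for a warehouse, and the table and column names (`sales`, `customer`, `amount`) are hypothetical; a real Fabric pipeline would instead use PySpark notebooks, Lakehouse tables, and Data Factory as listed in the posting.

```python
import sqlite3

def run_etl(raw_rows, conn):
    """Tiny ETL sketch: transform raw records (clean + filter),
    load them into a warehouse-style table, then run an aggregate query."""
    # Transform: normalize customer names, drop rows with missing amounts
    cleaned = [
        (name.strip().title(), amount)
        for name, amount in raw_rows
        if amount is not None
    ]
    # Load: create the target table and insert the cleaned rows
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    conn.commit()
    # Analyze: the kind of aggregate SQL query the role calls for
    return conn.execute(
        "SELECT customer, SUM(amount) FROM sales "
        "GROUP BY customer ORDER BY customer"
    ).fetchall()

conn = sqlite3.connect(":memory:")
raw = [("  alice ", 10.0), ("BOB", 5.0), ("alice", None), ("Alice", 2.5)]
print(run_etl(raw, conn))  # → [('Alice', 12.5), ('Bob', 5.0)]
```

The same three stages (ingest, clean, aggregate) map onto the Fabric tools named above: Data Factory for ingestion, Spark notebooks for transformation, and the Warehouse for SQL analytics.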

Created: 2026-04-02

Designed, Developed and Maintained by: NextGen TechEdge Solutions Pvt. Ltd.