Qualifications
The ideal candidate should have exceptional verbal and written communication skills, thrive in a fast-paced and adaptable environment, and possess the initiative to quickly learn new tools.

Must Have:
- Bachelor's degree in Engineering with at least three (3) years of relevant experience, or a Master's degree in Engineering.
- In-depth expertise in data warehousing.
- Expert-level proficiency in ETL development using tools such as Informatica PowerCenter or SAP Data Services.
- Advanced SQL coding/querying skills in SQL Server and/or MySQL.
- Experience with Business Intelligence tools including Crystal Reports, Business Objects, Microsoft SSRS, or Oracle OBIEE.
- Familiarity with Amazon RDS and Redshift, and/or Google BigQuery.
- Experience in relational database design/modeling with tools like ER/Studio or ERwin.

Desired:
- Experience handling unstructured or semi-structured data.
- Knowledge of data privacy regulations and data governance is a plus.
About the job
The Data Warehouse Developer/Analyst will play a crucial role in enhancing our User Experience data analysis. By developing new Marts that support this analysis, the successful candidate will significantly influence our product features and improvements. This position encompasses the entire Mart lifecycle, including data profiling, design, development, debugging/testing, and ongoing support. Collaboration with the Product Development team and the Data Warehouse team within our IT department will be key.
Responsibilities:
The incumbent will:
- Profile and comprehend extensive source data, including structured and semi-structured/web activity data.
- Collaborate with data originators to address data collection gaps and refine source-system data structures for improved analysis and integration.
- Collect reporting and analysis requirements and translate them into data models for reporting structures, including aggregate tables, pivoted tables, and relational/dimensional (star-schema) marts.
- Employ data mining techniques for source data analysis to determine optimal reporting structures.
- Map sources to target designs using tools like Business Objects Data Services/BODI, and create ETL code to load and transform source data from various formats into a SQL database.
- Conduct performance tuning and troubleshooting for ETL processes, along with capacity estimation.
- Thoroughly test ETL code changes to ensure high-quality data output.
- Provide daily support and mentorship to end users engaging with the data.
About kgtiger2
About Us: We are a prominent $800 million company specializing in mathematical computing software, with over 3,000 employees worldwide. Our portfolio includes more than 90 well-recognized products used across sectors such as automotive, aerospace, defense, biotech, and semiconductor industries. Headquartered in Natick, MA, we also have branches in Bangalore, Delhi, and Pune, India.

About KGiSL: KGiSL Group of Companies is part of the KG Group, a $500 million premier industrial group in South India, with over 70 years of diversified focus in textile, engineering, healthcare, finance, IT & ITES services, infrastructure, and education. We are a CMM Level-4 company.