No. of Positions: 2
Experience: 8 to 12 years
Salary: Competitive as per market standards
Notice period: Immediate to 30 days
Skills: Azure, ADLS, Kafka, Apache Delta, Databricks/Spark, Hadoop ecosystem, SQL, RDBMS, Data Lakes and Warehouses
Location: Delhi NCR/Pune/Bangalore/Chennai/Hyderabad/Remote
Role & Responsibilities
- Develop and/or improve scripts supporting Data Warehouse and BI processes in accordance with requirements.
- Provide support for Data Warehouse and BI processes (a combination of Linux shell scripts, Oracle SQL, SSIS packages, and Power BI).
- Translate requirements into well-designed and documented software components based on Data Warehouse technologies, in collaboration with the onsite team.
- Produce standard documentation and adhere to standard development procedures.
- Recommend and assist with short-term architectural decisions as necessary, as well as those with longer-term effects on the overall architecture.
- Engage with development leads, network, security, and application owners to assist with troubleshooting and to ensure adherence to standards, security considerations, and integration requirements.
Critical Skills to Have:
- 3+ years of RDBMS architectural experience
- 3–5 years of Oracle SQL/PL-SQL programming experience
- 3–5 years of Shell scripting experience on Unix/Linux
- Strong grasp of ETL processes and data warehousing best practices
- Strong documentation skills
- Experience working in Windows and Linux environments
- Experience with Snowflake preferred
- 3-5 years of Python and/or R scripting experience
- 1-2 years of development experience with Power BI
- Database design and dimensional modeling skills
- Data profiling and data cleaning skills; statistical modeling concepts and automation
- Exceptionally driven and quick to pick up new skills.
- Effective communication both in writing and orally
- Proficiency in performance tuning and troubleshooting
- 5+ years of experience in information technology
- General understanding of multiple software platforms and development technologies
- Experience with SQL, RDBMS, Data Lakes, and Warehouses; knowledge of the Hadoop ecosystem, Azure, ADLS, Kafka, Apache Delta, and Databricks/Spark
- Knowledge of a data modeling tool such as ER/Studio or Erwin is advantageous
- Experience collaborating with Product Managers, Technology teams, and Business Partners
- Strong familiarity with Agile and DevOps methodologies
Preferred Qualifications:
A bachelor’s degree in Business Information Technology, Computer Science, or a similar discipline.
Please apply with your resume online, and the digitalxnode evaluation team will reach out to you if your profile is shortlisted. We will keep your data in our repository, and our team may contact you about other positions.