Careers

BI Developer
Contract Position - Pipeline
Requirements:
- BSc/BA in Computer Science, Informatics or a relevant field.
- 5 to 10 years of solid, proven experience as a BI Developer.
- Industry experience is preferred.
- Preferably has consulting experience.
- Background in data warehouse design (e.g. dimensional modelling) and data mining.
- Has built a data warehouse previously.
- In-depth understanding of database management systems, online analytical processing (OLAP) and the ETL (extract, transform, load) framework.
- Knowledge of SQL queries, SQL Server Reporting Services (SSRS), SQL Server Integration Services (SSIS) and SSAS.
- Proven ability to take initiative and be innovative.
- Analytical mind with a problem-solving aptitude.
- Tech required:
  - Back end: SSAS, SSIS, SSRS, SQL, Python, Oracle and Microsoft.
  - Front end: Microsoft Power BI (required), QlikView or Tableau.

SQL Developer
Contract Position - Pipeline
Requirements:
- 5+ years of development experience.
- Proficient to expert with SQL Server Reporting Services.
- Proficient to expert with SQL Server T-SQL.
- Proficient with SQL Server SSIS packages.
- Proficient to expert with Unix shell scripting.
- Experience in SQL Server Analysis Services would be an advantage.
- Proficient with Riskwatch development.
- Needs to be versatile and able to upskill on any technology if required.
- Experience designing, developing and supporting software solutions in a corporate environment will be advantageous.

Technical Business Analyst
Contract Position - Pipeline
Requirements:
- Bachelor's degree in Computer Science, Information Systems, Business Administration, or a related field.
- Certification in Business Analysis (e.g., CBAP, CCBA, BCS BA Diploma) or Agile (e.g., PSPO, CSM).
- 5-7 years of progressive experience as a Business Analyst, with a significant emphasis on technical projects (e.g., software development, system integrations, data solutions, digital transformation).
- Proven ability to elicit, analyse, document, and manage requirements using various methodologies (Agile user stories, use cases, functional specifications).
- Strong analytical capabilities to understand complex business problems and translate them into actionable technical solutions.
- Solid understanding of software development lifecycles (SDLC) and methodologies (Agile, Scrum, Waterfall).
- Ability to read and understand technical specifications, database schemas, and API documentation.
- Exceptional verbal and written communication skills, with the ability to bridge the communication gap between technical and non-technical stakeholders. Strong presentation and facilitation skills.
- Proficiency in creating process flow diagrams (e.g., BPMN, UML activity diagrams).
- Experience with requirements management tools (e.g., Jira, Azure DevOps, Confluence) and diagramming tools (e.g., Visio, Miro, Lucidchart).
- Basic SQL querying skills for data analysis and validation.
- Familiarity with cloud platforms (e.g., Azure, AWS, GCP) and cloud-native application development.
- Experience with API design and documentation.
- Knowledge of UX/UI principles and the ability to create wireframes or mockups.

DevOps Engineer
Contract Position - Pipeline
Requirements:
- Bachelor's degree in Computer Science or a similar field.
- A master's degree in a related field is preferred.
- Expert in programming languages such as Python, Scala and R.
- Minimum of 5 years' experience with Azure cloud infrastructure setup and Infrastructure as Code (IaC).
- 3-5 years of experience in an Azure Cloud DevOps engineering role using Bicep.
- Experience with various DevOps concepts, tools, and technologies.
- Deep understanding of testing frameworks and automated pipelining with Git integration.
- Understanding and application of Big Data and distributed computing principles (Hadoop and MapReduce).
- Strong analytical and statistical knowledge with an understanding of the latest machine learning algorithms for both structured and unstructured data.
- Strong ability to communicate findings and recommendations from data (visual, verbal and written).
- DevOps/DataOps/MLOps and CI/CD experience.
- Proficient in data manipulation, including SQL and ETL processes.
- Experience in interactive data exploration and data-driven storytelling.

Power BI Developer
Contract Position - Pipeline
Requirements:
- Bachelor's or Honours degree in Computer Science, Information Systems, Data Analytics, or a related quantitative field.
- Proficient in Power BI Desktop (data modelling, relationships, calculated columns/tables).
- Strong experience with Power BI Service administration, deployment pipelines, dataflows, gateways, and app publishing.
- Solid understanding of Power BI Premium capacities (including features such as paginated reports, AI visuals, and deployment pipelines).
- Experienced in tools and systems on the MS SQL Server BI stack, including SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Familiarity with Python or R for data manipulation or statistical analysis within a BI context.
- Ability to translate business requirements into functional specifications for reporting applications.
- Advanced SQL skills for data extraction, manipulation, and performance tuning (T-SQL preferred).
- Deep understanding of data warehousing concepts (dimensional modelling, star/snowflake schemas, ETL/ELT principles).

Data Engineer
Contract Position - Pipeline
Requirements:
- Bachelor's or Honours degree in Computer Science, Data Engineering, Information Systems, or a closely related technical field.
- 5-10 years of progressive, hands-on experience in Data Engineering, with a proven track record of designing and implementing scalable data solutions.
- Expert in programming languages such as R, Python, Scala and Java.
- Expert database knowledge in SQL and experience with MS Azure tools such as Data Factory, Synapse Analytics, Data Lake, Databricks, Azure Stream Analytics and Power BI.
- Exposure to AI or model development.
- Experience working on large and complex datasets.
- Understanding and application of Big Data and distributed computing principles (Hadoop and MapReduce).
- ML model optimisation skills in a production environment.
- DevOps/DataOps and CI/CD experience.

Data Modeler
Contract Position - Pipeline
Requirements:
- Bachelor's or Honours degree in Computer Science, Information Systems, Data Analytics, or a related quantitative field.
- 5-10 years of dedicated experience in data modelling, with a proven track record of designing and implementing complex data models for large-scale data warehousing and analytical solutions.
- Expert Data Modelling Methodologies: Deep, hands-on expertise in at least two of the following:
  - Dimensional Modelling (Kimball methodology): star schemas, snowflake schemas, facts, dimensions, slowly changing dimensions (SCDs).
  - Data Vault 2.0: hubs, links, satellites, raw vault, business vault.
  - Third Normal Form (3NF) for operational data stores.
- SQL Mastery: Advanced SQL proficiency for data analysis, profiling, validation, and understanding underlying database structures.
- Experience modelling for Azure-native analytical platforms such as Azure Synapse Analytics (dedicated SQL pools, serverless SQL pools).
- Understanding of data modelling considerations for Azure Databricks (Delta Lake, Medallion Architecture) and Azure Data Lake Storage Gen2.
- Familiarity with Azure Analysis Services (AAS) or Power BI Premium semantic models.
- Data Modelling Tools: Proficiency with professional data modelling tools (e.g., ER/Studio, PowerDesigner, or similar industry-recognised tools).
- Attention to Detail: Meticulous attention to detail and a commitment to data accuracy and integrity.

Business Intelligence Business Analyst
Contract Position - Pipeline
Requirements:
- Bachelor's or Honours degree in Business Analysis, Information Systems, Business Intelligence, Computer Science, or a related field.
- 5-7 years of dedicated experience as a Business Analyst, with a significant focus on Business Intelligence, Data Warehousing, or Data Analytics projects.
- Proven expertise in requirements elicitation, analysis, documentation (e.g., user stories, functional specifications, data dictionaries), and validation.
- Solid understanding of data warehousing concepts (dimensional modelling, ETL/ELT processes) and relational database principles.
- SQL Proficiency: Intermediate to advanced SQL skills for data querying, analysis, and validation.
- Hands-on experience working with or specifying requirements for BI tools such as Microsoft Power BI (highly preferred), Tableau, Qlik Sense, etc., and an understanding of what these tools can deliver.

Generative AI Developer
Contract Position - Pipeline
Requirements:
- Experience with serverless architectures and containerisation (Docker, Kubernetes).
- Familiarity with MLOps and AI model monitoring.
- Knowledge of RAG (Retrieval-Augmented Generation) implementations.
- Strong experience with LLMs (e.g., OpenAI, Anthropic).
- Hands-on experience with Python and frameworks like LangChain, Hugging Face, PyTorch, or TensorFlow.
- Proven experience deploying AI apps on Azure, AWS, or GCP.
- Experience with cloud AI services, vector databases, and prompt engineering.
- Strong problem-solving skills and the ability to work in an agile environment.
Responsibilities:
- Design, develop, and optimise LLM-powered apps and AI agents.
- Implement AI solutions on cloud platforms (Azure, AWS, or GCP).
- Use LangChain and Python libraries to enhance AI capabilities.
- Fine-tune and deploy LLMs for specific business needs.
- Develop and integrate APIs and pipelines for AI applications.
- Collaborate with teams to create scalable and production-ready AI solutions.
- Ensure the performance, reliability and security of AI solutions.

Data Architect
Contract Position - Pipeline
Requirements:
- Bachelor's or Honours degree in Computer Science, Data Engineering, Information Systems, or a related technical field.
- 10+ years of progressive experience in data management, data engineering, or solution architecture, with at least 5 years in a dedicated Data Architect or Lead Data Architect role designing complex enterprise data platforms.
- Expert-level architectural and hands-on experience with a broad range of Azure data services.
- Proficiency in designing Lakehouse architectures leveraging ADLS Gen2, Databricks (Delta Lake), and Synapse Analytics.
- Mastery of data warehousing concepts (Kimball, Inmon, Data Vault 2.0) and their practical application in cloud environments.
- Experience with real-time/streaming data architectures using Azure Event Hubs/Stream Analytics.
- Strong understanding of data integration patterns and tools (ADF, custom code).
- Expert in conceptual, logical, and physical data modelling techniques (Dimensional, Data Vault, 3NF).
- Strong proficiency in SQL and at least one relevant programming language (e.g., Python, Scala) for architectural validation or proof-of-concept development.
- Deep knowledge of data governance principles, metadata management, data lineage, and cloud security best practices for data.
- Proven ability to architect for performance, scalability, and cost optimisation in a cloud environment.
- Understanding of broader enterprise architecture principles (TOGAF, Zachman) and how data architecture fits into the overall enterprise landscape.
- Exceptional communication, presentation, and interpersonal skills, with the ability to influence and persuade technical and non-technical audiences, including senior leadership.