Application and service software development specifically for creating, onboarding and consuming data science workloads.

Applications and Services

Once the Machine Learning (ML) or Deep Learning (DL) analysis work produces good results, there is always a need to integrate these capabilities into new or existing systems. We can deploy these algorithms as web services (in various formats) to make application integration easy: incorporating advanced analytics into an existing application can be as simple as making a web service call.
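As a minimal sketch of the idea, using only the Python standard library and a hypothetical placeholder scoring function standing in for a real trained model, a JSON prediction endpoint might look like this:

```python
import json

def predict(features):
    # Placeholder scoring logic standing in for a real trained ML/DL model;
    # the weights here are illustrative, not from an actual model.
    weights = [0.4, 0.6]
    return sum(w * x for w, x in zip(weights, features))

def application(environ, start_response):
    # WSGI entry point: read a JSON body like {"features": [...]},
    # score it, and return the result as JSON.
    size = int(environ.get("CONTENT_LENGTH") or 0)
    body = json.loads(environ["wsgi.input"].read(size) or b"{}")
    payload = json.dumps({"score": predict(body.get("features", []))}).encode("utf-8")
    start_response("200 OK", [("Content-Type", "application/json")])
    return [payload]

# To serve for real (this call blocks):
#   wsgiref.simple_server.make_server("", 8000, application).serve_forever()
```

Any legacy application can then integrate by POSTing feature values to the endpoint, with no knowledge of the model internals.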

When the entire Data Science framework is considered, we offer development services for components and services beyond the 'Analyse' phase. For example, very useful services such as ingestion pipelines, data pre-processors, storage abstraction and virtualisation, and visualisation dashboards and web applications can be built to empower end users with the best results presented in the best way. These components require a level of knowledge about the Data Science process that some in-house teams don't have yet.

For the services mentioned above, we follow the micro-services architecture model when designing applications, whether for the cloud or for on-premise hosting. A micro-services architecture breaks the application logic into smaller pieces with specific jobs that we can develop and test separately before deploying to the production environment, ensuring maximum system stability. In all these cases we build the automated testing, continuous integration (CI) and continuous deployment (CD) pipeline needed to achieve production-level quality.
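To illustrate the principle (with purely hypothetical service names and logic), each piece owns one job and carries its own automated tests, which a CI pipeline can run on every commit before the CD stage promotes the build:

```python
def ingest(raw_records):
    """Ingestion service: drop malformed records that lack a 'value' field."""
    return [r for r in raw_records if "value" in r]

def transform(records):
    """Pre-processing service: normalise values into the 0..1 range."""
    values = [r["value"] for r in records]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero when all values are equal
    return [(v - lo) / span for v in values]

# Each service is unit-tested in isolation; no piece needs the others
# running in order to be verified.
def test_ingest_filters_malformed():
    assert ingest([{"value": 3}, {"bad": 1}]) == [{"value": 3}]

def test_transform_normalises():
    assert transform([{"value": 0}, {"value": 10}]) == [0.0, 1.0]
```

Because each piece is small and independently verified, a failure in one service's test suite blocks only that deployment, not the whole system.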

Our preference is to develop applications and services using Python, C# (.NET Core), HTML5, CSS3 and Angular. We use a variety of DL engines, such as TensorFlow and CNTK, with Keras as an interface.

Our cloud platform of choice is Microsoft Azure, but we can support services and containers on other platforms as well. Azure offers useful services for creating a powerful micro-services based solution: Functions, Container Instances, Queue Storage, HDInsight, Cosmos DB and more provide the building blocks needed for secure and resilient applications and services.

User Experience (UX) and User Interface (UI)

Powerful visualisation tools are available to display analytics results and support decision-making by revealing trends, clusters or classifications. When these results are wrapped in poorly designed applications, a large part of the message can be lost to a confusing interface or poor functional implementation. This is where User Experience and User Interface skills and design make a big difference.

UI design focuses not only on presenting the data and insights in a pleasing way, but also on giving the application a coherent graphic design. Design choices such as those listed below significantly change what the human eye sees as most important:

  • application purpose by design
  • colour preference
  • logical grouping
  • size of information containers

UX design, on the other hand, takes the UI design and considers the functional interactions users will have with the application. Even great-looking applications can be a pain to use: they might take too long to load, fail to show progress while long-running tasks are busy in the background, or demand strange keyboard, mouse or touch combinations. These issues easily lead to a frustrating user experience that undermines the usefulness of the application, even if it is perfectly functional otherwise.

We have very specific skills to ensure that beauty and delight are part of the user experience. Our team combines graphic design, interface layout, functional design and eye-tracking skills to build good applications, and then has actual users test the interactions to ensure we meet real needs.

Data Science Workloads

The demands on traditional systems and processes can change significantly when Data Science is introduced alongside legacy applications. Data Science workloads tend to have very different requirements compared to how private data centres and networks were designed a couple of years ago. The changes can be grouped as follows:

  • large scale storage requirements due to the demands of big data for capacity as well as speed of delivery
  • required processing and memory for the training of machine learning (ML) and deep learning (DL) models
  • hosting the trained models so that secure and speedy connections can be made by legacy applications for integration

Meeting these requirements often calls for at least a partial cloud migration. For example, the training process of a DL model requires significant processing power and memory to be available in close proximity to the data. Most data centres were never planned to hold that much additional processing as spare capacity. The cloud, on the other hand, can provide this capacity in an agile way: you pay only while it is in use, and it is decommissioned afterwards.
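The pay-per-use argument can be made concrete with a back-of-envelope calculation. All figures below are hypothetical placeholders, not real cloud or hardware quotes:

```python
# Hypothetical figures for illustration only.
ON_PREM_GPU_SERVER = 20_000.0   # one-off purchase price of a GPU server
CLOUD_GPU_PER_HOUR = 3.0        # on-demand hourly rate for a comparable VM

def break_even_hours(capex, hourly_rate):
    """Training hours at which owning the hardware starts paying off."""
    return capex / hourly_rate

hours = break_even_hours(ON_PREM_GPU_SERVER, CLOUD_GPU_PER_HOUR)
# Below this many GPU-hours of training, renting on demand is cheaper
# than committing capital to hardware that sits idle between runs.
```

Since DL training workloads are typically bursty rather than continuous, many teams stay well below the break-even point, which is why the agile pay-per-use model is attractive.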

We specialise in the architecture, construction and development of solutions that bring our clients' existing infrastructure and cloud capabilities together as a single logical service or application. Such solutions can span multiple sites while delivering the flexibility and scalability required.