So You are Building an AI Assistant?

So you are building an AI assistant for your business?

This is a popular topic in companies these days. Everybody seems to be doing it.

While running AI research over the past months, I have discovered that many companies in the USA and Europe are building some sort of AI assistant, mostly around enterprise workflow automation and knowledge bases.

There are common patterns in how such projects play out most of the time. So let me tell you a story. It could help us avoid known pitfalls, save some time, and ship the product faster.
 

Building an AI assistant

An AI assistant project at a company usually starts with an idea: “Let’s build a business assistant or copilot to solve problem X”. The problems usually require:

  • Answering questions based on the corporate knowledge bases

  • Understanding tables in business reports

  • Being able to run marketing research on the Internet

  • Working locally with sensitive data (e.g. banking, insurance or anything that goes into the data rooms)

Based on internet research, engineering teams decide that current AI technology is mature enough. Surely, there are enough good success stories out there.

They pick a tool like:

  • LangChain - “Applications that can reason. Powered by LangChain… From startups to global enterprises, ambitious builders choose LangChain.”

  • LlamaIndex - “Turn your enterprise data into production-ready LLM applications”

  • QDrant - “powering thousands of innovative AI solutions at leading companies”

There is even an official NVIDIA project, Chat with RTX, that lets us upload our documents and start chatting with an assistant.

These are fun times that go by fast. Teams pick their tools and start working steadily, exchanging new terms and discoveries: vector databases, agents, text chunking and computing embeddings, function calling, tools and Chains of Thought.

The first results of the assistant are really good - it is capable of providing knowledgeable and insightful answers in most cases. This seems to prove that the prototype is good enough to move into proper development and deployment.

A few months down the road, the system is deployed. It still works well in most cases. But that’s where people start realising that something is not quite right.

You see, the system sometimes also hallucinates. It hallucinated a bit during testing, but after the deployment to real users, things get worse.

The more specific the question, the higher the chance that the answer will be nonsense. For some reason these specific questions were not asked before - not until the end users started playing with the system.

End users will be asking very specific questions about their daily jobs - questions they already know the answers to. Some people will even go to great lengths to prove that AI doesn’t work at all. Fear of losing jobs to AI has always been an important factor.

Hallucinations and mistakes in AI are never a good thing. What is the point of using a system that you can’t trust?

Fortunately, public resources provide many ways to improve the accuracy of a misbehaving AI assistant (one of them is sketched in code below this list):

  • Better prompting

  • Tweaking retrieval and fine-tuning embeddings

  • Query transformations (expansion)

  • Chunk reranking

  • Adding multiple agents and building routers
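
To make these tweaks a bit more concrete, here is a minimal, self-contained sketch of one of them - chunk reranking - in Python. It rescores already-retrieved chunks by how many query terms they contain. This is only an illustration under simplified assumptions: real systems typically rerank with a cross-encoder model or an LLM-based scorer, and the function names here are made up for the example.

```python
# A minimal sketch of chunk reranking (illustrative only).
# Assumption: retrieval has already returned candidate chunks; we rescore
# them by the fraction of query terms each chunk contains. Real systems
# usually use a cross-encoder model or an LLM-based scorer instead.
import string
from typing import List, Tuple

def _terms(text: str) -> set:
    return {w.strip(string.punctuation) for w in text.lower().split()}

def rerank(query: str, chunks: List[str], top_k: int = 3) -> List[Tuple[float, str]]:
    query_terms = _terms(query)
    scored = []
    for chunk in chunks:
        overlap = len(query_terms & _terms(chunk)) / max(len(query_terms), 1)
        scored.append((overlap, chunk))
    return sorted(scored, key=lambda pair: pair[0], reverse=True)[:top_k]

if __name__ == "__main__":
    chunks = [
        "Travel expenses must be approved by the department head.",
        "The cafeteria opens at 8 am.",
        "Expense reports are due by the 5th of each month.",
    ]
    for score, chunk in rerank("When are expense reports due?", chunks):
        print(f"{score:.2f}  {chunk}")
```

Each item on the list above is a variation of the same move: bolting another component onto the pipeline in the hope of squeezing out a few more percent of accuracy.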

In the end, the architecture of an AI assistant can grow as complex as the one below. Yet it still will not always be able to answer simple questions. It will lack accuracy and hallucinate from time to time.

 

Image from: https://towardsdatascience.com/12-rag-pain-points-and-proposed-solutions-43709939a28c

So what should the teams do at this point? So much effort has already been invested, and the accuracy is almost there. Surely we just need to fine-tune a proper LLM and everything will finally work.

At this point, I’m not willing to go any further down this rabbit hole, even in a story. The ending is not a happy one.

Been there, done that. I have also seen many companies follow this path.

Fortunately, this is just a story. So we can ask the question “What went wrong?”, rewind the clock a few months, and do things differently.

 

What went wrong and at which point?

Things went wrong at the very start. There are three assumptions that are commonly misjudged:

  • Public materials and articles tell the whole truth about how state-of-the-art AI systems actually work in business.

  • If we shred our documents into small chunks and put them into a vector database, AI will magically be able to make sense of them.

  • We need to build an omnipotent AI assistant with a complex architecture in order to bring value to the business.

In my experience, all of these are wrong. These assumptions may be useful if we want to mislead our competition into a deep rabbit hole. But ultimately they are wrong.

We can significantly reduce the risk by testing and verifying these assumptions before committing to a big project. Also by:

  • talking to the end users first;

  • asking questions and gathering data beforehand;

  • designing a system that is built to capture feedback and learn even after the first deployment.
     

Story with a happy ending

Here is one commonly repeated path to a successfully deployed AI assistant:

  1. Start by asking questions about the business. Identify process steps that could potentially be automated, based on existing success stories. Prioritise cases that represent the “lowest-hanging fruit”.

  2. Talk to the process stakeholders and gain insights into how things really work. Look for specific, repeatable patterns. ChatGPT/LLMs are good at working with patterns.

  3. Establish a quality metric before writing a single line of code. For example, come up with a list of 40 questions that the system should be able to answer correctly. Identify the correct answers beforehand. This will be our first evaluation dataset (a minimal evaluation harness is sketched right after this list).

  4. Give a list of 20 questions with answers to the developers and ask them to build a prototype. Don’t ask for a fancy UI at this point - it would just be a waste of time. A Jupyter notebook with a Gradio interface that the team can show over a screen share is enough.

  5. Ideally, the developers will cluster the questions into categories. Then they will figure out a pipeline that prepares documents in such a way that a ChatGPT prompt for each question category can produce a correct answer.

  6. Once developers are satisfied with the quality, test the system prototype on the remaining 20 questions. This will be a wake-up call for many. Count the number of mistakes.

  7. Now give the developers a little time to adjust the system so that it can handle these 20 questions. Measure the time it takes to adjust the system.

  8. Give the developers 20 more questions they haven’t seen. Measure the time it takes them to handle these, and count the number of mistakes after the iteration.

Repeat the last step a couple of times, still without investing in a complex UI and user experience (these would just slow us down).
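
Here is what the evaluation harness behind steps 3-8 could look like in its simplest form. This is a sketch under assumptions: assistant_answer() is a hypothetical wrapper around the prototype, judge() is a hypothetical checker (in practice an exact match, a keyword check, or an LLM-as-judge prompt), and eval_set.json is the hand-built question/answer file.

```python
# A minimal sketch of the measurement loop from steps 3-8 (illustrative only).
# Assumptions: assistant_answer() wraps the prototype being tested, judge()
# decides whether an answer matches the expected one, and eval_set.json holds
# the hand-built question/answer pairs.
import json
import time

def evaluate(eval_set, assistant_answer, judge):
    mistakes = []
    started = time.time()
    for item in eval_set:
        answer = assistant_answer(item["question"])
        if not judge(answer, item["expected"]):
            mistakes.append({"question": item["question"], "got": answer})
    return {
        "total": len(eval_set),
        "mistakes": len(mistakes),
        "accuracy": 1 - len(mistakes) / max(len(eval_set), 1),
        "wall_time_sec": round(time.time() - started, 1),
        "failed": mistakes,
    }

if __name__ == "__main__":
    with open("eval_set.json") as f:  # the held-out question/answer pairs
        eval_set = json.load(f)
    report = evaluate(
        eval_set,
        assistant_answer=lambda q: "stub answer",  # plug in the prototype here
        judge=lambda got, expected: expected.lower() in got.lower(),
    )
    print(json.dumps(report, indent=2))
```

Whatever the implementation, the point is the habit: every iteration produces two numbers - how many questions the system answers correctly, and how long the fixes took.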

At this point we should see a trend: with each new iteration, the number of correctly answered questions increases, while the time it takes to make the system “learn” the new cases stays the same or even decreases.

If the numbers don’t follow this trend (a frequent case with RAG systems based on vector databases), then it is time to close down the prototype.

If the trend persists, we can carry it into the future: build a better user experience, deploy the system to the end users, but tell the users this:
 

The system is designed to learn over time. So it is very important for us that you rate the answers of the system as correct or incorrect. We will incorporate the feedback into the underlying architecture in a semi-automatic way. This way, with each passing iteration, the system will understand the specifics of your business much better.

 

Under the hood, your teams will continue following the same pattern they have used so far:

  1. Take newly rated questions and add them to the evaluation dataset (see the sketch after this list).

  2. Figure out why the AI assistant doesn’t handle these questions properly. Restructure prompts and data preparation until the overall quality improves.

  3. Test new versions against the entire evaluation dataset. If things look good, deploy them to the end users to gather more data and edge cases.

  4. Repeat.
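
Step 1 can be as simple as folding rated answers back into the evaluation file. The sketch below assumes hypothetical field names (question, answer, rating) and file names (eval_set.json, feedback.json); the real format depends on how your UI captures the ratings.

```python
# A minimal sketch of merging user feedback into the evaluation dataset
# (illustrative only; field and file names are assumptions).
import json

def merge_feedback(eval_path: str, feedback_path: str) -> int:
    with open(eval_path) as f:
        eval_set = json.load(f)
    known = {item["question"] for item in eval_set}

    with open(feedback_path) as f:
        feedback = json.load(f)

    added = 0
    for record in feedback:
        # Only answers the users explicitly marked as correct become new
        # ground truth; incorrect ones still need a human-provided answer.
        if record["rating"] == "correct" and record["question"] not in known:
            eval_set.append({"question": record["question"], "expected": record["answer"]})
            known.add(record["question"])
            added += 1

    with open(eval_path, "w") as f:
        json.dump(eval_set, f, indent=2)
    return added

if __name__ == "__main__":
    print(f"added {merge_feedback('eval_set.json', 'feedback.json')} new eval cases")
```

Answers flagged as incorrect still need a human to supply the expected answer before they become useful evaluation cases.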

At some point in this process, the teams might gather so much feedback that they can experiment with fine-tuning models or even training their own. At that point, it will just be a controlled experiment: “Can we make the AI assistant better by training a custom model?”

If the numbers say otherwise, we can go back to the original old-school process that actually works.
 

Summary

There is a lot of hype around AI and AI assistants these days. This hype is partially fuelled by the companies selling shovels in this gold rush. All solution providers are partially incentivised to keep alive the problems to which they sell solutions.

This situation can waste time and energy.

You can save time and sometimes even ship AI/GPT products faster if you:

  • Treat everything with a grain of salt, including this blog post.

  • Learn from the mistakes and successes of others before starting projects.

  • Start by gathering evaluation datasets from the end users of the actual AI systems.

  • Use customer feedback to measure the quality of the AI system from day one. Use it to drive the development.

  • Abort and reconsider the project if the quality doesn’t improve over time.

 

