How to gather data from Miro

In this blog post you will learn about ways to automatically gather, modify and analyse Miro boards.

Miro is an online collaborative whiteboard platform that we use heavily in our consulting workshops to collaborate, explore and document.

Why we wanted to gather data from Miro

In one of our Innovation Incubator rounds, the idea was born to find ways to analyse Big Picture Event Storms and their evolution over time as part of the workshop's post-processing. Wouldn't it be helpful if you could create a replay cockpit where you can replay the whole workshop and watch how the event storm evolved over time?

Additionally, we needed to export board data into our statistical analyzers written in Python. Basic data such as stickers and their history is needed to generate word clouds and to perform lingual analysis and activity analysis of workshop participants.

So the idea was born to check out Miro's options for gathering data and to develop ideas for how this data can be analysed.

Options to access Miro Data

Miro provides two ways of accessing data: you can either use the Miro REST API or the Miro Web Plugin SDK. The REST API provides easy access to basic board data but is not yet comprehensive, whereas the Miro SDK provides more capabilities.

Retrieving Data

The REST API is the simplest way to retrieve data from Miro. The following preparatory steps need to be executed to access the API (a detailed description can be found here):

1. Create a developer team.

2. Create an app within your organization: go to User -> Profile Settings -> Your Apps.

Once you have gathered your personal access (bearer) token, the first request can be sent to Miro on behalf of your Miro user. You can access any resource your Miro user has access to, limited by the app permission scope (the intersection of both permissions). It is also possible to send requests on behalf of other users; however, this requires setting up an OAuth 2.0 flow. In the following Python sample request we retrieve all widgets from a board.

 url = "https://api.miro.com/v1/boards/" + board_id + "/widgets"

    headers = {
        "Accept": "application/json",
        "Content-Type": "application/json",
        "Authorization": bearer
    }

    response = requests.request("GET", url, headers=headers)

This sample response contains JSON describing all the widgets on a board. As you can see, a lot of styling and positioning information is delivered. Everything within a frame is embedded as a sub-collection within that frame.

{
  "type": "collection",
  "data": [
    {
      "type": "sticker",
      "scale": 0.54,
      "y": -131.5,
      "x": 250.0,
      "text": "<p>t</p>",
      "style": {
        "backgroundColor": "#fff9b1",
        "fontAutoSize": true,
        "fontFamily": "OpenSans",
        "fontSize": 64.0,
        "lineHeight": 1.36,
        "textAlign": "center",
        "textAlignVertical": "middle"
      },
      "id": 3074457345618258749,
      "createdBy": {
        "type": "user",
        "name": "igor",
        "id": "3074457345618258730"
      },
      "modifiedAt": "2019-04-06T14:19:35Z",
      "modifiedBy": {
        "type": "user",
        "name": "igor",
        "id": "3074457345618258730"
      },
      "createdAt": "2019-04-06T14:17:04Z"
    }
  ]
}
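
To feed a response like this into the Python analyzers mentioned above, the sticker texts and author metadata can be extracted from the collection. The following is a minimal sketch that continues from the sample request above; the extract_stickers helper and the HTML stripping are illustrative, not part of the Miro API:

import re

def extract_stickers(collection):
    """Pull plain sticker text and author metadata out of a widgets response."""
    stickers = []
    for widget in collection["data"]:
        if widget["type"] != "sticker":
            continue
        # sticker texts are delivered as HTML fragments such as "<p>t</p>"
        plain_text = re.sub(r"<[^>]+>", "", widget.get("text", ""))
        stickers.append({
            "text": plain_text,
            "created_by": widget["createdBy"]["name"],
            "created_at": widget["createdAt"],
            "modified_at": widget["modifiedAt"],
        })
    return stickers

stickers = extract_stickers(response.json())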

Limits and Observations

  • You can retrieve a maximum of 1,000 widgets, and pagination is not supported. If you have more elements on your board, you could filter by widget type and send a separate request for each type (see the sketch after this list). If you use a huge board to collect the outcome of several workshops, this limit can become a hurdle.

  • You can retrieve the createdBy and modifiedBy metadata; however, modifiedBy only describes the last modification. There is no way to retrieve the full history of a widget.

  • Updating the app scopes had no effect. I needed to delete and then recreate the app to define new scopes.
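
The following is a minimal sketch of the per-type workaround mentioned above. It assumes the widgetType query parameter of the v1 widgets endpoint; the widget type list and the function name are illustrative:

import requests

WIDGET_TYPES = ["sticker", "shape", "text", "line", "frame"]  # adjust to your board

def get_widgets_by_type(board_id, bearer):
    """Send one request per widget type to stay below the 1,000-widget limit."""
    headers = {"Accept": "application/json", "Authorization": bearer}
    widgets = []
    for widget_type in WIDGET_TYPES:
        response = requests.get(
            "https://api.miro.com/v1/boards/" + board_id + "/widgets",
            headers=headers,
            params={"widgetType": widget_type},  # assumed filter parameter of the v1 API
        )
        response.raise_for_status()
        widgets.extend(response.json()["data"])
    return widgets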

Modify Data

The REST API provides methods to create and modify boards. For our replay cockpit, we first need to retrieve all elements and then replay them in the correct order. My assumption that everything I retrieve can simply be posted again to another board was refuted. The following limitations have been observed:

  • Deleting boards is not possible. During development this means you cannot create boards and automatically delete them after each iteration. My workaround was to delete all widgets in a cleanup step, with the side effect that the board crashed after several iterations (probably too many elements in the history).

  • Styles: not every style property can be set in a POST request, even if it was delivered by the GET request.

  • Colors are returned in an 8-digit format (RGB plus opacity), but write requests accept only the 6-digit format (RGB).

  • Only 12 colors are supported for stickers, yet on the board you can assign custom colors to stickers. I therefore wrote a color distance calculator to map a custom color to the nearest of these 12 (see the sketch below).
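
Here is a minimal sketch of such a color distance calculator, using Euclidean distance in RGB space and dropping the opacity part of 8-digit colors. The palette below is illustrative; it is not necessarily the exact set of 12 sticker colors Miro accepts:

# illustrative palette; replace with the 12 colors the API actually accepts
STICKER_PALETTE = [
    "#f5f6f8", "#fff9b1", "#f5d128", "#d0e17a", "#d5f692", "#a6ccf5",
    "#67c6c0", "#23bfe7", "#ff9d48", "#ea94bb", "#f16c7f", "#b384bb",
]

def hex_to_rgb(value):
    """Convert '#rrggbb' or '#rrggbbaa' to an (r, g, b) tuple, ignoring opacity."""
    value = value.lstrip("#")[:6]
    return tuple(int(value[i:i + 2], 16) for i in (0, 2, 4))

def nearest_sticker_color(custom_color):
    """Map a custom board color to the closest color of the supported palette."""
    r, g, b = hex_to_rgb(custom_color)
    def distance(candidate):
        cr, cg, cb = hex_to_rgb(candidate)
        return (r - cr) ** 2 + (g - cg) ** 2 + (b - cb) ** 2
    return min(STICKER_PALETTE, key=distance)

print(nearest_sticker_color("#ffd02fcc"))  # 8-digit color from a GET response -> "#f5d128"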

Honestly, I do not understand some of these limitations, and it took several runs and hacks to copy the content from one board to another.

Replay Architecture

I have set up a Vue.js application with a Python backend. The backend exists for convenience and to prevent CORS issues. The user can select a board; all board content is loaded chronologically into a slider, and the user can navigate through the single steps. Once the autoplay feature is enabled, you can go to the replay Miro board and enjoy the replay show. For each replay step either a create-widget REST request is sent, or a delete-widget request if you navigate backwards (a simplified sketch of a single step follows below).
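
A simplified sketch of a single replay step, assuming the v1 endpoints for creating and deleting individual widgets; the function name and the reduced payload are illustrative:

import requests

BASE_URL = "https://api.miro.com/v1"

def replay_step(board_id, bearer, widget, forward=True):
    """Apply one replay step: create the widget when playing forward, delete it when stepping back."""
    headers = {
        "Accept": "application/json",
        "Content-Type": "application/json",
        "Authorization": bearer,
    }
    if forward:
        # only a few writable properties are sent; styles need extra mapping (see limitations above)
        payload = {
            "type": widget["type"],
            "text": widget.get("text", ""),
            "x": widget["x"],
            "y": widget["y"],
        }
        return requests.post(BASE_URL + "/boards/" + board_id + "/widgets",
                             headers=headers, json=payload)
    # navigating backwards: remove the widget from the replay board again
    return requests.delete(BASE_URL + "/boards/" + board_id + "/widgets/" + str(widget["id"]),
                           headers=headers)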

SDK

As we were able to build our replay cockpit using the REST API alone, we searched for a small use case to evaluate the Miro SDK. Wouldn't it be nice to implement a word counter which shows you, within a widget, the most used words, and once you click on a word all its occurrences on the board are highlighted?

Your Code

Miro plugins are loaded from a publicly accessible source (your code) and embedded into the Miro site via an iframe. Therefore you need to set up an HTML site and publish it to a publicly available location, in my case GitHub Pages. In the simplest form you provide an HTML page which loads some JavaScript code.

let icon =
    '<circle cx="12" cy="12" r="9" fill="none" fill-rule="evenodd" stroke="currentColor" stroke-width="2"></circle>'
miro.onReady(() => {
    miro.initialize({
        extensionPoints: {
            bottomBar: {
                title: 'Word count',
                svgIcon: icon,
                positionPriority: 1,
                onClick: () => {
                    miro.board.ui.openLeftSidebar('/my-first-miro-plugin/wordCount.html')
                },
            },
        },
    })
})

This simple JavaScript file defines an action bar icon and assigns the content of wordCount.html to the sidebar on the left.

Once you click on the icon, the content of wordCount.html is loaded into the left sidebar:

// word -> { text, count, elements } map built from all stickers on the board
var tokens = new Object();

function addToWordCounts(element) {
    // normalizeTitle() (not shown here) is assumed to strip the HTML markup
    // from the sticker text and normalize it for counting
    var text = normalizeTitle(element['text'])
    if (text == '')
        return;

    let parts = text.split(' ');
    for (const part of parts) {
        if (part in tokens) {
            // word already seen: remember the widget and increase the counter
            tokens[part].elements.push(element)
            tokens[part].count++;
        } else {
            let newElement = new Object();
            newElement.elements = [element];
            newElement.count = 1;
            newElement.text = part;
            tokens[part] = newElement;
        }
    }
}

async function calculateWordCounts() {
    // fetch all sticker widgets of the current board via the SDK
    let widgets = await miro.board.widgets.stickers.get()
    tokens = new Object();
    for (const widget of widgets) {
        addToWordCounts(widget)
    }
}

miro.onReady(() => {
    // subscribe on user selected widgets
    //miro.addListener(miro.enums.event.SELECTION_UPDATED, getWidget)
    calculateWordCounts()
})

Miro provides the global miro instance, where you can hook into Miro events and call Miro functions to access and modify board content.

Finally, you need to set up the application in the Miro settings. As for REST API access, you need to create an app and define the scopes and permissions; additionally, you specify the publicly available source code URL.

SDK - Limits and first impressions

This was not the coding experience I am used to, especially when you are not yet familiar with the SDK functions and need to try things out. Any time you update your code, you need to reload the Miro board, click on your icon and, in the worst case, click through some workflows to reach the point you want to test. As no local Miro dev environment is provided, the whole dev cycle gets bumpy. I worked a lot with the browser dev tools to discover the available functions on the current miro object and to call them to test the outcome before deploying the new code fragment to my public location.

Unfortunately, the full history of a sticker cannot be gathered. As a workaround we had the idea to record changes ourselves and write them to local storage or an external destination. However, Miro's change events are triggered when you create a sticker, but not when its text is changed.

Summary

The Miro REST API is a straightforward way to retrieve and modify basic Miro data. Unfortunately, it is still under development, and important data such as history cannot be gathered yet. If you need more control over the data or even need a widget on your board, the Miro SDK should be your choice. Coding within the SDK feels bumpy, and there is still no way to retrieve the full history of stickers. I'm looking forward to getting these features in future releases of Miro.
