Evolution Of Data Management And Analysis
Matías Millán, Head Of Engineering And Data Governance, Banco Hipotecario


I started my career in Computer Science back in the 90s, and from the very beginning, I was interested in data management and analysis that could provide real information to the areas responsible for management and decision-making.
In those years, as you can imagine, data ingestion and processing, as well as the generation of KPIs and dashboards, relied heavily on database techniques and knowledge, such as primary keys, indexes, and carefully tuned calculations aimed at achieving the best possible performance, since tooling was limited and processing times were considerable. It was the beginning of what later became BI and is currently Data Analytics and Big Data.
Data warehouses already existed, and terminologies such as OLAP and ROLAP were being discussed, as well as multidimensional information cubes, but there were limited tools to manage them. To give you an idea of processing times, a daily cube took an average of 8 hours to generate, and one with monthly information took more than 24 hours. In the meantime, we had to keep an eye on the process to make sure it didn't fail or stop, because if it did, we would have to start all over again.
Entering the new millennium, in the year 2000, the term Business Intelligence was commonly used. The methodology and data management definitions remained similar, but with more modern hardware and software, performance and data availability improved substantially.
In the first decade of the 2000s, more modern databases appeared, simpler and more performant ETL processes were developed, and data could be exploited with faster response times and better visualizations. The term Data Analytics also emerged, along with data modeling and machine learning, and specific tools with high processing capacity started to appear.
In the last decade, the focus shifted towards cloud data environments, which brought changes in regulations, data security, the types of services used, hardware structures and equipment, and the transformation of human resources towards these new systems.
Currently, these paradigm shifts and the technologies associated with them are no longer gradual but much faster. We have reached a point where, if planning and implementing a data ecosystem takes too long, it will become obsolete before it even goes into production. The new trend is MVP-style development: test, use, and constantly adapt, while validating whether these new models are truly the most suitable.
New services and tools, such as AI, are constantly being made available and adopted.
In summary, as happens in many aspects of life, everything has become a constantly changing flow of demands.
The challenge for us as humans is to adapt to these new times, in which we need to be constantly connected, updated, and skilled.