Data products: what are they and what are their characteristics?

Unsurprisingly, data is a valuable resource in software and business. Digital products that involve data are part of our daily lives, at work and at leisure, social networks being an obvious example. Some tools use data to improve the user experience, others use it to prioritize the development of new features or to recommend the shoes that go perfectly with what we were already looking to buy. Some products, however, facilitate an end goal through data, and it is this type of product, the data product, that we address in this text.

What are Data Products?

In the article “The Age of the Data Product”, data scientist Benjamin Bengfort defines data products as self-adapting, broadly applicable economic mechanisms that derive their value from data and generate more data in turn, influencing human behavior or making inferences and predictions. For DJ Patil, author of “Data Jujitsu: The Art of Turning Data into Product”, data products facilitate an end goal through the use of data. In “What is Data Science?”, Mike Loukides argues that a data application acquires its value from the data itself and, as a result, creates more data: it is not just an application with data; it is a data product.

Data products can therefore be used in different areas, contexts, and industries, but, as the definitions above make clear, they must aim at a specific result: increasing data accessibility, enabling business insights, democratizing access to data, saving resources, and so on. To achieve that result, the product can deliver raw data, transformed data, algorithms, predictions, decision support, or any other output that extracts value from the available data.

Data products can be a competitive differentiator for companies: when development is oriented toward answering strategic questions, they assist decision-making and generate business guidelines for each context. This type of product therefore requires a team with knowledge not only of business and software development but also of data analysis, data science, and data engineering. Furthermore, planning its construction must account for steps such as data cleaning, analysis, transformation, data mining, and statistical modeling, which makes the whole process more complex and demands great care.

Differences between “Data as a Product” and “Data Product”

A common misconception is that “data as a product” and “data product” are the same thing, but they are not. As mentioned above, data products are products that use data to solve a specific problem, while “data as a product” (DaaP) is a subset of this type of product. In more detail, DaaP refers to data products whose final deliverable is the data itself, raw or derived.

In her book “Data Mesh”, Zhamak Dehghani argues that teams that provide data must apply the product paradigm to the datasets they serve, ensuring that the data they hand over is more discoverable, secure, reliable, and so on. Examples of DaaP include building a data warehouse, developing an API whose purpose is to move data from one environment to another, or exporting and importing transformed data to a file system.
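To make the DaaP idea concrete, here is a minimal sketch of a curated dataset served as a product through a small read-only API. FastAPI, the endpoint path, and the sample records are illustrative assumptions rather than a prescribed implementation; a warehouse view or a governed file export would serve the same purpose.

```python
# A minimal sketch of "data as a product": a curated dataset exposed through a
# small read-only API. The dataset and endpoint name are hypothetical.
from fastapi import FastAPI

app = FastAPI(title="Sales dataset as a product")

# In a real DaaP, this would come from a governed, documented data pipeline.
SALES_BY_REGION = [
    {"region": "south", "month": "2022-07", "revenue": 125000.0},
    {"region": "north", "month": "2022-07", "revenue": 98500.0},
]

@app.get("/v1/sales-by-region")
def sales_by_region():
    """The curated dataset itself is the final deliverable of this product."""
    return SALES_BY_REGION
```

Served with `uvicorn module_name:app`, consumers can discover and reuse the dataset without direct access to the source systems.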

Important features of Data Products

Beyond deriving their value from data, data products need to give something back to the context they are part of, whether by solving a problem, optimizing processes, or generating important insights. To do so, this type of product makes inferences and predictions about the business to help users make decisions, and it is also capable of self-adaptation: the product evolves with its own results and as the number of users and the amount of available data grow.

Apart from these characteristics, there are qualities that a good data product should have, such as:

  • Reliability: this type of product needs especially robust tooling to identify performance issues, since it often handles large volumes of data and can become slow or unstable if not carefully developed.
  • Security: this is an essential attribute when dealing with data, and it is up to those responsible to protect the product from vulnerabilities and attacks. The product must also comply with laws and regulations established by government, such as the LGPD, and with the policies of the companies involved.
  • Scalability: the product must be planned with growth in mind; as it evolves, not only the number of users but also the volume of data tends to increase, and the storage and processing tools need to be prepared for that.
  • Good discovery and planning: as with any product, the discovery and planning steps are critical to the success of data products. For this type of product, however, attention needs to be even greater, since the research involves not only the business but also the data the business has to offer to build the desired solution.

Examples of Data Products 

In addition to the raw or derived data already mentioned, there are other examples of data products present in our daily lives: transportation apps that show the best route to our destination, or the playlists of recommended songs on audio streaming services that fit our taste surprisingly well even though we had never heard them. If we stop to observe, data products are dominating markets and making everyday life much easier.

For businesses, dashboards are a good example. They are typically used by companies to track metrics and support decision-making; social networks themselves provide dashboards so that companies, brands, and influencers can generate insights about views, followers, and clicks. Other products provide message and e-mail automation to simplify user management and, when used well, can increase sales. Here at Aquarela we also develop products that use data and artificial intelligence to bring value to companies.
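As an illustration of the kind of aggregation behind such dashboards, here is a minimal sketch in Python with pandas; the event log and column names are hypothetical.

```python
# A minimal sketch of the aggregation behind an engagement dashboard.
# The event log and column names below are hypothetical.
import pandas as pd

events = pd.DataFrame({
    "post_id": [1, 1, 2, 2, 2],
    "event":   ["view", "click", "view", "view", "follow"],
})

# Metrics a dashboard might surface: number of events per post, broken down by type.
metrics = events.groupby(["post_id", "event"]).size().unstack(fill_value=0)
print(metrics)
```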

Conclusion

Data products derive their value from data, delivering insights or predictions. They are present in many of the applications we use every day and are indispensable in business, where they can be the differentiator that takes a company to another level.

In other words, if a company seeks continuous improvement of its processes, it is essential to use data in its favor. When doing so, it is important to look for the characteristics and qualities described in this text, since this kind of product normally deals with sensitive information, must respect a series of standards, and needs to perform well to deliver a solution that truly generates a positive impact on the business.

What is Aquarela Advanced Analytics?

Aquarela Analytics is a pioneering Brazilian company and a reference in applying Artificial Intelligence in industry and large companies. With the Vortx platform and the DCIM methodology, it serves major global customers such as Embraer (aerospace & defence), Scania and Randon Group (automotive), Solar Br Coca-Cola (beverages), Hospital das Clínicas (healthcare), NTS-Brasil (oil & gas), and Votorantim Energia (energy), among others.

Stay tuned by following Aquarela on LinkedIn!


DALL-E 2 Artificial Intelligence and its impact on design

DALL-E 2, an artificial intelligence created by OpenAI, is a system that turns text and images into new images, building on a 12-billion-parameter version of GPT-3 and trained on a dataset of text-image pairs. It is the second version of this AI, developed from the first DALL-E, released in January 2021, which was disruptive but still had several limitations and training needs. In the second version, the generated images are preferred for matching the user's caption 71.7% of the time, with image resolution up to four times greater than in the original version.
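As an illustration, a request to generate an image from a text prompt might look like the sketch below. It assumes access to OpenAI's Images API through the legacy `openai` Python SDK; the prompt, the key placeholder, and the printed field are illustrative, and parameter names may differ across SDK versions.

```python
# Hedged sketch: generating an image with DALL-E 2 through OpenAI's Images API,
# using the legacy `openai` Python SDK. Prompt and key are placeholders.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; never hard-code real keys

response = openai.Image.create(
    prompt="an armchair in the shape of an avocado, studio photo",
    n=1,                # number of images to generate
    size="1024x1024",   # DALL-E 2 supports resolutions up to 1024x1024
)
print(response["data"][0]["url"])  # URL of the generated image
```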

The use of this AI in design will open up numerous possibilities. As an artificial intelligence created to help people express their creativity, it will affect several areas of design.

Graphic Design and Image Database/Image Bank

The use of DALL-E 2 as an image bank resource is a promising future alternative for design and advertising companies and agencies, thanks to the system's extraordinary ability to replicate, transform, and synthesize images, as well as to include or exclude elements according to the user's needs. However, permission for commercial use of the images is still pending: while the system is under development and in an experimental phase, only individual, non-commercial use is allowed.

Product and Interior Design

The system can contribute richly to several areas of product design and even interior design. Professionals can use it to create mood boards and generate new ideas, optimizing their process. Even so, its surrealist images can confuse non-professionals, leading them to believe in possibilities that do not correspond to what can actually be built.

Usage Policies and Licenses

Because the system can generate a huge diversity of realistic images, OpenAI reinforces its commitment to creating AI for the good of society: in the version now in testing, certain content is removed from training and DALL-E 2 is limited so that it does not generate adult, hateful, or violent content, among other categories. Realistic reproductions of real individuals, including public figures, are also being prevented.

While the AI is still in testing and research, it is not yet known for sure what licenses will govern the use of images generated with DALL-E 2.

Conclusion

As presented, DALL-E 2 has several prospects for improvement and implications for its use in design, and it can be an excellent tool for professionals, educational institutions, and companies in general. Although expectations about the use and future of the system are high and optimistic, it is still uncertain how it can be exploited.

Did you like the article? If so, leave a comment.

Learn more about AI applications in Industry 4.0.

References

https://thehardcopy.co/what-does-dall-e-mean-for-the-future-of-design/

https://openai.com/blog/dall-e/

https://openai.com/dall-e-2/

https://www.showmetech.com.br/inteligencia-artificial-cria-imagens/

https://labs.openai.com/policies/content-policy

https://openai.com/api/


The rise of the Self-taught Programmer

The desire to become a self-taught programmer or developer is at an all-time high right now, and the pandemic is partly responsible for the rapid growth of this path. During the pandemic, a lot of physical jobs were lost, but the tech industry experienced immense growth in revenue and job opportunities, and those opportunities started to attract unemployed people, or simply ordinary people looking to get a slice of the industry.

The tech industry offers some of the best working conditions and benefits around, the most famous being the ability to work from home, also known as the “home office”.

With all these shiny benefits, people started looking for easier ways to join the tech industry, essentially without the hassle of paying for university or college and studying for years on end, and this resulted in an explosion in the number of self-taught programmers.

What does it mean to be a Self-Taught programmer?

When you go to a university or college, you have a fixed curriculum, a ‘roadmap’ that shows you exactly what to study, in which order, and how to go about it. When you take the self-taught path, things are very different: you choose the roadmap yourself, perhaps with the help of friends or family, or a quick search on Reddit or YouTube. The whole idea is that you are in charge of putting together your own plan of action, which may not always be the best of plans; but when that plan succeeds, you can proudly call yourself a “self-taught” programmer.

The challenges

Although it seems easy to many, being self-taught is arduous because you are constantly battling your own doubts, along with exhausting unpredictability and uncertainty. Becoming a self-taught programmer takes time, patience, continuous learning, extensive research, building projects, and a lot of failing, but throughout the process you are building what is often referred to as a “coding muscle”.

I remember back in early 2019 when I decided to embark on the programming journey, full of excitement and cheer, ready to change the world with code, but little did I know what was in store for me. The process was daunting, and I doubted myself almost every day during those early stages. I would find myself asking questions like: who am I to do this? I'm over 30 and have no college or university degree, so where exactly do I fit in this vast world of programming? Which programming language should I learn? Do I want to learn back end or front end? And the list went on. I'm pretty sure that if you are a self-taught programmer, some of those questions sound familiar, because they are just some of the stages most self-taught programmers go through.

Why you should hire a self-taught programmer

Well, self-taught programmers may not have the necessary diplomas or degrees in the programming field, but I can assure you that they can outwork, outthink and outmaneuver many varsity or college graduates.

  • They have vigor, passion, and a huge inner drive to achieve 

For starters, if you are teaching yourself to code, you either really love it or you really want it with your whole being, because it takes time, a huge amount of patience, dedication, a lot of guts, and an immense work ethic. Most self-taught programmers possess all of these traits and more.

  • They have support and know where to get information.

Although it might seem like a lonely journey, self-taught programmers often form part of a community where they share problem-solving skills and ideas with each other. This can be an advantage for the employer, who is not hiring just one programmer: that programmer comes with a whole community of developers with various forms of expertise in different fields and technologies that they can always tap into.

  • Always ready to go

All new employees need to go through onboarding and training, which is a vital experience for the employee but gets more expensive the longer it drags on. Being self-taught more often than not means you have a decent amount of real-world experience picked up along your learning journey, whether in collaborative projects or freelancing gigs. With that experience, the developer will most likely be ready to start coding in less time and with minimal training, often saving the company time and money.

  • When all else fails, they always have plans C, D, E, and more if need be

Self-taught developers are skilled problem solvers; every great developer has an extensive history of solving problems. Universities give programmers a solid base in theory, but theory goes out the window when you encounter real-life coding problems and challenges.

A fundamental part of self-teaching is knowing how to untangle yourself when you are stuck in a situation, identifying problems, solving them, and learning from the process.

Read also: Industry 4.0: Web 3.0 and digital transformation

Conclusion

I hope this text doesn't sound too one-sided in favor of the self-taught programmer over the traditional university- or college-educated programmer; take it with a grain of salt. Research from the University of Oxford has shown that happy employees are up to 13% more productive, and self-taught developers are passionate about what they do, so there is no doubt that this is an advantage for the company. With all that said, I think we can agree that the self-taught programmer is here to stay! 🎓

Did you like the article? Leave your comment.


AI and Analytics strategic planning: concepts and impacts

The benefits and positive impacts of using data and, above all, artificial intelligence are already a reality in Brazilian industry. These benefits are evident in areas ranging from dynamic pricing in education and forecasting missed medical appointments to predicting equipment breakdowns and monitoring the auto parts replacement market. However, to achieve these benefits, organizations need to reach a level of analytical maturity adequate for each challenge they face.

In this article, we discuss the concept of AI and Analytics strategic planning and look at which characteristics of a scenario demand this type of project within companies' Digital Transformation journey towards Industry 4.0.

What is AI and Analytics strategic planning?

AI and Data Analytics strategic planning is a structuring project that combines a set of consultative activities (preferably carried out by teams with an external view of the organization) to survey scenarios, map analytical processes, and catalog digital assets (systems, databases, and others) in order to assess the analytical maturity of teams, departments, and the organization as a whole.

The result is a shared definition of vision, mission, values, policies, strategies, action plans, and good data governance practices to raise the organization's analytical maturity level in the shortest possible time and at the lowest possible cost.

Symptoms of low analytic maturity scenarios

Although there are many types of businesses, products, and services on the market, here we present recurring questions that help to characterize a company's analytical maturity and can prompt interesting reflections:

  1. Is it currently possible to know which analytics initiatives (data analytics) have already taken place and are taking place? Who is responsible? And what were the results?
  2. In analytics initiatives, is it possible to know what data was used and even reproduce the same analysis?
  3. Does data analysis happen randomly, spontaneously, and in isolation within departments?
  4. Is it possible to view all data assets or datasets available to generate analytics?
  5. Are there situations in which the same indicator appears with different values depending on the department in which the analysis is carried out?
  6. Are there defined analytic data dictionaries?
  7. What is the analytical technology stack?
  8. Are data analytics structuring projects being considered in strategic planning?

Other common problems

Organizational identity

Scenarios of low analytical maturity do not suffer from data quality problems in isolation. There are usually systemic problems involving the complexity of business processes, the level of training of teams, knowledge management processes, and, finally, the choice of ERP, CRM, and SCM technologies and how these transactional systems relate to one another.

Security Issues

Companies are living organisms that constantly evolve, with people moving across different areas. Over time, control of each employee's access levels is lost, so unauthorized people end up with access to sensitive information, or the opposite happens and people cannot access the data they need for their work.

Excessive use of spreadsheets and duplicates

Spreadsheets are among the most useful and important management tools, which is why they support so many processes. The big side effect of their excessive use is on maintaining the knowledge of each process: once two or more people are involved and the volume of information and updates starts to grow, it becomes difficult to manage knowledge that travels around in blocks of spreadsheets. In addition, many duplications occur, making it virtually impossible to consolidate large volumes of data securely.

What are the benefits of AI and Analytics strategic planning?

Data-driven management is expected to provide not just drawings and sketches of operations or market conditions, but a high-resolution photograph of present and future reality. It thus provides input for corporate strategic planning in the short, medium, and long term, with the following gains:

  • Procedural and technological readiness for data lakes projects and Advanced Analytics and AI labs.
  • Increased intensity of application of scientific techniques to businesses, such as comparative analysis, scenario simulations, identification of behavior patterns, demand forecasting, and others.
  • Increased accuracy of information.
  • Security of access to information at different levels.
  • Acceleration of onboarding processes, so that new team members learn the work scenario more quickly and begin to communicate more efficiently.
  • Greater data enrichment from increased interaction of teams from different sectors on analytical challenges.
  • Increased visibility into analytics operations and organization of digital assets for findability, accessibility, interoperability, and reuse.
  • An optimized change plan for data-driven corporate governance.
  • Incorporation of an analytics and AI mindset in different sectors.
  • Homogenization of data policies and controls.

AI and Analytics strategic planning – Conclusions and recommendations 

Preparing an AI and Analytics strategic plan is an important step toward the level of data governance that allows the intensive use of analytics and artificial intelligence in operations, since the high failure rate of analytical projects is linked to low quality of data and processes and even to the incorrect use of technologies (lack of training).

Structuring projects, such as AI and Analytics strategic planning, are, or at least should be, the first step in the digital transformation journey of traditional companies. We are convinced that in the future every successful company will have a clear, shared idea (vision, mission, and values) of what data means for it and its business model, as opposed to investing in data technology purely and simply because the competition does.

We believe that a focus on orchestrated (tidy and synchronized) data will be reflected in almost every area: in the range of services, revenue models, key resources, processes, cost structures, corporate culture, focus on clients and networks, and corporate strategy.

Last but not least, it is worth pointing out that, for a successful structuring to happen, a long-term holistic approach must be taken. This means investments in optimized technology, people, and processes to enable continued business growth.

How Aquarela has been acting

We develop new technologies and new data-driven business models with the vision that the amount and availability of data will continue to grow, taking business to new heights of optimization.

What we do specifically for companies:

  • We analyze data-generating enterprise ecosystems.
  • We determine analytic maturity and derive action fields for data-driven organizations and services.
  • We develop and evaluate data-based services.
  • We identify and estimate the data’s potential for future business models.
  • We design science-based digital transformation processes and guide their organizational integration.

For more information – Click here.

Did you like the article? Leave your comment.


Automotive Aftermarket: AI in the auto parts industry

Do you know the aftermarket segment of the auto parts industry? The term refers to the automotive replacement market, which keeps approximately 42.6 million vehicles (motorcycles, cars, trucks, and even agricultural machinery) operating in Brazil. The turnover of this industrial segment ranges between 75 and 85 billion reais per year (data from Issuu).

The automotive aftermarket involves a large amount of data on a large number of parts (SKUs) produced and sold over decades. This usually makes it difficult to identify market share and new business opportunities. But how can this challenge be overcome?

Market share case on the auto replacement aftermarket

To answer this question, we have prepared material presenting our success story in the automotive aftermarket segment, showing how advanced analytics and artificial intelligence strategies can bring great benefits to a commercial operation.

The case study covers our client's business problem, surrounded by challenges such as the difficulty of understanding the sales behavior of some part groups; the journey, marked by the development of a system capable of presenting the evolution of the organization's market share; and the results generated for our client.

Our goal is to assist marketing managers, commercial managers and administrators who work in large-scale operations.

Automotive Aftermarket – Conclusion

Identifying market share and new business opportunities in the auto parts sector is a challenge, but one that can be overcome with tools like AI and advanced analytics.

However, the implementation process is complex, demanding qualified, market-recognized artificial intelligence and data analytics providers.

Also read – How to choose the best AI and Data Analytics provider? 

Do you have any questions about our success case in the automotive aftermarket? Leave a comment.


Geographic Normalization: what is it and what are its implications?

There is great value in representing reality through visualizations, especially spatial information. If you have ever looked at a map, you know that the polygons that make up the political boundaries of cities and states are generally irregular (see Figure 1a). This irregularity makes analyses difficult and is something traditional Business Intelligence tools cannot handle well.

Notice the green dot in Figure 1b: it sits over polygon (neighborhood) no. 14, located between no. 16 and no. 18. Now answer: which region has the greatest influence on the green dot, neighborhood no. 16 or no. 18? Is the green dot representative of region no. 14, region no. 16, or no. 18?

To answer questions like these and to minimize the bias generated by visualizations with irregular polygons, the Vortx Platform performs what is known as geographic normalization, transforming irregular polygons into polygons of a single size and regular shape (see Figure 1c).

After this “geographic normalization”, it is possible to analyze the data of a given space using absolute statistics, not only relative ones, and without the distortions caused by polygons of different sizes and shapes.

Figure 1 – Source: Adapted from the Commercial and Industrial Association of Florianópolis – ACIF (2018)

Every day, people, companies, and governments make countless decisions that involve geographic space. Which gym is closest to home for me to enroll in? Where should we install the company's new distribution center? Where should the municipality place its health centers?

So, in today’s article, we propose two questions:

  1. What happens when georeferenced information is distorted?
  2. How close can our generalizations about space get?

Geographic normalization

Working with polygons and regions

Recall that the concept of a polygon comes from geometry and is defined as “a flat, closed figure formed by straight line segments”. When a polygon has all sides equal and all angles equal, we call it a regular polygon; otherwise, it is an irregular polygon.

We use the political division of a territory to understand its contrasts, usually delimiting nations, states, and municipalities, but we can also delimit regions according to other characteristics, such as the Caatinga region, the Amazon Basin, or even the Eurozone and Trump or Biden voter zones. All that is needed is to enclose a certain portion of space around some common characteristic. Regional polygons are therefore widely used to represent regions and the territorial organization within them.

Several market tools fill polygons with different shades of color according to each region's data, looking for contrasts among them. But be careful: if the sizes and shapes of the polygons are not constant, geographic biases may arise, making the visualization susceptible to misinterpretation.

Thus, the polygon approach becomes limited in the following aspects:

  • Uneven comparisons between regions;
  • The need to normalize indicators by population, area, or other factors;
  • No support for more granular analyses;
  • Greater attention required from analysts when making statements about certain regions.

Purpose of geographic normalization

The reason geographic normalization exists is therefore to overcome the typical problems of data analysis over irregular polygons, transforming the organization of the territory into a set of polygons (in this case, hexagons) of regular size and shape.

In the example below, we compare the two approaches:

1) analysis with mesoregional polygons; and 2) hexagons over the southeastern region of Brazil.

Figure 2 – Source: Aquarela Advanced Analytics (2020)

Geographic normalization seeks to minimize the analytical distortions generated by irregular polygons by replacing them with polygons of regular shape and size. This provides an elegant, visually pleasing, and precise alternative, capable of revealing previously unknown patterns.

Normalization also makes the definition of neighborhood between polygons clearer and simpler, which improves the fit of artificial intelligence algorithms that search for spatially autocorrelated patterns and events.

After all, according to the First Law of Geography:

“All things are related to everything else, but things close are more related than distant things.” 

Waldo Tobler
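In a regular hexagonal grid, those “close things” are easy to enumerate: every cell has exactly six immediate neighbors, which is what simplifies the neighborhood definition mentioned above. Here is a minimal sketch, assuming the cells are indexed by axial (q, r) coordinates; this convention is an illustrative choice, not the Vortx implementation.

```python
# Axial-coordinate offsets of the six neighbors of a hexagonal cell.
HEX_NEIGHBOR_OFFSETS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def hex_neighbors(q: int, r: int) -> list[tuple[int, int]]:
    """Return the axial coordinates of the six cells adjacent to (q, r)."""
    return [(q + dq, r + dr) for dq, dr in HEX_NEIGHBOR_OFFSETS]

print(hex_neighbors(0, 0))
# [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]
```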

Geographic normalization can also be done in other ways, for example with equilateral triangles or squares. Among these shapes, however, the hexagon introduces the least bias, since its shorter sides make it the closest approximation to a circle.

With normalization, it is possible to summarize statistics for the points (inhabitants, homes, schools, health centers, supermarkets, industries, etc.) contained within each hexagon, so that the area of analysis is constant and the resulting summaries are statistically meaningful, as in the sketch below. Mature analytics companies, with a robust and well-consolidated data lake, have an advantage in this type of approach. Also check out our article on How to choose the best AI or data analytics provider?
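The sketch below illustrates this summarization, assuming pointy-top hexagons indexed by axial coordinates and a simple equirectangular projection that is adequate only for city-scale examples; the sample points, hexagon size, and reference latitude are hypothetical.

```python
# Hedged sketch: binning geographic points into regular hexagons and counting
# how many points fall in each cell. Sample coordinates are illustrative.
import math
from collections import Counter

points = [(-25.43, -49.27), (-25.44, -49.28), (-25.42, -49.25), (-25.44, -49.27)]
HEX_SIZE_DEG = 0.01  # hexagon "radius" in degrees; choose it based on point density

def latlon_to_hex(lat, lon, lat0=-25.43, size=HEX_SIZE_DEG):
    """Map a (lat, lon) point to the axial (q, r) cell of a pointy-top hex grid."""
    # Equirectangular projection: good enough for a small, city-scale area.
    x = lon * math.cos(math.radians(lat0))
    y = lat
    # Pixel -> fractional axial coordinates (standard pointy-top formulas).
    q = (math.sqrt(3) / 3 * x - y / 3) / size
    r = (2 / 3 * y) / size
    # Cube rounding snaps the fractional coordinates to the nearest cell center.
    cx, cz = q, r
    cy = -cx - cz
    rx, ry, rz = round(cx), round(cy), round(cz)
    dx, dy, dz = abs(rx - cx), abs(ry - cy), abs(rz - cz)
    if dx > dy and dx > dz:
        rx = -ry - rz
    elif dy > dz:
        ry = -rx - rz
    else:
        rz = -rx - ry
    return (rx, rz)

# "Normalize" the points: one count per regular hexagon instead of per irregular polygon.
counts = Counter(latlon_to_hex(lat, lon) for lat, lon in points)
for cell, n in counts.items():
    print(f"hexagon {cell}: {n} point(s)")
```

In production, a dedicated hexagonal grid library (such as Uber's H3) would typically replace this hand-rolled indexing.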

Usage of normalized geography

Normalized geography can also be used in interactive maps. Maps of this type allow a very interesting level of zoom in the analyses, as we can see in the animation below, which shows a Vortx Platform visualization of schools in the city of Curitiba, Brazil.

The darker the hexagon, the greater the number of schools. Note that we can also access other data through the pop-up and change the size of the hexagon as desired.

“The greater the amount of point data available in a region, the smaller the possible size of the hexagons”. 

Limitations of normalized analysis

Like any representation of reality, models that use normalized analysis, although of great value in decision-making, do not completely replace the illustration of spatial data with irregular polygons, especially when:

  • There is a clear political division to be considered;
  • There is no reasonable amount of data;
  • There is no consensus on the size of regular polygons.

In addition, the computational cost of producing normalized maps must be taken into account, since processing depends not only on the number of observations of the analyzed phenomenon but also on the treatment of the geography under analysis. For example, conventional workstations can take hours to run basic geostatistical calculations for the 5,573 cities in Brazil.

Geographic Normalization – Conclusions and recommendations 

In this article we explained geographic normalization, its importance, its advantages, and the precautions needed when conducting spatial analyses. We also compared two important approaches to spatial data analysis. It is worth noting that these approaches are complementary to a better understanding of how data is distributed over space, so we recommend viewing the analyses from multiple facets.

We saw that, when the geographic space is laid out in an equitable way, a series of benefits to the analyses become feasible, such as:

  • Alignment of the size of views according to business needs;
  • Adaptation of the visualizations according to the availability of data;
  • “Fair” comparisons through absolute indicators for each region;
  • Observation of intensity areas with less bias;
  • Simpler definition of neighborhood between polygons, providing better adherence to spatial algorithms;
  • Greater accuracy in finding patterns and events that are autocorrelated in space;
  • Use of artificial intelligence algorithms (supervised and unsupervised) to identify points of interest that would not be identified without normalization. More information at: Application of Artificial Intelligence in georeferenced analyses.

Finally, every tool has a purpose, and georeferenced visualizations can lead to good or bad decisions.

Therefore, using the right visualization, together with well-chosen and well-implemented algorithms and an appropriate analytical process, can improve critical decisions and lead to the competitive advantages that matter so much in the face of current economic challenges.
