Automotive Aftermarket: AI in the auto parts industry

Do you know the Aftermarket segment of the auto parts industry? This term refers to the automotive aftermarket, which supports the continuity of operations of approximately 42.6 million vehicles (motorcycles, cars, trucks and even agricultural machinery) in Brazil. The turnover of this industrial segment ranges between 75 and 85 billion reais per year (data from Issuu).

The automotive aftermarket process involves a large amount of data on a large number of parts (SKUs) produced and sold over decades, which usually makes it difficult to identify market share and new business opportunities. But how can this challenge be overcome?

Market share case in the auto replacement aftermarket

To answer this question, we have prepared material presenting our success story in the automotive sector's Aftermarket segment, in which we show how advanced analytics and artificial intelligence strategies can bring great benefits to the commercial operation.

The case study addresses our client’s business problem, immersed in several challenges, such as the difficulty in understanding the sales behavior of some part groups; the journey, marked by the development of a system capable of presenting the evolution of the organization’s market share; and the results generated for our client.

Our goal is to assist marketing managers, commercial managers and administrators who work in large-scale operations.

Automotive Aftermarket – Conclusion

Identifying market share and new business opportunities in the auto parts sector is a challenge, but one that can be overcome with tools like AI and Advanced Analytics.

However, the implementation process is complex, demanding artificial intelligence expertise as well as qualified, market-recognized data analytics providers.

Also read – How to choose the best AI and Data Analytics provider? 

Do you have any questions about our success case in the automotive aftermarket? Leave a comment below.

What is Aquarela Advanced Analytics?

Aquarela Analytics is a pioneering Brazilian company and a reference in the application of Artificial Intelligence in industry and large companies. With the Vortx platform and the DCIM methodology, it serves important global customers such as Embraer (aerospace), Randon Group (automotive), Solar BR Coca-Cola (food), Hospital das Clínicas (health), NTS-Brazil (oil and gas) and Votorantim (energy), among others.

Stay tuned by following Aquarela's LinkedIn!


AI provider? How to choose the best AI and Data Analytics provider?

Choosing an artificial intelligence provider for analytics projects, such as dynamic pricing and demand forecasting, is, without a doubt, a process that should be on the table of every manager in the industry. If you are considering speeding up the process, one way out is hiring companies specialized in the subject.

A successful implementation of analytics is, to a large extent, the result of a well-balanced partnership between the internal teams and the teams of an analytics service provider, so this is an important decision. Here, we will cover some of the key concerns.

Assessing the AI provider based on competencies and scale

First, you must evaluate your options based on the skills of the analytics provider. Below are some criteria:

  • Consistent working method in line with your organization’s needs and size.
  • Individual skills of team members and way of working.
  • Experience within your industry, as opposed to the standard market offerings.
  • Experience in the segment of your business.
  • Commercial maturity of solutions such as the analytics platform.
  • Market reference and ability to scale teams.
  • Ability to integrate external data to generate insights you can’t have internally.

Whether developing an internal analytics team or hiring externally, the fact is that you will probably spend a lot of money and time with your analytics and artificial intelligence provider (partner), so it is important that they bring the right skills to your department's business or process.

Consider all the options in the analytics offering

We have seen many organizations limit their options to Capgemini, EY, Deloitte, Accenture and other major consultancies, or simply develop internal analytics teams.

But there are many other good options on the market, including Brazilian companies whose rapid growth is worth paying attention to, mainly within the country's main technological centers, such as Florianópolis and Campinas.

Adjust expectations and avoid analytical frustrations

We have seen, on several occasions, the frustrated creation of fully internal analytics teams, whether for configuring data lakes, data governance, machine learning or systems integration.

The scenario for the adoption of AI is similar, at least for now, to the time when companies developed their own internal ERPs in data processing departments. Today, of the 4,000 largest technology accounts in Brazil, only 4.2% maintain the development of internal ERPs, and those are predominantly banks and governments, which makes total sense from the point of view of strategy and core business.

We investigated these cases a little more and noticed that there are at least four factors behind the results:

  • Non-data-driven culture and vertical segmentation prevent the necessary flow (speed and quantity) of ideas and data that make analytics valuable.
  • Waterfall project management performed as if the teams were creating physical artifacts or ERP systems; this style is not suitable for analytics.
  • Difficulty in hiring professionals with knowledge of analytics in the company’s business area together with the lack of on-boarding programs suited to the challenges.
  • Technical and unforeseen challenges happen very often, so it is necessary to have resilient professionals used to this "cognitive capoeira" (as we call it here). Real-life datasets are never as ready and calibrated as those of machine learning examples such as the Titanic passengers dataset. They usually have outliers (What are outliers?), are tied to complex business processes and are full of rules, as in the example of the dynamic pricing of London subway tickets (article in Portuguese).
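The outlier problem mentioned in the list above can be made concrete with a few lines of code. Below is a minimal sketch using Python's standard library; the data values and the classic 1.5×IQR threshold are illustrative assumptions, not numbers from any client project:

```python
from statistics import quantiles

# Made-up measurements with one obvious anomaly.
data = [10, 12, 11, 13, 12, 11, 95]

# Classic 1.5 * IQR rule: flag values far outside the interquartile range.
q1, _, q3 = quantiles(data, n=4)
iqr = q3 - q1
outliers = [x for x in data if x < q1 - 1.5 * iqr or x > q3 + 1.5 * iqr]
print(outliers)  # [95]
```

In real projects the threshold itself is a business decision; a rule like this only surfaces candidates for human review.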

While there is no single answer on how to deploy robust analytics, governance and artificial intelligence processes, remember that you are responsible for the relationship with these teams, and for the relationship between the production and analytics systems.

Understand the strengths of the analytics provider, but also recognize their weaknesses

It is difficult to find professionals with functional and technical depth in the market, especially if the profile of your business is industrial, involving knowledge of rare processes, for instance, the physical-chemical process for creating brake pads or other specific materials.

But, like any organization, analytics providers can also have weaknesses, such as:

  • Lack of readiness in the implementation of analytics (methodology, platform) to ensure that you get a solution implemented fast.
  • Lack of a migration strategy, data mapping and ontologies.
  • No guarantee of knowledge transfer and documentation.
  • Lack of practical experience in the industry.
  • Difficulty absorbing the client's business context.

Therefore, knowing the provider’s methods and processes well is essential.
The pillars of a good Analytics and AI project are the methodology and its technological stack (What is a technological stack?). Therefore, seek to understand the background of the new provider and ask about their experiences with other customers of a size similar to yours.

Also, try to understand how this provider solved complex challenges in other businesses, even if these are not directly linked to your challenge.

Data Ethics

Ethics in the treatment of data is a must-have, so we cannot fail to highlight this compliance topic. Data has been at the center of management's attention for a long time, and new laws are now being created, such as the GDPR in Europe and the LGPD in Brazil.

Pay attention to how your data will be treated, transferred and stored by the provider, and check whether its name is clean in Google searches and even with public organizations.

Good providers are those who, in addition to knowing the technology well, have guidelines for dealing with your business's information. For example, they:

  • have very clear and well-defined security processes;
  • use end-to-end encryption;
  • track their software updates;
  • respect NDAs (non-disclosure agreements) – when it comes to data, NDAs should not be treated as mere boilerplate;
  • keep all communication channels aligned and segmented by security levels;
  • are well regarded by the data analysis community.

Conclusions and recommendations

Choosing your Analytics provider is one of the biggest decisions you will make for your organization’s digital transformation.

Regardless of which provider you choose for your company, it is important that you assemble an external analytics consulting team that makes sense for your organization and that has a successful, proven technological and business track record that supports your industry's demands.


What is a technological stack?

A stack represents a set of integrated systems for running a single application without additional software. Above all, one of the main goals of a technology stack is to improve communication about how an application is built. The chosen technology package may contain:

  • the programming languages used;
  • the structures and tools a developer needs to interact with the application;
  • known performance attributes and limitations;
  • a survey of the strengths and weaknesses of the application in general.

As a rule, stacks must have a specific purpose. For instance, if you look at the web 3.0 stack (what is web 3.0?), you will see how different it is from a data analysis stack in the statistical R language. That is, when building a stack you should always ask: what is the underlying business purpose?
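To make the idea tangible, a stack can even be written down as structured data and queried during planning. The sketch below is a hypothetical Python example; every technology listed is an illustrative choice, not a description of any real stack:

```python
# Hypothetical stack description; all entries are illustrative assumptions.
stack = {
    "purpose": "data analysis",
    "languages": ["Python", "R"],
    "frameworks": ["scikit-learn"],
    "storage": ["PostgreSQL"],
}

def uses(stack: dict, tech: str) -> bool:
    """Check whether a given technology appears anywhere in the stack."""
    return any(tech in items for items in stack.values() if isinstance(items, list))

print(uses(stack, "Python"))  # True
print(uses(stack, "COBOL"))   # False
```

Even a simple inventory like this lets management answer hiring and risk questions ("do we depend on technology X?") mechanically.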

Where does this term come from?

The term comes from the software development community, and along with it, it is also quite common to speak of a full-stack developer.

A full-stack developer is, in turn, a professional who knows how to work across all the technology layers of a 100% functional application.

Why is the technological stack so important?

Firstly, on the one hand, the accountant has all company transactions registered for financial management; on the other hand, developers and project leaders need the equivalent information about the development team.

Secondly, developers cannot manage their work effectively without at least knowing what is happening and what the available technology assets are (systems, databases, programming languages, communication protocols, and so on).

The technological stack is just as important as the inventory control of a company that sells physical products. It is in the technological stack that both the business strategy and the main lessons (maturity) from the system tests the company has been through are concentrated.

The technological stack is the working dictionary of developers, in the same way that data analysts look at their data dictionaries to understand the meaning of variables and columns. It is an important maturity item in the governance of organizations.

Without prior knowledge of the technological stack, management is unable to plan hiring, risk mitigation plans, plans to increase service capacity and, of course, the strategy for using data in the business area.

Technology stacks are particularly useful for hiring developers, analysts and data scientists.

“Companies that try to recruit developers often include their technology stack in their job descriptions.”

For this reason, professionals interested in advancing their careers should pay attention to the strategy of personal development of their skills in a way that is in line with market demand.

Technological stack example

Take the professional social network LinkedIn, for example: it relies on a combination of frameworks, programming languages and artificial intelligence algorithms to stay online. Here are some examples of technologies used in its stack:

Technological Stack – Linkedin for 300 million hits – Author Philipp Weber (2015)

Is there a technological stack for analytics?

Yes. Currently, the areas of analytics, machine learning and artificial intelligence are known for the massive use of information systems techniques and technologies. Likewise, analytical solutions require very specific stacks to meet the functional (what the system should do) and non-functional (how the system should do it – security, speed, etc.) business requirements of each application.

Like the foundation of a house, the order in which the stack is built is important and is directly linked to the maturity of the IT and analytics teams, so we recommend reading this article – The 3 pillars of the maturity of the analytics teams (in Portuguese).

In more than 10 years of research into different types of technologies, we went through several technological compositions until we reached the configuration of the current Aquarela Vortx platform. The main results the stack brings to customers are:

  • reduction of technological risk (learning is already incorporated in the stack);
  • up-to-date technology;
  • speed of deployment and systems integration (go-live);
  • maturity in maintaining the systems in production;
  • quality of the interfaces and flows in the production environment, as the stack makes maintaining technicians' knowledge more efficient.

Conclusions and recommendations

In conclusion, we have presented our vision of the technological stack concept and shown that it is also important for analytical projects, which, in turn, impacts strategic planning. Still, it is worth bearing in mind that technological stacks, just like businesses, are always evolving.

Success in defining stacks is directly linked to the maturity of the IT and analytics teams (The 3 pillars of the maturity of the analytics teams – in Portuguese).

Regardless of the sector, the decisions involved in shaping the technological stack are a factor of success or failure in IT and analytics projects, because they directly interfere in the operation and in the business strategy.

Finally, we recommend reading this other article on mitigating technological risk with support from specialized companies – (How to choose the best data analytics provider? in Portuguese).


Human Resources Optimised with Advanced Analytics

Today we are going to present some insights related to employees' work satisfaction obtained with Advanced Analytics tools and techniques. As the source for this study, we used the data made available at this link by the data scientist Ludovic Benistant, who carried out important anonymization work. Some pictures have Brazilian Portuguese words, sorry about that! Let's go!

Research Questions

Following the DCIM (Data Culture Introduction Methodology) to guide this research, we came up with the following questions:

  • What factors have the greatest influence on employee satisfaction?
  • What are the main satisfaction scenarios that exist?
  • What are the main patterns associated with key satisfaction scenarios?
  • What factors influence professionals to leave?

Data Characteristics

In total, 14,999 employees were evaluated, considering the following variables, already sanitized by our scripts:

  • Employee satisfaction level (0 to 10) – Probably filled out by the employee;
  • Last evaluation (0 to 10) – Probably filled in by a manager;
  • Number of projects (2 to 7) – Number of projects in which the employee acted;
  • Average monthly hours (96 to 310);
  • Time spent at the company (2 to 10) – How long the person has already worked at the company;
  • Whether they have had an accident at work – (Yes = 1 / No = 0);
  • Whether they have had a promotion in the last 5 years (Yes = 1 / No = 0);
  • Salary range (Low = 1, Medium = 2, High = 3) – Note: actual values were not made available;
  • Left the company (Yes = 1 / No = 0).
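As a side note, the variable list above can be sketched as a record type with a basic sanity check on the documented ranges. This is our own illustrative Python sketch; the field names and the sample values are assumptions, not the original dataset's column names:

```python
from dataclasses import dataclass

# Hypothetical record layout mirroring the variables listed above;
# field names are our own, not the dataset's real column names.
@dataclass
class Employee:
    satisfaction: float        # 0 to 10
    last_evaluation: float     # 0 to 10
    n_projects: int            # 2 to 7
    monthly_hours: int         # 96 to 310
    years_at_company: int      # 2 to 10
    had_accident: bool
    promoted_last_5y: bool
    salary_band: int           # 1 = low, 2 = medium, 3 = high
    left_company: bool

def sanity_check(e: Employee) -> bool:
    """Reject records outside the documented ranges."""
    return (0 <= e.satisfaction <= 10 and 2 <= e.n_projects <= 7
            and 96 <= e.monthly_hours <= 310 and e.salary_band in (1, 2, 3))

sample = Employee(7.2, 8.0, 4, 180, 3, False, False, 2, False)
print(sanity_check(sample))  # True
```

Checks like this are the "sanitized by our scripts" step mentioned above: they catch impossible values before any modelling starts.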

Number of people per department

[Figure: number of people per department]

Frequency Analysis / Distribution of Satisfaction

[Figure: distribution of overall satisfaction levels]

The highest concentration of satisfaction is within the range of 7 to 9, and there are few people with satisfaction scores between 1.5 and 3.0.

Results

Ranking of Influence Factors in Work Satisfaction

By processing this dataset with the VORTX Big Data algorithm, we obtained the following ranking of influence factors (relevance scores in parentheses):

  1. Average monthly hours (50)
  2. Time spent at the company (21)
  3. Number of projects (20)
  4. Salary Range (13)
  5. Left the company (10)
  6. Whether they have had a promotion in the last 5 years (9)
  7. Whether they have had an accident at work (9)

The factor “Last evaluation” had no relevant influence, so it was automatically discarded by VORTX.

Satisfaction Scenarios

The table below shows the result of the processing, with the separation of employees into groups done automatically by the platform. In all, 120 groups were found; here we will focus on only the 20 most relevant, leaving the others aside as isolated cases outside the focus of the analysis.

[Table: the 20 most relevant satisfaction scenarios]

Model Visual Validation

Typically, as far as we have experienced, managers are not sure about a machine's ability to automate the discovery of insights. Therefore, as a proof of the model, we chose to show the raw data visually to demonstrate the insights mentioned above.


The pattern of hours worked by the 588 people in scenario 9 (very dissatisfied). X Axis = Monthly working hours.


The pattern of hours worked in the largest scenario (1), which has 4085 employees, a good job satisfaction and a low level of job evasion. X Axis – Monthly working hours

In the view below, each circle represents an employee in four dimensions:

  • The level of satisfaction on the Y axis.
  • Average hours per month on the X axis.
  • Orange colors for people who left the company and blue for those who remain.
  • Circle size represents the number of years in the company.

[Figure: satisfaction vs monthly hours for the whole organization]

Alright, we have just seen the overall pattern for the whole organization; so what happens if we look at it by department?

[Figures: the same view split by department – accounting and IT; management to product; R&D and support; technical]

Conclusions and Recommendations

This study sheds some light on the improvement of human resource management, which is at the heart of today's businesses. Applying data analytics algorithms in this area makes it possible to automate and accelerate the process of pattern discovery in complex environments with, say, 50 variables or more (here we used just a few). Meanwhile, the search for patterns in traditional BI continues to be purely artisanal work, with a well-known limitation of about 4 dimensions per attempt (read more on this in Understanding the differences between BI, Big Data and Data Mining). The automation of discovery is an extremely important step in predictive analytics – in this case, predicting the evasion of highly qualified professionals and possible dissatisfactions overlooked by management.

With VORTX’s ability to discover the different scenarios, we were able to analyze the data and conclude that:

  • People in groups 1 and 2 (55% of the company) have reasonable work satisfaction with a weekly load of 50 hours on average, without having received a promotion or suffered an accident at work.
  • The pattern persists in all departments.
  • The most satisfied groups among the 20 largest were 7 and 10, who worked more than 247 hours a month and took on several projects but, as they did not receive a promotion, left the company. These people should be retained, since they seem to be highly qualified.
  • Group 16 proves that it is possible to earn a good salary and still be dissatisfied. These 77 people should be interviewed to identify the root cause of such dissatisfaction.
  • The cut-off line for the employees who left is a minimum of 170 and a maximum of 238 hours worked per month. People with more than 3.5 years at the company work harder and are more satisfied.
  • Monthly hours above 261 resulted in very low levels of satisfaction.
  • Monthly hours below 261 combined with more than 3 projects result in high job satisfaction.
  • Scenario 15 shows the importance of promotion over the last 5 years of work.
  • Those with more than 5 projects show decreased satisfaction; the ideal number is between 3 and 5. Of course, to better interpret this indicator it is necessary to understand what the number of projects represents for different departments.
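The cut-off figures quoted above come from VORTX's scenario table. As a hedged illustration of how such a band could be derived from raw records, here is a minimal Python sketch on made-up (monthly hours, left company) pairs:

```python
# Fictitious (monthly_hours, left_company) records; NOT the study's data.
records = [(160, False), (170, True), (200, True), (238, True), (250, False)]

# The hours band spanned by the people who actually left.
left_hours = [hours for hours, left in records if left]
print(min(left_hours), max(left_hours))  # 170 238
```

On the real dataset the same min/max over the "left" subset yields the 170–238 band per scenario; the point is that once groups are discovered, each descriptive statistic is a one-liner.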

For managers, collecting as many indicators as possible, continuously and in all areas, is always good. More variables that would enrich this model are:

  • The distance between employee’s home and work.
  • The average time that is taken from home to work.
  • The number of children.
  • The number of phone calls or emails sent and received.
  • Gender and age and the reason for leaving the job.

We hope this information is useful for you in some way. If you find it relevant, share it with your colleagues. If in doubt, contact us! A big hug and success in developing your own HR strategy!



Big Data Scenario Discovery, why is it super useful for decision making?

Hi everyone! In today's demonstration, we are going to show how Big Data scenario discovery can help decision making in a profound way across various sectors. We use AQUARELA VORTX Big Data, a tool with groundbreaking technology in the machine learning field. The dataset used for the experiment was presented in the previous post about Big Data country auto-segmentation (clustering). The differences here are that this one also includes the Gini Index (found later on), removes the electrification rate in rural areas, and seeks systemic influences towards a GOAL – in this case the Human Development Index – whereas the previous segmentation just grouped similar countries according to their general characteristics.

The key questions for the experiment:

  1. How many Human Development Index scenarios exist in total? And which countries belong to them?
  2. Amongst 65 indexes, which of them have the most influence in defining a High or Low Human Development Index?
  3. What is the DNA (set of characteristics) of a High and Low Human Development scenario?

Alright, hang on for a minute! Before you see the results, take a look at all the variables analysed in the previous post. Then try to figure out by yourself, using your intuition, what the answers to these 3 questions would be. This is a very fun and very useful cognitive exercise for scenario validation. OK?

Results after pushing the Discoverer button:

[Figure: overall HDI distribution]

This is the overall distribution of the 188 countries, where most countries present an HDI between 0.65 and 0.75, and very few above 0.90. In total, there are 15 different HDI scenarios, of which the first 3 correspond to more than 94% of the total, and that is what we will focus on.
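The distribution statement above can be reproduced mechanically by bucketing HDI values. The sketch below uses fictitious HDI numbers and 0.05-wide buckets purely to illustrate the counting, not the real 188-country data:

```python
from collections import Counter

# Fictitious HDI values, NOT the real 188-country dataset.
hdi = [0.44, 0.55, 0.66, 0.68, 0.70, 0.72, 0.74, 0.80, 0.91, 0.94]

# Round each value to the nearest 0.05 to form buckets, then count.
bins = Counter(round(v * 20) / 20 for v in hdi)
bucket, count = bins.most_common(1)[0]
print(bucket, count)  # 0.7 3
```

The densest bucket in the toy data sits around 0.70, echoing the shape of the real distribution described above, where the 0.65–0.75 range dominates.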

Scenario 1

The most common scenario and the average HDI

Scenario 2

Countries with the lowest HDI

Scenario 3

Countries with the highest HDI

Where are they located?

[Figure: map of the countries in each scenario]

What factors influence HDI the most and the least?

[Figure: ranking of influence factors on HDI]

The list marks the top and bottom 10 factors. The factor “Intimate or nonintimate partner violence ever experienced, 2001–2011” was automatically removed from the ranking, as it does not correlate with HDI.

What is the DNA of each main scenario?


All factors presented at once. Note that the scales on the X axis change dynamically when hovering the mouse over the VORTX data scope screen.


Drilling down into the DNA

Under-Five Mortality rates vs HDI


Filtering the visualisation by the most relevant factor and HDI (HDI is the focus of the analysis, so it has the darker colour). Here we see that the countries with the highest HDI have the lowest under-five mortality rates.

Gender Inequality Rate vs HDI

[Figures: gender inequality rate vs HDI]

Gross National Income (GNI) per capita vs HDI

[Figures: GNI per capita vs HDI]

Insights and Conclusions of the study

The possibilities for generating new knowledge from this Big Data strategy are endless, but we focused on just a few questions and a few screenshots to demonstrate its value. During this research, we found it interesting to see the machine autonomously confirming some previous intuitions while breaking some preconceptions. It is important to mention that we are not measuring causation, as if one factor led to another; the results show systemic correlations only. Here are some that caught our attention:

  • Gender inequality playing a strong role, with an inverse correlation to the Human Development Index, while we live through the transition from the industrial age to the information age, where knowledge is surpassing the physical differences between genders.
  • Research and development having a direct correlation to HDI.
  • The United States having its own scenario due to its unique systemic characteristics.
  • Gross National Income (GNI) per capita leading the ranking, with values around 40 thousand dollars.
  • Public expenditure ranking ahead of education-related indexes.

Business applications

Applying the same questions we had at the beginning of the article, let's see how they would look for different business scenarios:

Sales

  • How many scenarios exist for your sales? Which customer segments belong to each scenario?
  • Amongst several business factors, which of them have the most influence to define a High or Low revenue?
  • What is the DNA (characteristics) of a High and Low revenue scenario?

Industry

  • How many production/maintenance scenarios exist for your production line? Which processes belong to each scenario?
  • Amongst several production factors, which of them have the most influence to define a High or Low outcome or High or Low maintenance/costs?
  • What is the DNA (characteristics) of a High and Low production/maintenance scenario?

Healthcare

  • How many patient scenarios exist for a specific disease or medical condition? Which patients belong to each scenario?
  • Amongst several patient characteristics, which of them have the most influence to result in High or Low levels of a specific disease or medical condition?
  • What is the DNA (characteristics) of High and Low medical condition scenarios?

All in all, we hope this article helps you land smoothly in the newest territories of machine learning. In case you need more information on how this solution applies to your business scenario, please let us know. If you found this analysis interesting and worth spreading, please share it. Super thanks on behalf of Aquarela's team!
