Internet of Things, Big Data And How They Affect Your ERP
Wednesday, October 18th, 2017
Enterprise resource planning (ERP) systems have long been used by the food industry, but with advances in technology, the functionality they provide food businesses is gradually changing. Richard Haugen, VP, Value Engineering, SAP, addresses this and more.
Adapting To Change
The long-standing role of ERP is about to change. Enterprises of all sizes are struggling with the explosive growth of data and supply chain complexity from multiple sources. They are also realising that they need to deal with more data, at a faster rate, than anyone would have guessed even two to three years ago.
There is no single driver behind this change. The change in the food industry is being driven by a confluence of new and emerging technologies and the rapid growth of connectivity. Add to this the increasing requirements for ingredient-level traceability, distribution channels that did not even exist a few years ago, and the push for shorter replenishment times, and you have a set of demands that older, traditional systems simply cannot keep up with.
Imagine the following scenario: a food manufacturer is notified by a vendor that some bulk spice delivered two weeks ago needs to be recalled. The Quality Assurance manager looks into the ERP system and identifies the affected batch/lot number of the spice and where it was issued to production.
To her horror, the identified batch had all been used and shipped out as finished product. After a rushed meeting with the corporate managers and marketing department, the manufacturer issues a recall to their customers and readies a public announcement.
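The trace the QA manager performs is essentially a lookup across two linked records: which production batches consumed the recalled ingredient lot, and which customers received those batches. A minimal sketch of that logic follows; all table, lot and customer names here are hypothetical, and a real ERP would run this as queries against its traceability tables rather than Python lists.

```python
# Hypothetical ERP-style records linking an ingredient lot to production,
# and production batches to shipments. Names and data are illustrative.

# Ingredient lot -> production batches that consumed it
issues_to_production = [
    {"ingredient_lot": "SPICE-4711", "production_batch": "FG-1001"},
    {"ingredient_lot": "SPICE-4711", "production_batch": "FG-1002"},
    {"ingredient_lot": "SPICE-9999", "production_batch": "FG-1003"},
]

# Finished-goods batch -> customer shipments
shipments = [
    {"production_batch": "FG-1001", "customer": "Distributor A"},
    {"production_batch": "FG-1002", "customer": "Retailer B"},
    {"production_batch": "FG-1003", "customer": "Retailer C"},
]

def trace_recall(recalled_lot):
    """Return the customers who received product made from the recalled lot."""
    batches = {r["production_batch"] for r in issues_to_production
               if r["ingredient_lot"] == recalled_lot}
    return sorted({s["customer"] for s in shipments
                   if s["production_batch"] in batches})

print(trace_recall("SPICE-4711"))  # -> ['Distributor A', 'Retailer B']
```

The key point is the granularity: the trace only works if every ingredient issue and every shipment was recorded at lot level in the first place, which is exactly the data volume burden discussed later in the article.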
They communicate this news to all affected distributors, retailers and partners to whom they have distributed the product. With investments in IoT systems, they can notify their IoT partner who then pushes the notification out to all connected machines.
Shortly thereafter, a consumer with an IoT-enabled refrigerator walks by and notices an alert on her screen. The refrigerator had scanned the barcode of the product when she put it away after shopping. She is now notified that the product is on recall, advised not to use it, and provided with instructions on how to return it. The consumer is also asked if she would like to receive a coupon to compensate her for the inconvenience. She clicks yes and a coupon is loaded onto her smartphone.
Let’s take another scenario. A food manufacturer works with an IoT partner to monitor which of their products are typically put into and taken out of a refrigerator within one minute of another product. They overlay that data with the day of week, time of day, weather, and geo-demographic information, and examine which other products tend to be used at the same time. The marketing department spots trends that it could not have picked up from simple in-store register data.
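The co-occurrence analysis described above boils down to counting product pairs whose refrigerator events fall within a short time window of each other. The sketch below shows one simple way this could be done over a hypothetical event stream; the products, timestamps and one-minute window are illustrative assumptions, not a real IoT vendor's API.

```python
from datetime import datetime, timedelta
from collections import Counter

# Hypothetical IoT refrigerator events: (product, time of the door event)
events = [
    ("salsa",     datetime(2017, 10, 18, 18, 0, 10)),
    ("tortillas", datetime(2017, 10, 18, 18, 0, 40)),
    ("milk",      datetime(2017, 10, 18, 7, 30, 0)),
    ("cereal",    datetime(2017, 10, 18, 7, 30, 20)),
    ("salsa",     datetime(2017, 10, 19, 19, 5, 0)),
    ("tortillas", datetime(2017, 10, 19, 19, 5, 30)),
]

def co_occurrences(events, window=timedelta(minutes=1)):
    """Count product pairs whose events occur within `window` of each other."""
    pairs = Counter()
    ordered = sorted(events, key=lambda e: e[1])
    for i, (prod_a, t_a) in enumerate(ordered):
        for prod_b, t_b in ordered[i + 1:]:
            if t_b - t_a > window:
                break  # events are time-ordered, so later pairs can't qualify
            if prod_a != prod_b:
                pairs[tuple(sorted((prod_a, prod_b)))] += 1
    return pairs

print(co_occurrences(events).most_common(1))
# -> [(('salsa', 'tortillas'), 2)]
```

In practice these counts would then be segmented by the day-of-week, weather and geo-demographic attributes the article mentions, which is where the data volume multiplies.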
Making Planning A Reality
How realistic are these scenarios? The data is all there, being collected at an ever-increasing number of points along the product life cycle. The challenge for most companies is that although the data is ‘all there’, it is not in a form that can realistically be accessed and utilised for planning, marketing and supply chain management. So, where is the bottleneck preventing the use of this valuable data?
The answer is volume of data. Traditional ERP system architecture based on outdated technology will simply choke on the volumes of data that IoT can deliver. Compound this with the proliferation of distribution channels (e.g. wholesale, broker, direct-to-consumer e-commerce and third-party e-commerce) and the granularity of data required for ingredient tracking, and most systems being used in the food industry will begin to drown in the volume of data being collected and eventually become unusable.
This has forced IT managers to typically leave the data in stand-alone data silos of manageable sizes that then need to be integrated and consolidated into reporting cubes before anyone can even start to use the data to build consolidated information on which decisions can be made.
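The consolidation step the article describes, pulling rows out of per-channel silos and rolling them up into a single reporting table before anyone can query them, can be sketched in a few lines. The silo names, SKUs and unit counts below are purely hypothetical.

```python
# Sketch of consolidating per-channel data silos into one reporting table.
# Silo contents and field names are hypothetical.

wholesale_silo = [{"sku": "A1", "units": 120}, {"sku": "B2", "units": 80}]
ecommerce_silo = [{"sku": "A1", "units": 45}, {"sku": "C3", "units": 12}]

def consolidate(*silos):
    """Roll per-channel rows up into one SKU-level reporting table."""
    totals = {}
    for silo in silos:
        for row in silo:
            totals[row["sku"]] = totals.get(row["sku"], 0) + row["units"]
    return totals

print(consolidate(wholesale_silo, ecommerce_silo))
# -> {'A1': 165, 'B2': 80, 'C3': 12}
```

At toy scale this is trivial; the article's point is that at IoT scale, running such batch consolidations across many silos before any reporting is possible becomes the bottleneck itself.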
What can we do? Continuing to pay for servers, expanded database storage, expensive integrations and custom reporting platforms leads to increased expenses and points of failure, and that is not something most executives are interested in pursuing. Simply putting your servers in a hosted cloud environment does not help much either, as traditional software architectures and data structures were all written with the presumption of a spinning hard disk environment.
Another contributing factor is the long lead time between development releases of traditional systems. It is not unusual for major releases of on-premise software to be on an 18-24 month cycle. This translates into systems that cannot respond to dynamically changing requirements in a timely fashion, and it compounds the problem with potentially expensive system customisation, which risks revision-locking your system and keeping you from upgrading in the future.
There is an alternative: cloud-based ERP is typically developed on a much more rapid time scale—sometimes on a quarterly release cycle. This enables customers to leverage new technologies and innovations to meet their ever changing challenges.
The answer to the data volume problem is in-memory databases. In-memory platforms are designed to hold big data. When delivered via the cloud, they provide the economically sound scalability and performance that traditional systems just cannot match.
Some vendors, such as SAP with its HANA platform, are looking to hold these massive amounts of data on the same technology platform as the ERP itself. This is a revolutionary approach: it enables data visualisation in ways never thought possible, and at the same time simplifies the architecture and lowers the total cost of ownership.
When an ERP system is built upon an in-memory database, it is not constrained by the physical limits of reading and writing data to and from a hard disk. These systems are designed to handle the ever-increasing data volumes of the growing IoT world. A little-known fact is that software written for in-memory databases typically requires as much as 30 percent less code than disk-based software.
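One reason in-memory, column-oriented platforms suit this workload is that an analytical query can scan just the columns it needs, held contiguously in RAM, instead of pulling whole rows back from disk. The toy sketch below illustrates the column-scan idea only; it is not how HANA or any particular product is implemented, and the data is invented.

```python
# Illustrative column-store sketch: each field is a separate in-memory
# array, so an aggregation touches only the columns it needs.
# Data is hypothetical.

skus  = ["A1", "B2", "A1", "C3", "A1"]
units = [10,   5,    7,    3,    2]

def total_units(sku):
    """Aggregate by scanning only the two columns the query touches."""
    return sum(u for s, u in zip(skus, units) if s == sku)

print(total_units("A1"))  # -> 19
```

Real in-memory engines add compression, parallel scans and much more, but the absence of a disk round-trip per row is the core of the performance argument made here.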
Cloud-based, multi-tenant ERP systems built on in-memory technology liberate manufacturers from the constraints of traditional on-premise systems. Multi-tenancy allows the ERP vendor to efficiently and dynamically allocate CPU and memory, maintaining the performance levels necessary for a stellar user experience despite the increasing volume of data involved.
The net result is that in-memory, cloud-based systems can handle the volume of data generated by IoT, ERP and marketing in real time, enabling businesses to leverage the data in innovative ways. With real-time access to big data, companies will be better equipped to manage their supply chain, margins and marketing information, enabling them to respond to a dynamically changing marketplace and grow their business.