Cost and Pricing Models in Food and Beverage Processing: Seeing the Forest through the Trees

Lost sight of your bottom line? Product pricing not in step with your actual costs? So busy trying to stay on top of cost behavior that you can’t keep your costs under control? Cost and pricing models, as part of an enterprise resource planning (ERP) system, can help you better understand your costs. Put corrective measures into motion before problems occur—and save your food and beverage enterprise time and money.

Related Topics: Enterprise Resource Planning (ERP), Lean/Flow Manufacturing, Project Estimating, Decision Making, Total Cost of Ownership (TCO)

Related Industries: Food Manufacturing, Beverage and Tobacco Product Manufacturing

Related Keywords: Orion Enterprise, ABC, 3i Infotech Limited, manufacturing solution, enterprise resource planning, cost and pricing models, cost behavior, ERP, food and beverage processing, activity-based costing

ERP System Benefits on the Balance Sheet

Benefits from improved business processes and improved information provided by an ERP system can directly affect the balance sheet of a manufacturer. To illustrate this impact, a simplified balance sheet is shown in figure 3.1 for a typical manufacturer with annual revenue of $10 million. The biggest impacts will be on inventory and accounts receivable.

In the example, the company has $3 million in inventory and $2 million in outstanding accounts receivable. Based on prior research concerning industry averages for improvements, implementation of an ERP system can lead to a 20 percent inventory reduction and an 18 percent receivables reduction.

Figure 3.1 Summarized balance sheet for a typical $10 million firm

                                   Current     Typical        Benefit
                                               Improvement
Current assets
  Cash and other                   500,000
  Accounts receivable            2,000,000     18%            356,200
  Inventory                      3,000,000     20%            600,000
Fixed assets                     3,000,000
Total assets                    $8,500,000                   $956,200
Current liabilities                xxx,xxx
Noncurrent liabilities             xxx,xxx
Stockholder's equity               xxx,xxx
Total liabilities and equity       xxx,xxx


* Inventory Reduction. A 20 percent inventory reduction results in $600,000 less inventory. Improved purchasing practices (that result in reduced material costs) could lower this number even more.

* Accounts Receivable. Current accounts receivable represent seventy-three days of outstanding receivables. An 18 percent reduction (to sixty days' receivables) results in $356,200 of additional cash available for other uses.
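The receivables arithmetic above can be verified in a few lines. This is just a sketch of the example's math (annual revenue of $10 million, receivables of $2 million), not part of any ERP system:

```python
# Receivables figures from the balance sheet example above.
ANNUAL_REVENUE = 10_000_000
RECEIVABLES = 2_000_000

days_outstanding = RECEIVABLES / ANNUAL_REVENUE * 365  # ~73 days outstanding
target = ANNUAL_REVENUE / 365 * 60                     # receivables at 60 days
cash_freed = RECEIVABLES - target                      # ~$356,200 freed up

print(round(days_outstanding), round(cash_freed, -2))
```

Note that the $356,200 benefit in figure 3.1 corresponds to reaching the sixty-day target exactly, which is slightly less than a flat 18 percent cut of the $2 million balance.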

Justification of ERP Investments Part 1: Quantifiable Benefits from an ERP System

Studies that surveyed manufacturers about the impact of ERP systems on firm performance indicate that company size and industry do not affect the results. Benefits have been indicated for large and small firms, whether they make standard or custom products or are in discrete or process manufacturing environments. This section explains the quantifiable benefits in terms of several areas of improvement.


Typical Benefits

The most significant quantifiable benefits involve reductions in inventory, material costs, and labor and overhead costs, as well as improvements in customer service and sales.

Inventory reduction. Improved planning and scheduling practices typically lead to inventory reductions of 20 percent or better. This provides not only a one-time reduction in assets (and inventory typically constitutes a large proportion of assets) but also ongoing savings in inventory carrying costs. The cost of carrying inventory includes not only interest but also the costs of warehousing, handling, obsolescence, insurance, taxes, damage, and shrinkage. With interest rates of 10 percent, carrying costs can run 25 to 30 percent of inventory value per year.
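As a quick sketch of the savings math, using the rates in the paragraph above and the $3 million inventory figure from the earlier example:

```python
# Carrying-cost arithmetic using the figures in the text.
inventory = 3_000_000
reduction = 0.20        # 20% inventory reduction
carrying_rate = 0.25    # text cites 25-30%; low end used here

one_time_saving = inventory * reduction          # ~$600,000 one-time asset reduction
annual_saving = one_time_saving * carrying_rate  # ~$150,000/year in avoided carrying costs
```

The one-time reduction shows up on the balance sheet; the carrying-cost saving recurs every year the lower inventory level is sustained.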

ERP systems lead to lower inventories because manufacturers can make and buy only what is needed. Demands, rather than demand-insensitive order points, drive time-phased plans. Deliveries can be coordinated to actual need dates; orders for unneeded material can be postponed or canceled. Bills of material ensure that matched sets of components are obtained, rather than too much of one component and not enough of another. Planned changes to the bills also prevent buildup of obsolete inventory. With fewer part shortages and realistic schedules, manufacturing orders can be processed to completion faster and work-in-process inventories reduced. Implementation of just-in-time (JIT) philosophies can further reduce manufacturing lead times and the corresponding inventories.

Material cost reductions. Improved procurement practices lead to better vendor negotiations for prices, typically resulting in cost reductions of 5 percent or better. Valid schedules permit purchasing people to focus on vendor negotiations and quality improvement rather than on expediting shortages and getting material at premium prices. ERP systems provide negotiation information, such as projected material requirements by commodity group and vendor performance statistics. Giving suppliers better visibility of future requirements helps them achieve efficiencies that can be passed on as lower material costs.

Labor cost reductions. Improved manufacturing practices lead to fewer shortages and interruptions, and less rework and overtime. Typical labor savings from a successful ERP implementation are a 10 percent reduction in direct and indirect labor costs. By minimizing rush jobs and parts shortages, less time is needed for expediting, material handling, extra setups, disruptions, and tracking split lots or jobs that have been set aside. Production supervisors have better visibility of required work and can adjust capacity or loads to meet schedules. Supervisors have more time for managing, directing, and training people. Production personnel have more time to develop better methods and improve quality and throughput.

Improved customer service and sales. Improved coordination of sales and production leads to better customer service and increased sales. Improvements in managing customer contacts, in making and meeting delivery promises, and in shortening order-to-ship lead times lead to higher customer satisfaction and repeat orders. Salespeople can focus on selling instead of verifying or apologizing for late deliveries. In custom product environments, configurations can be quickly identified and priced, often by sales personnel or even the customer rather than by technical staff. Taken together, these improvements can lead to fewer lost sales and actual increases in sales, typically 10 percent or more.

ERP systems also provide the ability to react to changes in demand and diagnose delivery problems. Corrective actions can be taken early, such as determining shipment priorities, notifying customers of changes to promised delivery dates, or altering production schedules to satisfy demand.

Improved accounting controls. Improved collection procedures can reduce the number of days of outstanding receivables, thereby providing additional available cash. Underlying these improvements are fast, accurate invoice creation directly from shipment transactions, timely customer statements, and follow-through on delinquent accounts. Credit checking during order entry and improved handling of customer inquiries further reduce the number of problem accounts. Improved credit management and receivables practices typically reduce days of outstanding receivables by 18 percent or better.

Trade credit can also be maximized through cash planning, taking advantage of supplier discounts, and paying only those invoices with matching receipts. This can lower the requirement for cash on hand.
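The "pay only invoices with matching receipts" control amounts to a simple match check. A minimal sketch, with purchase order identifiers and field names invented for illustration:

```python
# Toy two-way match: release an invoice for payment only when a goods
# receipt exists for the same purchase order and covers the billed quantity.
receipts = {("PO-100", "widget"): 50}  # (PO number, item) -> received qty

def payable(invoice):
    key = (invoice["po"], invoice["item"])
    return receipts.get(key, 0) >= invoice["qty"]

print(payable({"po": "PO-100", "item": "widget", "qty": 50}))  # True
print(payable({"po": "PO-101", "item": "gizmo", "qty": 10}))   # False: no receipt
```

Real accounts payable modules typically extend this to a three-way match (purchase order, receipt, and invoice), but the principle is the same.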

How to Define Your Business and Technical Requirements

Typical enterprise application selections begin with little mention of technology, since the first consideration is modeling the desired business processes that the new technology will enable, and then matching them to the functional requirements within any given software solution. TEC uses a standardized methodology to model and match these processes. The following steps are critical to ensuring overall success within this phase.

Step 1: Form a Cross-functional Project Team

A cross-functional team ensures that both the business and technical needs of your organization are addressed, and that each group affected by the changes understands the impact of the decision. The ideal team consists of members of the following groups: management; finance or business operations; users; consultants; and members of the IT operations and infrastructure groups.

Champions and subject matter experts (SMEs) should be chosen from each business area to work with the project team. This will ensure complete buy-in from the business side and help promote the new solution within the rest of the organization, as well as provide expert knowledge within the project team on existing processes and day-to-day operations.

Step 2: Model Business Processes Hierarchy through an Internal Needs Assessment

The project team, with the help of the champions and the SMEs, is responsible for defining and modeling business processes. The first goal is to determine the main process groups, which correspond to the individual business areas of the organization.

Within these groups, processes correspond to the high-level divisions of your business areas (see figures 1 and 2 below). Within these processes, subprocesses detail the main departments of the high-level divisions, and activities cover the day-to-day tasks within each department. For each activity, there may be business-based rules describing how these tasks are to be performed and controlled.
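The hierarchy described above can be pictured as a nested structure. All names below are invented examples, not TEC's actual taxonomy:

```python
# Process group -> process -> subprocess -> activity -> business rules (illustrative).
hierarchy = {
    "Finance": {                                    # process group = business area
        "Accounts Payable": {                       # process = high-level division
            "Invoice Processing": {                 # subprocess = department
                "Match invoice to purchase order":  # activity = day-to-day task
                    ["Require manager approval above $5,000"],  # business rule
            },
        },
    },
}

# Walking the tree yields the flattened activity list a selection tool would use.
def activities(tree, path=()):
    for name, sub in tree.items():
        if isinstance(sub, dict):
            yield from activities(sub, path + (name,))
        else:
            yield path + (name,), sub

print(len(list(activities(hierarchy))))  # 1 activity, with its rules
```

Even a small organization can have hundreds of such leaf-level activities, which is why the text warns against managing them in spreadsheets.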

This large volume of data is difficult to track, organize, and manipulate using traditional methods, such as spreadsheets, Word documents, and flowcharts. But if this critical information is not properly stored, organized, and made easily accessible, it can cause significant delays, which in turn can substantially increase the cost of the software selection project.


Figure 1: Process group chart


Figure 2: Drilling down from business areas to business rules

Your Guide to Enterprise Software Selection: Part One

IT acquisition and purchasing decisions are often conducted in an atmosphere of unmet expectations, internal political agendas, vendor promises, and brand name hype. Decisions are driven by executive mandate, rule-of-thumb, or insufficient analyses based on rudimentary spreadsheet comparisons.

This is a sure recipe for failure, as demonstrated by the horror stories published continually in trade magazines and the press. We'll describe a best-practice approach to the assessment, evaluation, and selection of software—and show you how you can reduce the time and cost involved in objectively choosing the right solution.

There are three main phases within Technology Evaluation Centers' (TEC's) software assessment, evaluation, and selection methodology:

Phase 1: Defining Business and Technical Requirements
Phase 2: Software Evaluation and Analysis
Phase 3: Negotiation and Final Selection

Overview

Phase 1
TEC's methodology establishes the foundation for the ultimate success of the selection project. Successful evaluation and analysis of a system—and negotiation with a vendor—are irrelevant if the initial definition of business and technical requirements is incomplete or inaccurate. Many software selection projects do not place enough emphasis on this phase, which causes many failures and can even result in disaster for companies during and after implementation.

TEC's decision support system facilitates fast and accurate compilation of business processes, and maps them to the features and functions of a software solution. By closely following the steps outlined within this phase, an organization can produce a complete and understandable specification of all the needs that are to be addressed by the new solution, and is able to keep the assembled data in one easily accessible repository.

Phase 2
The evaluation and analysis of vendor solutions should proceed from finding the right vendors through to selecting a shortlist of two or three finalists. The sheer mass of data collected during this phase can be overwhelming for any organization, and the manipulation of the data even more daunting.

There may be as many as 20 or 30 qualified vendors, and each may have a list of thousands of criteria, all of which have to be evaluated against one another. Using traditional methods can lead to serious errors—and may lead to choosing the wrong vendor solution. We'll show you how TEC's decision support system streamlines this process and significantly reduces the time required to reach a more informed and accurate shortlist of vendors.
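The kind of criteria comparison a decision support system automates can be sketched as weighted scoring. The vendors, criteria, weights, and scores below are entirely hypothetical:

```python
# Weighted-scoring sketch: rank vendors by how well their criteria scores
# match the organization's priorities (all figures invented for illustration).
criteria_weights = {"inventory_mgmt": 0.5, "lot_traceability": 0.3, "reporting": 0.2}
vendor_scores = {
    "Vendor A": {"inventory_mgmt": 8, "lot_traceability": 6, "reporting": 9},
    "Vendor B": {"inventory_mgmt": 7, "lot_traceability": 9, "reporting": 8},
}

def weighted_score(scores):
    # Sum of (priority weight x criterion score) across all criteria.
    return sum(criteria_weights[c] * s for c, s in scores.items())

shortlist = sorted(vendor_scores, key=lambda v: weighted_score(vendor_scores[v]),
                   reverse=True)
print(shortlist)  # Vendor B edges ahead because traceability is weighted heavily
```

With thousands of criteria instead of three, hand-maintained spreadsheets of this calculation become error-prone, which is the problem the text describes.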

Phase 3
The final phase covers the steps within the negotiation and the final selection process with the short-listed vendors. This includes live vendor demonstrations at the client site, where each solution can be rated by the business and selection team to verify ease-of-use, coverage of critical business processes, and functionality.

During this phase, we suggest that your selection team seek out client references from each vendor to verify their implementation, service, support, and training experiences. We'll explain how TEC's decision support system facilitates and shortens this process by loading vendor information into TEC's comparison tool to produce reports and graphs, which will support your selection team's final recommendations.

Comparing Business Intelligence and Traditional ETL

After evaluating the core components of data integration, the organization should investigate its traditional BI needs and assess how they will evolve or change.

Until recently, ETL involved uploading data at regular (i.e., monthly or weekly) intervals to drive business performance decisions and identify business opportunities. However, as BI tools become more integrated with overall business functions, including business performance management (BPM) and reporting and analysis requirements, data needs have shifted from monthly or weekly intervals to real-time updates. This means that it has become more important for data transfers to accurately reflect real-time business transactions, and that the number of data transfers required has increased.

Nonetheless, real-time ETL doesn't necessarily mean automatic data transfer as operational databases are updated. In terms of BI, real time may mean different things to different organizations, or even to different departments within the same organization. Take, for instance, an automotive manufacturer whose traditional data warehouse solutions (OLAP cubes, etc.) involved capturing data at a given point in time. The manufacturer might, for example, have wanted to track and compare monthly sales with last year's sales during the same month by region, car model, and dealer size, requiring the data warehouse to be updated on a monthly basis. However, as the manufacturer's business decisions evolved based on this analysis, its data needs shifted from a monthly requirement to a weekly one, and then to an ever more frequent basis, eventually creating demand for real-time data. In this case, real-time data may be useful for identifying the movement of car parts within a warehouse relative to their storage locations and comparing this information with the demand for those parts.

Such a shift in data requirements affects both the volume of data required and when data loading occurs. The end result is that, in order to meet the changing needs of user organizations, ETL and BI vendors have concentrated on moving toward real-time ETL and adapting their data loading functionality to accommodate higher volumes of data transfer.

Comparing Business Intelligence and Data Integration Best-of-breed Vendors' Extract Transform and Load Solutions

To understand the relevance of extract transform and load (ETL) components and how they fit into business intelligence (BI), one should first appreciate what data integration is and the significance of having clean, accurate data that enable successful business decisions. Within the BI industry, data integration is essential. By capturing the right information, organizations are able to perform analyses, create reports, and develop strategies that help them to not only survive, but, more importantly, to thrive.

Informatica, a leading provider of enterprise data integration software, defines data integration as "the process of combining two or more data sets together for sharing and analysis, in order to support information management inside a business". In BI terms, this means that data is extracted in its original form and stored in an interim location, where it is transformed into the format that will be used in the data warehouse. The transformation process includes validating data (e.g., filling in null zip code information in the customer database) and reformatting data fields (e.g., separating Last Name and First Name fields of customer records that are merged in one database but not others). The next step is to load the data into the data warehouse. The data is then used to create queries and analytical structures, such as online analytical processing (OLAP) cubes and scorecard analyses. In a sense, extracting the proper data, transforming it by cleansing and merging records, and loading it into the target database is what allows BI solutions to build analytical tools successfully. It is also the essence of ETL functionality.
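A toy version of the transform step described above, covering the two examples given (filling null zip codes and splitting a merged name field). Field names and the placeholder value are assumptions for illustration:

```python
# Minimal transform: validate a missing zip and split "Last, First" into
# separate fields, as in the examples from the text.
def transform(record):
    rec = dict(record)                  # work on a copy of the source row
    if not rec.get("zip"):
        rec["zip"] = "00000"            # placeholder for a real validation rule
    if "name" in rec and "," in rec["name"]:
        last, first = [p.strip() for p in rec.pop("name").split(",", 1)]
        rec["last_name"], rec["first_name"] = last, first
    return rec

raw = {"name": "Doe, Jane", "zip": None}
print(transform(raw))  # {'zip': '00000', 'last_name': 'Doe', 'first_name': 'Jane'}
```

Production ETL tools apply rules like these declaratively across millions of rows, but each mapping reduces to this kind of per-record rewrite.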

Data Integration Components

In order to determine the most suitable ETL solution for them, organizations should evaluate their needs in terms of the core components of the data integration process, as listed below.

* Data Identification. What data does the organization need to extract and where does it come from? What end result, in terms of the data, does the organization want to analyze? Essentially, answering these questions means identifying the origin of the data, and what the relationship is between the different data sources.

* Data Extraction. How frequently does the organization require the data? Is it monthly, weekly, daily, or hourly? Where should data storing and transformation activities occur (i.e., on a dedicated server or in the data warehouse, etc.)? Considering these factors identifies the data frequency needs of the organization. For example, analysis of sales data may require the organization to load data monthly or quarterly, whereas some other data transfers may be performed multiple times a day. In determining the frequency of the data loading and transformation in the data warehouse or on the dedicated server, the organization should also consider the amount of data to be transferred and its effect on product performance.

* Data Standardization. What is the format of the organization's data, and is it currently compatible with the same data elements in other systems? For example, if the organization wants to analyze customer information and to merge customer buying patterns with customer service data, it must know if the customer is identified in the same way in both places (e.g., by customer identification [ID], phone number, or first and last name). This is crucial for ensuring that the correct data is merged and that the data is attached to the right customer throughout the data standardization process. Another data standardization issue the organization should deal with is identifying how it will manage data cleansing and data integrity functions within the data warehouse over time.

* Data Transformation. The organization should consider data transformation requirements and the interaction between the transformed data components. The critical questions are how the data will be reflected in the new database, and how it will be merged on a row-by-row basis. Answering these questions involves identifying the business and data rules associated with the data to ensure accuracy in data loads.

* Data Loading. Where will the data be loaded? What data monitoring activities are required? Other data loading concerns are identifying failed data transfers, how failed transfers are handled, and how updates occur. For example, will each load involve reloading the whole dataset, or will updates be made using only the changed fields within the data sources?
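The standardization and loading components above can be pictured together in a small end-to-end pass: two sources keyed by a shared customer ID are merged, and only changed rows are written to the target. All names, fields, and data here are invented for illustration:

```python
# Toy standardize-and-load: merge two sources on a common customer ID,
# then upsert only changed rows instead of reloading the whole dataset.
purchases = {"C001": ["widget", "gadget"], "C002": ["gizmo"]}
service = {"C001": 2, "C002": 1}  # customer ID -> service call count

# Standardization: both sources are keyed by the same customer ID,
# so merging attaches the right data to the right customer.
merged = {cid: {"purchases": purchases.get(cid, []), "service_calls": n}
          for cid, n in service.items()}

# Incremental load: update existing rows only when they changed, insert new ones.
warehouse = {"C001": {"purchases": ["widget"], "service_calls": 1}}  # stale row
for cid, row in merged.items():
    if warehouse.get(cid) != row:
        warehouse[cid] = row
```

If the sources had identified customers differently (say, by phone number in one and by name in the other), this merge would silently attach data to the wrong customer, which is the standardization risk the text highlights.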