
Steve Beeler

You have a goal…I have a way to get you there.


Operations Engineering

Lake Michigan Water Levels

October 8, 2022 by stevebeeler

Lake Michigan shore line

Record high Lake Michigan water levels and dune erosion have been big concerns along the “north coast” in recent years.  As beaches disappeared under the rising water, homes near the shoreline were threatened.

Barge with crane on Lake Michigan

Property owners resorted to armoring the coastline with huge boulders to dissipate wave energy.  In many cases, cranes on barges were required to place the boulders.

I finally got around to doing a statistical analysis on Lake Michigan water levels using data from the United States Army Corps of Engineers.  For years, control charts have been in my operations engineering toolkit.  Why not apply statistical process control to a natural phenomenon?

An X-bar & R chart is commonly used to measure the magnitude of common cause (random) variation and to assess statistical stability (constant probabilities).  For a sample plan, I used one data point per year in subgroups of five.  That is, each data point on the X-bar chart is an average of five years and each data point on the range chart is the difference between the highest and lowest value in that five-year period.  With 103 years of data (1918 to 2020), this sample plan provided twenty data points, the minimum required to calculate control limits.
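For anyone who wants to reproduce the calculation, here is a minimal sketch of the X-bar & R arithmetic in Python.  The file name and its layout (one annual mean level per line) are assumptions for illustration; the A2, D3, and D4 values are the standard SPC constants for subgroups of five.

```python
import numpy as np

# Assumed input: one annual mean water level per year, 1918 onward,
# taken from the Army Corps of Engineers data set.
levels = np.loadtxt("lake_michigan_annual_levels.txt")

n = 5                               # subgroup size: five consecutive years
A2, D3, D4 = 0.577, 0.0, 2.114      # standard SPC constants for n = 5

# Form complete subgroups of five years; leftover years are dropped
k = len(levels) // n
subgroups = levels[:k * n].reshape(k, n)

xbar = subgroups.mean(axis=1)                       # X-bar chart points
r = subgroups.max(axis=1) - subgroups.min(axis=1)   # R chart points

xbar_bar, r_bar = xbar.mean(), r.mean()

# Control limits from the usual X-bar & R formulas
xbar_ucl, xbar_lcl = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
r_ucl, r_lcl = D4 * r_bar, D3 * r_bar

print(f"X-bar chart: CL={xbar_bar:.2f}  UCL={xbar_ucl:.2f}  LCL={xbar_lcl:.2f}")
print(f"R chart:     CL={r_bar:.2f}  UCL={r_ucl:.2f}  LCL={r_lcl:.2f}")
out = ((xbar > xbar_ucl) | (xbar < xbar_lcl)).sum()
print(f"X-bar points outside the control limits: {int(out)}")
```

With 103 annual values, the integer division yields the twenty complete subgroups described above; the three leftover years are simply not used.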

Control chart of Lake Michigan water levels

The range chart (bottom) is in control.  All points are within the control limits and there are no trends.  This means the magnitude of year-to-year water level variation has been constant over the last 100+ years.

On the other hand, the X-bar chart (top) does not exhibit statistical control.  While it is mean centered with no trends, there are multiple points (4 above and 6 below) outside the control limits.  This suggests that the five-year sample plan underestimated the common cause (random) variation: the within-subgroup ranges did not entirely capture the longer-period variation in Lake Michigan water levels.

In retrospect, this is not surprising.  While 103 years seems like such a long time, it is not even an instant from a geologist’s perspective.  A sample plan with a longer duration appears to be called for but then there would not be enough subgroups to calculate control limits.

Given that the range chart is stable, I suspect that the X-bar chart (i.e., average water levels) would be too if we had data sampled over a sufficiently long period of time.

Others are interested in the Great Lakes.  Enough, apparently, to warrant a scientific publication, the Journal of Great Lakes Research.  In it, I found an abstract that supports my hypothesis of stable Lake Michigan water levels: “Historical Variation of Water Levels in Lakes Erie and Michigan-Huron” by Craig T. Bishop.

From this abstract I learned that the earliest “reliable” Great Lakes water level data were recorded in 1819.  From historical and archeological evidence, Bishop concludes that “… over at least the last 1,800 years, climate-related variations in maximum mean annual water levels have probably not exceeded those measured on Lakes Erie and Michigan-Huron since A.D. 1819.”

That sounds like Lake Michigan water levels have been stable for a long time.  In 2022, the water is down from record levels and beaches are reappearing.  The spectacular sunsets never left.

Lake Michigan sunset

If you would like to perform your own statistical analysis on Great Lakes water levels, click HERE for a link to the United States Army Corps of Engineers data set.

Filed Under: Operations Engineering Tagged With: Control Charts, dune erosion, Lake Michigan, Statistical Process Control, Statistical Sampling, X-bar & R Chart

A Deeper Labor Pool

September 21, 2022 by stevebeeler

A Deep Labor Pool

In today’s post-COVID economy, labor is in short supply. Here’s an old-school manufacturing best practice that can be used to develop a deeper labor pool.

Way back in the day, I had a production management position at Louisville Assembly Plant during the initial Explorer launch. The launch was hugely successful. Production quickly ramped up to 87 trucks an hour (that’s one off the line every 41 seconds!) and build quality was excellent…off-line repair bays were mostly empty.

A big key to that success was labor versatility. Having trained operators on every job every day didn’t just happen; training to develop a deeper labor pool was a priority.

A simple tool was used to manage training: the versatility chart. Each production supervisor had one for his or her zone.  Down the left side of the chart were all the employees and across the top were all the jobs. If an employee was trained on a job, a “1” was entered in that cell. Across the bottom of the chart was the total number of employees trained on each job.

Conceptual example of a versatility chart

Ideally, each job was three deep. That is, there were three trained employees for every position on the line.  Versatility gaps became cross-training priorities.
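Here is a minimal sketch of a versatility chart in Python.  The employee names, job names, and training entries are hypothetical; in practice the same structure lives just as happily in an Excel workbook or on a wall chart.

```python
import pandas as pd

# Rows are employees, columns are jobs; a 1 means "trained on this job"
trained = pd.DataFrame(
    {
        "Door install": [1, 1, 0, 1],
        "Glass set":    [1, 0, 1, 0],
        "Seat install": [0, 1, 0, 0],
    },
    index=["Alvarez", "Brown", "Chen", "Davis"],   # hypothetical employees
)

depth = trained.sum()        # "across the bottom": trained employees per job
gaps = depth[depth < 3]      # jobs that are not yet three deep

print("Depth by job:")
print(depth)
print()
print("Cross-training priorities (fewer than three trained):")
print(gaps)
```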

The more jobs your people know, the deeper your labor pool. This simple concept applies outside of manufacturing.

In baseball, rosters are limited by rule. Managers need flexibility to give stars a day off and to make strategic moves in the late innings of close games. Players who can play multiple positions are essential for a deep bench.

In business, people are limited by budgets and, more recently, by labor constraints. Managers need flexibility for any number of reasons: vacations, illnesses, seasonality, etc. Cross-training develops flexibility without adding staff.

So whatever your situation, consider cross-training to develop a deeper labor pool.

Thirty years later, I still use best practices from the Explorer launch in my day job as a Professional Engineer.  Click HERE to visit my Operations Engineering page.

 

Filed Under: Operations Engineering Tagged With: Constraint, Manufacturing, operational excellence, Theory of Constraints

Automation with Labor Constraints

April 8, 2022 by stevebeeler

Automation with Labor Constraints

The financial justification for automation with labor constraints now has an additional component: contribution margin.  Historically, automation investments have been primarily justified by reducing people.  Retiring baby boomers and COVID-19 have created labor shortages.  Here’s the new math:

Automation is a broad term for any technology that reduces human input.  While the terminology may be relatively new (believed to date from the 1940s auto industry), the concept is hundreds of years old.  Water-powered spinning mills from the late 1700s are early examples.  In today’s world, automation is everywhere, from the robots that weld cars to the ERP systems that manage supply chains.

Labor savings have been the primary driver for automation.  Other benefits can also be considered: safety, quality, scrap, energy, and more recently, flexibility.  Return on investment (ROI) is determined by dividing total savings by the cost of the project.  If the ROI is sufficient, the automation investment can move forward.

But what if you can’t find enough people?

In a client’s foundry, I noticed robots are now de-flashing large castings.  This is hot, dirty, nasty work that few want any part of.  There was significant turnover and the operation was always undermanned.  While labor savings alone could not justify the automation, lost contribution margin from production shortfalls more than made up the difference.

This suggests a new ROI math for automation with labor constraints.

Add lost contribution margin to the numerator in the ROI calculation.  How much money are you not making because you can’t find enough people?  If labor is limiting output, then lost contribution margin is a quantifiable benefit of an automation investment.
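Here is a hedged illustration of the arithmetic in Python.  The dollar figures are invented for the example and are not taken from the foundry project.

```python
def automation_roi(labor_savings, other_savings, lost_contribution_margin, project_cost):
    """Annual ROI: total benefits divided by the cost of the project."""
    return (labor_savings + other_savings + lost_contribution_margin) / project_cost

# Labor and other savings alone make a weak case...
print(automation_roi(120_000, 30_000, 0, 900_000))         # ~0.17

# ...but adding the contribution margin lost to chronic shortfalls changes the math.
print(automation_roi(120_000, 30_000, 400_000, 900_000))   # ~0.61
```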

This new ROI math is not just for manufacturing.  It also applies to business processes like the sales funnel.  That CRM module may not save many heads today, but will it allow the company to grow without adding hard-to-find sales professionals tomorrow?

A word of caution.  The new automation ROI math is somewhat subjective and, therefore, the possibility of mischief exists.  Labor must truly be a long-term constraint limiting output.  Other options to attract and retain employees (wages, benefits, working conditions) must be considered.

An understanding of constraints is essential.  Goldratt’s The Goal is the definitive primer on systemic thinking and Theory of Constraints.  This new math is a logical extension of the Goldratt 5-Step throughput improvement model when the availability of labor is the constraint.

Consider hard automation as a first step.  Simple examples of hard automation are all around us.  In manufacturing, these include tables with multiple drilling fixtures and conveyors or slides between operations.  That Excel macro generating the monthly sales report is another example of hard automation.
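In that same spirit, here is a hypothetical Python sketch of report automation.  The file name and column names (order_date, region, amount) are assumptions for illustration only.

```python
import pandas as pd

# Assumed columns: order_date, region, amount
sales = pd.read_csv("sales_orders.csv", parse_dates=["order_date"])

# Roll orders up into a month-by-region summary, like the monthly sales report above
monthly = (
    sales
    .assign(month=sales["order_date"].dt.to_period("M"))
    .groupby(["month", "region"])["amount"]
    .sum()
    .unstack(fill_value=0)
)

monthly.to_csv("monthly_sales_report.csv")
print(monthly.tail())
```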

So that’s the new math.  Adding lost contribution margin to your ROI calculation is the key to finding the best automation projects and growing your business in a labor constrained world.

 

Filed Under: Operations Engineering Tagged With: automation, labor constraints, operational excellence, Theory of Constraints

A Plan for Every Part

March 8, 2021 by stevebeeler

A manufacturing marketplace organized through A Plan for Every Part

A Plan for Every Part drives waste out of inventory and warehousing operations.  It is the foundation for the continuous improvement of your procurement and material handling activities.  Here’s how to get started:

A Plan for Every Part is exactly as named: a compilation of facts and figures about all of your part numbers.  While there is specialized software for this purpose, an Excel spreadsheet works fine, too, in many situations.

Typical dimensions include:

  • Part number
  • Part description
  • Supplier
  • Annual usage
  • Container type
  • Container size (length x width x height)
  • Part weight
  • Container capacity
  • Storage method
  • Location
  • Transport method

Compiling all of this data is messy and people-intensive.  Designing a data collection template for each part will increase accuracy, standardize units of measure, and generally speed things along.  A change process will be needed to maintain the integrity of the data.

As this database takes shape, opportunities to reduce complexity (and subsequent waste) in containers, racks, and material handling equipment will appear.  There are great benefits in standardization!  Defined locations improve inventory control and reduce, if not eliminate, the time wasted looking for parts.  An overall reduction in inventory can also be expected through less overproduction and increased inventory turns.

Set up length, width, and height as separate fields so that they can be sorted separately.  Ask me how I know this.  🙂
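Here is a minimal sketch of such a spreadsheet in Python, following the dimensions listed above.  The two parts shown are hypothetical, and length, width, and height are kept as separate, sortable fields.

```python
import pandas as pd

# Hypothetical rows; one record per part number
pfep = pd.DataFrame([
    {"part_number": "BR-1001", "description": "Bracket, front", "supplier": "Acme",
     "annual_usage": 48_000, "container_type": "tote",
     "length_mm": 600, "width_mm": 400, "height_mm": 300,
     "part_weight_kg": 0.8, "container_capacity": 50,
     "storage_method": "flow rack", "location": "MKT-A-07", "transport_method": "tugger"},
    {"part_number": "BT-2040", "description": "Bolt, M8 x 30", "supplier": "FastCo",
     "annual_usage": 950_000, "container_type": "carton",
     "length_mm": 300, "width_mm": 200, "height_mm": 150,
     "part_weight_kg": 0.02, "container_capacity": 1_000,
     "storage_method": "shelf", "location": "MKT-B-12", "transport_method": "cart"},
])

# Separate dimension fields make questions like "what needs the tallest openings?" a one-liner
tallest = pfep.sort_values("height_mm", ascending=False)
print(tallest[["part_number", "height_mm", "storage_method"]])
```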

There may be a temptation to limit A Plan for Every Part to the highest-usage or most expensive parts.  Don’t go there.  Any part, even a small bolt, can halt production if it is missing when needed.

Thinking about warehouse automation?  A Plan for Every Part is a necessary prerequisite.

Market Place Design Checklist incorporating A Plan For Every Part

On my capacity expansion project, we are combining A Plan for Every Part with this material handling checklist to design and size the new plant’s marketplaces.  Not only will the marketplaces be better both operationally and financially today, but we are building a bridge to automation opportunities tomorrow.

 

Filed Under: Operations Engineering Tagged With: A Plan For Every Part, Continuous Improvement, Lean Thinking, operational excellence

Supply Chain Risk

May 4, 2020 by stevebeeler

Supply Chain Risk

Supply chain risk is readily evident with COVID-19 related plant shutdowns across the country and across the globe. Outsourcing and off-shoring have increased the length of supply chains. Lean Manufacturing has reduced inventories. Just-In-Time / Just-In-Sequence deliveries leave little time for the unexpected.  Here is a simple method to assess supply chain risk.

The method is based on the work of Professor David Simchi-Levi of MIT’s Sloan School of Management. While his risk exposure model was developed in the context of global manufacturers with complex networks of suppliers, the concepts are applicable to domestic manufacturers with a single tier of suppliers.

The supply chain risk analysis starts with three basic questions:

(1) How many days will it take a supplier to re-fill the supply chain after a disruption? In the Simchi-Levi model, this factor is Time to Recover (TTR). TTR is driven by the length of the supply chain and by the availability of alternative sources.

(2) How many days can production continue without deliveries from that supplier? This is Time to Survive (TTS). TTS is strongly dependent on inventory.

(3) What are the operational and financial costs per day while waiting for deliveries from that supplier? This is Performance Impact (PI).

The key to the model is the difference between Time to Recover and Time to Survive. If TTR is less than TTS, the supply chain will be re-filled without losing production. If TTR is greater than TTS, then the supply chain will run dry and production will be lost. If TTR and TTS are approximately equal, a supplier disruption can be managed through expediting.
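Here is a minimal sketch of that comparison in Python.  The supplier figures are invented for illustration and are not taken from the chart below; the two-day window for "approximately equal" is also an assumption.

```python
# Hypothetical supplier data: Time to Recover and Time to Survive in days,
# Performance Impact in lost dollars per day, and annual spend
suppliers = {
    "A": {"ttr": 10, "tts": 30, "pi": 80_000, "spend": 5_000_000},
    "D": {"ttr": 25, "tts":  5, "pi": 40_000, "spend":   400_000},
    "E": {"ttr": 16, "tts": 15, "pi": 60_000, "spend": 1_200_000},
}

for name, s in suppliers.items():
    gap = s["ttr"] - s["tts"]            # days of lost production if positive
    if gap <= 0:
        risk = "no production loss expected"
    elif gap <= 2:                       # assumed threshold for "approximately equal"
        risk = "manageable through expediting"
    else:
        risk = f"production loss risk, roughly ${gap * s['pi']:,} of performance impact"
    print(f"Supplier {name}: TTR={s['ttr']}d, TTS={s['tts']}d -> {risk}")
```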

This bubble chart is a visualization of supply chain risk, with the size of the bubble proportional to supplier spend.

Supply Chain Risk Bubble Chart

There is no risk of a production loss due to a disruption at Supplier C. Even with the largest spend, Supplier A is not at risk for a production loss. Through expediting, the risk of a disruption at Supplier E can be mitigated. Supplier B is low risk but with the greatest financial pain. The greatest risk for a production loss is Supplier D, even with the lowest spend.

No one knows how, or for how long, COVID-19 will disrupt global and domestic supply chains. Regional peaks and multiple waves could lead to repeated closures and re-openings at key suppliers. This supply chain risk analysis can, at least qualitatively, identify your highest-risk suppliers. Actions taken now can reduce risk and minimize the effects on operational and financial performance.

Professor Simchi-Levi has written extensively on supply chains and operations. For more on these topics, his most recent book “Operations Rules: Delivering Customer Value through Flexible Operations” is available through Amazon.

To adapt your business to today’s new operational challenges, click HERE for a post COVID-19 toolkit rolled up from six of my blog posts.

Filed Under: Operations Engineering Tagged With: Just-In-Sequence, Just-In-Time, Lean Thinking, operational excellence, supply chain
