What Came First, The New Application or The Follow-Up?

As an insurance company leader, do you ever
wonder if your new business department spends a disproportionate amount of time
doing follow-ups on existing customer applications rather than focusing on processing
new ones? 

This imbalance is a common pain point Trindent Consulting has observed in its engagements with large insurance firms.  Client data points to a clear-cut area of opportunity: new business departments are spending excessive amounts of time handling follow-ups – considered to be non-value work – instead of processing new applications.

Using the second Lean principle (Mapping the Value Stream), an insurance company leader should be asking three questions to determine the optimal amount of time to spend on a non-value activity like follow-ups:

  • “Would the customer pay for
    this activity?”
  • “Would more of this activity be
    good for the firm?”
  • “Does this activity help move a
    process along?”

But because the answers are not black and
white, insurance companies often allow themselves to be lenient towards non-value
activities they see as necessary, such as follow-ups. 

What’s the Issue with Follow-Ups?

In a perfect consulting world, follow-ups wouldn’t exist because they are essentially re-work, the embodiment of inefficiency.  Every new application would be submitted only after it was completed correctly and in its entirety, and could be reviewed, underwritten and sent back to the right agent with a rating and price in a single cycle.

In reality however, follow-ups are an unavoidable part of the insurance application process and will never be eliminated because human error on the part of applicants cannot be controlled.  Follow-ups occur when additional information, or corrections to existing information, is needed to complete the application process.  In the insurance industry, it’s an acceptable practice to go through one or two, or sometimes even three, follow-ups for a single application.  However, when the follow-up count reaches four, it becomes a problem.  Repeated follow-ups cause applications to cycle through the system again and again, clogging it up and creating an imbalance between the volume of follow-ups and new application processing.

What Can You Do?

To solve this imbalance, it’s important to set proper parameters to reduce the rate at which follow-ups occur.  This requires a deep dive into call center data to understand why follow-ups occur, and to determine what can be done to reduce them.

Through questions like these, call centers can begin to understand where to focus their process improvement efforts:

  • Which form or section of the
    form causes the most problems?  Is the
    problematic form or section of the form confusing?  Can it be easily changed?
  • Is there a specific agent who
    is a repeat offender?  Can guidance or
    incentives be provided to change their behaviour?
  • Are we internally too stringent
    on some of the forms?   Are we letting
    the perfect get in the way of the good?
  • Are staff adequately trained to
    know what really requires a follow-up and what doesn’t?

The answers to these questions can become a
guide for call center leaders to restore balance to value-add versus non
value-add work, and to make their teams more productive. 
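As a rough illustration of the kind of call center data deep dive described above, the sketch below (Python with pandas; the file name and column names are hypothetical, not a specific client system) tallies follow-ups by form section and by agent, and flags applications that have reached the four-follow-up problem threshold:

    import pandas as pd

    # Hypothetical extract of follow-up records; the file name and columns
    # (application_id, agent_id, form_section) are illustrative only.
    followups = pd.read_csv("followups.csv")

    # Which form or section of the form causes the most problems?
    by_section = followups.groupby("form_section").size().sort_values(ascending=False)

    # Is there a specific agent who is a repeat offender?
    by_agent = followups.groupby("agent_id").size().sort_values(ascending=False)

    # Applications that have cycled four or more times are the clearest problem cases.
    cycles = followups.groupby("application_id").size()
    problem_apps = cycles[cycles >= 4]

    print(by_section.head(10))
    print(by_agent.head(10))
    print(f"{len(problem_apps)} applications required 4 or more follow-ups")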

Follow-ups are only one of many inefficiencies that insurance companies face.  Trindent’s methodology for call center improvements is a tested and proven systematic approach to analyzing processes, systems and behaviors and implementing solutions that drive sustainable results.

Click here to learn more about how Trindent can develop a solution to drive your call centers’ efficiency.


Hydrocarbon Loss: Identifying Opportunities

In
refineries, an Operating Expense hides in plain sight – Hydrocarbon Loss.

Hydrocarbon
Loss occurs daily in refinery operations because improper systems, processes,
and behaviours are in place.  This is a preventable
loss that often costs individual refineries millions of dollars per year.  Best-in-class targets are under 0.25% loss,
while under 0.5% loss is considered an achievable target for the average
refinery. If your refinery is performing worse than these targets, or is operating in the dark altogether, don’t worry: a solution exists – a Hydrocarbon Loss Control Program.

A Hydrocarbon Loss Control Program is a set of measures that can be implemented to systematically reduce refinery Hydrocarbon Loss.  An effective Loss Control Program will map all refinery fence line inputs and outputs to identify known losses and potential causes of unknown losses. Using a structured approach, the Loss Control Program can then become more granular and identify process unit inputs, outputs, and sources of loss.
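As a simplified illustration of the fence line accounting such a program is built on, the sketch below (Python, with made-up receipt and shipment figures) computes an unaccounted loss percentage and compares it against the targets cited above. A real mass balance would also reconcile internal fuel use, inventory changes, and metering corrections.

    # Fence line mass-balance sketch with made-up figures (barrels per month).
    receipts_bbl = {"crude_pipeline": 1_000_000, "butane_rail": 20_000}
    shipments_bbl = {"gasoline": 520_000, "diesel": 310_000, "jet": 120_000, "other": 64_000}

    total_in = sum(receipts_bbl.values())
    total_out = sum(shipments_bbl.values())

    unaccounted_loss_pct = (total_in - total_out) / total_in * 100
    print(f"Unaccounted loss: {unaccounted_loss_pct:.2f}% of receipts")

    # Compare against the targets cited above.
    if unaccounted_loss_pct <= 0.25:
        print("At or better than the best-in-class target (0.25%)")
    elif unaccounted_loss_pct <= 0.5:
        print("Within the achievable target for an average refinery (0.5%)")
    else:
        print("Above target: a Loss Control Program opportunity")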

To
have a successful Loss Control Program that can identify opportunities, it is
important to understand the root causes of Hydrocarbon Loss.  It often occurs at custody transfer points -
the measurement points, where possession (custody) of hydrocarbons changes
hands. Typically, these are interface points between the refinery and
pipelines, rail cars, trucks, and vessels. At these custody transfer points
there are three common causes of loss:

  1. Sediment and Water Measurement Losses

     Crude is necessary for the refinery’s operations, but the sediment and water intrusions that often come with it are not.  The refinery is often paying for these intrusions without realizing their magnitude or the associated cost.

  2. Inaccurate Measurements

     Measurements are used to track the volume and/or mass of hydrocarbons throughout the value chain; inaccurate measurements produce false data and prevent data-driven action and solutions.  At point-of-sale custody transfer points, incorrect measurements can result in financial losses, or penalties if the quantity or quality delivered differs from what was contracted to the buyer.

  3. Retains

     When offloading rail cars, trucks, and barges, not all of the product is extractable, or the cost of extraction is greater than its value.  The amount that remains onboard is often paid for, but not accounted for in mass balance programs.

To prevent these losses, Trindent’s team of Hydrocarbon Loss Consultants can implement an effective and efficient Loss Control Program at your refinery and help you on your journey to best-in-class targets. To support Loss Control Program implementation, we have custody transfer tools ready to be tailored to your site, as well as training material to develop and empower your employees and ensure results are sustained.

To learn more about how Trindent can make it happen in your company, reach out to our team on LinkedIn or through our Contact Us page.

The author of this blog, James Greey, is a senior consultant at Trindent.


Understanding Gasoline Quality Giveaway

At a refinery, gasoline is produced to regulatory or customer-demanded specifications, and exceeding these specifications is considered gasoline quality giveaway, costing individual refineries millions of dollars a year.

What is gasoline quality giveaway?  Let’s say a gas station sells premium gasoline as 91 octane – this is a specification.  If a refinery produces a 91.5 octane product, the refinery is giving away 0.5 octane.  In addition to octane, giveaway also exists for other characteristics of gasoline such as RVP, T50, and TVL (also known as V/L). This giveaway is often not a one-off or accidental occurrence but is systemic, and it exists for most gasoline produced at the refinery.
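As a back-of-envelope illustration (the blend volumes and octane numbers below are hypothetical), the giveaway described above can be totalled across blends; the same arithmetic applies to other specifications such as RVP or T50:

    # Hypothetical blends: (volume in barrels, certified octane, specification octane)
    blends = [
        (50_000, 91.5, 91.0),
        (80_000, 87.3, 87.0),
        (60_000, 91.1, 91.0),
    ]

    # Octane-barrels given away: volume times the amount the blend exceeds spec.
    octane_bbl_giveaway = sum(vol * (actual - spec) for vol, actual, spec in blends)
    total_volume = sum(vol for vol, _, _ in blends)

    print(f"Octane-barrels given away: {octane_bbl_giveaway:,.0f}")
    print(f"Average giveaway: {octane_bbl_giveaway / total_volume:.2f} octane numbers per barrel")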

Structural Versus Non-Structural Giveaway

Structural giveaway is pre-determined giveaway caused by constraints like tankage limitations, specification restrictions, inventory limitations, measurement precision, and octane length/hydrogen demand.  Structural giveaway often cannot be avoided, or requires additional capital expenditure (CAPEX) to prevent.  Fortunately, structural giveaway accounts for only approximately 70% of the giveaway at a refinery, so there is still an opportunity to save millions of dollars a year without CAPEX.

Non-structural giveaway is giveaway that is preventable with system, process, and behavioural changes. These changes do not require CAPEX and can be implemented in approximately six months. At Trindent, we focus on reducing non-structural giveaway by implementing scheduling, operations, and laboratory improvements:

  1. Scheduling – During scheduling, blends are planned based on several factors, and gasoline recipes are created.  At this stage, increasing confidence in blend predictions allows the planner to reduce buffers and ultimately reduce giveaway.  Trindent works with refineries to improve recipe design by implementing butane planning and management tools, developing statistics-based tools for optimizing recipes, validating ethanol uplift models, implementing blend prediction accuracy best practices, and tracking giveaway and other relevant metrics weekly.

  2. Operations – Once a recipe is complete, it is sent to operations to be blended.  Blend operators execute the recipe and control the components entering the blend.  Trindent optimizes execution of blend recipes by reducing blend operator variability and implementing blend execution best practices. We reduce variability by providing your workers with enhanced training, standardizing blend adjustments based on blend component properties, creating communication feedback loops, and implementing dashboards that track key performance indicators (KPIs). By reducing blend variability and empowering your workers, we also reduce octane and volatility giveaway.

  3. Laboratory – The laboratory becomes involved when start-up or mid-blend samples are taken and after a blend is complete to validate the blend properties and certify the blend. Trindent’s team of Quality Giveaway Consultants provides your laboratory with the tools it needs to increase instrument accuracy. To ensure analyzer excellence, Preventative Maintenance is used to prevent failures, while control charts are used to monitor, troubleshoot, and resolve instrument readings.

Refineries
often believe that reducing gasoline quality giveaway is impossible without CAPEX;
however, our experience at more than twenty different refineries has proven
that a zero CAPEX solution exists.

Understanding that quality giveaway is preventable without CAPEX is the first step to reducing giveaway. The second step is understanding what quality targets are right for your refinery and looking for the signs of giveaway.

To learn more about how Trindent can make it happen in your company, reach out to our team on LinkedIn or through our Contact Us page.


Motivating Your Staff to Greatness

"Leadership is the art of getting someone else to do something you want done because he wants to do it.
Dwight Eisenhower

How motivated a person is directly impacts
their output, both from a quality and quantity perspective.  It’s therefore essential that an
organization’s management team not only recognizes the critical role motivation
plays in workplace productivity, but actively seeks out potential sources of
motivation to keep their staff engaged and eager.

Endless research has been conducted in the
field of motivation at all functional levels – biological, psychological, and
sociocultural.  At its most basic level
it exists in every task that we undertake every day, whether we are aware of it
or not.   But not all motivation is the
same, and not all motivation produces the same result. 

Extrinsic Motivation

Extrinsic, or external, sources of motivation are obvious ones, especially in the context of business, as they generally align with what is viewed as success in the workplace.  In this context, extrinsic motivation tends to be easily measurable, and consists of things like compensation, awards, and benefits.

Extrinsic motivation is generally driven by
factors like necessity, greed, or societal perception, so it’s less personal, and
therefore viewed as less effective.  But
because of how easily its rewards fit into business, societal, and economic frameworks, it
requires little effort by management to define and quantify, and consequently it’s
the type that companies tend to rely on the most to motivate their workforce.  

Intrinsic Motivation

Intrinsic, or internal, sources of motivation can be more complex to identify as they come from a person’s individual set of values and drivers.  They require management to review and react to an employee’s behaviour, and to take notice of which tasks interest and excite them.  The tasks employees present with pride rather than detachment are an important indicator of what they feel a personal connection to.

Intrinsic motivation is significantly more
difficult to identify and implement.  It
takes time, patience and insight for a manager to understand, and cannot be
quantified the way extrinsic motivation can be. 
However, because intrinsic motivation is driven by internal values and
connections rather than external ones, it’s significantly more effective. 

Although it takes more time and effort, having this insight on intrinsic motivators allows managers to tailor tasks and create personal challenges for employees to give them the right impetus to excel.

Introjected Motivation – A Hidden Risk

A subset of intrinsic motivation –
introjected motivation – manifests itself as a sense of guilt or fear of
consequence if, for example, work is not completed or not up to standard.  This is a powerful driver of most human
behavior, especially in people with a strong sense of ethics. 

However, this guilt and fear can create
negative connections for employees to their work environment or the culture of their company.  To avoid the risk of introjected motivation
crossing the line into the negative, management needs to be conscious of
managing expectations and being clear about consequences, but without pressing
the issue more than necessary.  If not
managed properly, introjected motivation can become detrimental, and is a risk
that managers need to be aware of.

Knowing Your Employees

If you can understand what motivates your employees, you can manage them more effectively, increase their productivity, and drive the success of your business.  But how you motivate them is essential.  Extrinsic motivators are obvious and easy, but in taking the time to find out what intrinsically drives your employees to want to excel, you can manage them to greatness.  The investment might be greater, but so will the return.


The Benefits of Capacity Planning

One of many advantages of an engagement
with Trindent Consulting is the development of a capacity planning model called
the Area Workload Assessment (AWA), which Trindent custom-builds and calibrates
for each client and each department within the scope of the client engagement. 

Importantly, this is not a single-use tool developed for the duration of the engagement.  Instead, it’s a perpetual tool that every manager will be given to use for staffing projections going forward.

What Is a Capacity Planning Model?

Capacity planning plays a key role in the production lifecycle of an organization.  It’s “the process of determining the production capacity needed by an organization to meet changing demands for its products”.  A capacity planning model, such as Trindent’s AWA, is the tool that allows this critical planning function to take place.

The model takes into account factors like what activities are involved in the process, the best repeatable time to complete each activity, and historical production volume.  When these main variables are combined, along with a few others that account for shrinkage, the model calculates what the optimal staffing resource level should be.  A recent example comes from a Trindent engagement at a large insurance firm, where the AWA was deployed to determine how many agents were needed to underwrite a given number of life insurance policies while maintaining a certain level of service.
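As a simplified sketch of this kind of calculation (the activity names, timings, volumes, and shrinkage factor below are hypothetical, not Trindent’s actual AWA logic), the required staffing level falls out of total workload hours divided by the productive hours available per employee:

    # Hypothetical activities: best repeatable time (minutes) and monthly volume.
    activities = {
        "review_application": (12.0, 4_000),
        "order_medical_evidence": (8.0, 1_500),
        "underwrite_policy": (25.0, 3_200),
    }

    shrinkage = 0.30           # breaks, meetings, training, absence, etc. (assumed)
    hours_per_fte_month = 160  # paid hours per full-time employee per month (assumed)

    workload_hours = sum(minutes * volume for minutes, volume in activities.values()) / 60
    required_fte = workload_hours / (hours_per_fte_month * (1 - shrinkage))

    print(f"Workload: {workload_hours:,.0f} hours/month -> {required_fte:.1f} FTEs required")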

The model is dynamic and can be used to
compare current and future state capacity. 
A great way to use the AWA is to input forecasted sales volumes along with
current timings in order to project a future state for the capacity required to
maintain operations, which allows an organization to stay ahead of the curve.

The Power – and Benefits – of Capacity Planning

Wages are one of the largest expenses incurred by any business, so planning for optimal staffing levels to ensure proper delivery of goods or services without overpaying for underutilized resources is key to driving profitability.  Trindent has observed that without proper goals, targets, and insight on production data, many companies are overstaffing at a rate higher than their growth, putting a huge strain on the cashflow needed to cover wages.

It is, however, equally important not to understaff, in order to avoid two potential pitfalls. The first is jeopardizing the ability to meet output targets and maintain service levels, which leads to reputational risk both with clients and within the industry.  The second is the risk of employee burnout.  Stretching current staff to maintain a higher-than-planned volume for a prolonged period of time can cause increased stress and decreased morale, leading to absenteeism and turnover.

By equipping managers with the AWA, a powerful capacity planning tool, Trindent gives them the ability to plan and utilize current staffing accurately, and keep the business operating optimally.

Click here to read more about how Trindent can assist you to make the most of your capacity planning.


Prototyping Change – How to Succeed Even When You Fail

Since 2008, Trindent Consulting has delivered millions of dollars in process improvement results to clients in the Energy, Healthcare and Financial Services verticals.  With every engagement, Trindent has a blueprint built on past successes to effectively analyze client issues and quickly come up with a list of operational challenges our team needs to tackle.

However, while problems tend to be fairly standard, they rarely have a one-size-fits-all solution.  All clients have a unique set of circumstances that need to be built into any proposed action, which is why Trindent customizes its method change solutions for each of our clients.

A key part of the process of designing solutions is ensuring
that they are the best ones to achieve the benefits clients are looking for.  This is why Trindent employs the prototype
approach to sample each method change solution we propose before rolling them
out across an entire organization.  

What is Prototyping?

Prototyping is used to evaluate the design of a proposed solution in order to test its viability and scalability.  Simply put, prototyping involves testing the full solution on a small scale – for example, rolling out a process change within one small area of one department, rather than across an entire organization.   This allows for the hypothesis of a solution to be tested, and its constraints and limits to be observed with minimal risk to the overall operation of the business.  

In essence, prototyping is a failsafe.  It mitigates the risk of implementing an
approach across an organization that’s not guaranteed to bring intended results,
which can be costly in time and resources and poses both an operational and
reputational risk to the company.

What Happens When the Prototype Fails?

Before a prototype is launched, metrics are put in place to define whether its test will be considered successful.  If the metrics aren’t achieved, then the proposed solution didn’t produce the intended results, and the prototype test is considered a failure.

But a failed prototype test and the data collected from it aren’t
waste.  The outcome of the test will be
analyzed to see where it fell short, so that adjustments can be made, and a
revised and more effective method change solution proposed.  In this regard, a prototype failure becomes
part of our path to success. 

What Happens When the Prototype Cannot Be Salvaged?

One of Trindent’s values is perfection with urgency, and on
client engagements, our consultants work to rapidly identify, design, and
recommend changes.  

However, sometimes, despite best efforts, there’s no path for
the prototype to achieve desired results and a decision has to be made to
abandon it.   In these rare cases, the
engagement team quickly returns to the drawing board to refocus the
problem-solving process, using the data from the failed test to steer them towards
a better solution.

When this happens, the advantages of the prototype approach are apparent.  The “fail fast” methodology of prototyping allows us to quickly explore a certain avenue, find out swiftly and with minimal impact that it doesn’t work, and then rapidly pivot to exploring a different solution.

Succeeding When You Fail

Henry Ford said, “Failure is the opportunity to begin again,
only this time more wisely.”  

Trindent uses the prototype approach to find success regardless of the outcome of the test.  If the prototype succeeds, the optimal method change solution has been found and can be implemented across the organization.  But if the prototype fails, it becomes our opportunity to continue moving towards success, only more wisely. Click here to find out more about Trindent’s approach.


Using KPIs to Optimize the “Psychological Contract”

The “Psychological Contract” is a set of unwritten promises and expectations between employer and employee that forms the basis for every employment relationship.  More precisely, it’s a “deal” from “the perception of the two parties…of what their mutual obligations are towards each other”.

For an employee, their core obligation is to deliver on assigned outcomes.  For an employer, it’s to provide feedback and give performance evaluations designed to guide employees to succeed.  The effectiveness of that feedback is determined by an employer’s ability to measure whether an employee can deliver against set expectations.  

But for many organizations, there’s a missing link in the process because they lack real visibility into how the KPIs of employee performance are tracking.  Without the necessary data and facts, employers are left interpreting performance rather than judging it objectively, leading to subjective, often unhelpful, feedback and a disruption to the Psychological Contract.

The Missing Link: Dashboards and KPIs

All organizations have some form of outcome targets
for their employees, but how progress is tracked against those targets can make
a substantial difference in how accurately employee performance is managed.  This in turn impacts how successful an
employee can become, and by extension, how productive their whole department is.

The necessary tool to achieve this real-time insight is a robust dashboard that displays the main KPIs at both an individual and team level.  When KPIs come alive on a dashboard, they become a tangible, real-time guide for employers to let their employees know how they are performing, what they are doing well, and where they need to improve.

With the ability to measure and visualize these KPIs, leaders can coach and boost productivity both at an individual and team level, and keep employees motivated, goal-oriented, and, most importantly, accountable.  It is often in the coaching conversations that come from dashboard reviews that employees understand what is required of them in order to meet their obligations.
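As a minimal sketch of what sits behind such a dashboard (Python with pandas; the metric names, targets, and figures are purely illustrative), the same KPI can be viewed at the individual level and rolled up to the team level:

    import pandas as pd

    # Hypothetical daily KPI extract; employees, teams, metrics, and targets are made up.
    records = pd.DataFrame([
        {"employee": "A. Lee",   "team": "Underwriting", "apps_processed": 14, "target": 15},
        {"employee": "B. Singh", "team": "Underwriting", "apps_processed": 18, "target": 15},
        {"employee": "C. Diaz",  "team": "New Business", "apps_processed": 11, "target": 12},
    ])

    # Individual view: performance against target.
    records["attainment_pct"] = records["apps_processed"] / records["target"] * 100

    # Team view: the same KPI rolled up for leaders.
    team_view = records.groupby("team")[["apps_processed", "target"]].sum()
    team_view["attainment_pct"] = team_view["apps_processed"] / team_view["target"] * 100

    print(records[["employee", "attainment_pct"]])
    print(team_view)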

Fill the Gap to Optimize the Contract

As Peter Drucker said, “If you can’t measure it, you can’t improve it.”

Organizations need to have tools in place so they can be aware of production metrics. A wealth of data is most likely available to any manager, but the goal is to be able to utilize that data in a way that makes it consumable.  Only when this is accomplished and KPIs are available in real time can the missing link in the Psychological Contract be found, one that allows employees to take ownership of their performance and leaders to aim for improvements.


Process. System. Behaviour.

Outdated Process Series

Every process in an organization should be designed to either add value or to be a necessary part of an activity that adds value; and any process that does neither of these should be changed or eliminated.  Previous articles in this series discussed the challenges of identifying outdated and inefficient processes, and looked at observation as a tool for driving business process improvement.   This article will discuss why observing a process in isolation isn’t sufficient to fully understand whether it can or should be changed to drive value.

Each process in an organization generally has a management operating system governing it and the behaviour of its participants driving it. When a process is observed without taking these two components into consideration, the ultimate analysis can only offer a limited view of its inefficiencies and shortcomings.  To truly understand the capacity for improvement, we need to use a systematic approach to look at all three elements together.

Management Operating Systems

In order to measure and manage a process, management operating systems are put in place to track key process performance indicators, measure process results, and generate performance data, among other functions.  Observing a process cannot be effective if it’s done independently of whatever management operating system is behind it, or if the data the system generates isn’t optimal.

When it comes to data, more is not always better, and can, in fact, be worse; and this is a good place to start when determining whether a system is helping or hindering process efficiency.  The ideal type and quantity of data will cover all necessary metrics related to value creation without clouding the analysis with unnecessary information that doesn’t drive insight.  Often, organizations mistake collecting a high volume of data for collecting the right data, and end up with an inaccurate understanding of their process productivity and effectiveness.

Process improvements can often be driven by simply improving the quality and/or quantity of the process-related information being tracked, reported or analyzed by the management operating system.

Behaviours

Even the most masterfully crafted process can fail if the appropriate behaviour of its participants is not in place.  Conversely, a poorly designed process may be kept afloat by talented and dedicated staff.  When observing a process, it’s necessary to pay attention to the behaviour within and around it, and how that behaviour affects the process’s effectiveness, in order to truly understand where and how the process needs to be improved.

When observing behaviour, it’s important to start by analyzing whether managers have clearly communicated their expectations to their team and set appropriate targets.  It’s also necessary to look at whether process participants were adequately trained and equipped, and whether they were given access to the correct system elements in order to be able to participate optimally in the process.

Usually, when aspects of behaviour don’t match best practices, the failure doesn’t stem from workers’ poor intentions.  More often than not, when they receive clear instructions and are appropriately trained and equipped, correct behaviours can quickly be implemented to drive process improvements and yield maximum value.

Conclusion

To truly assess process effectiveness, it’s essential to
look beyond the structure and qualities of the process itself, to evaluate its management
operating system and the behaviours of its participants.  In taking this holistic approach, the
operational shortfalls that need to be addressed will become clear.

To learn more about the approach Trindent uses to identify and correct outdated processes, contact us.


Six Sigma – Striving for the “Perfect” Process

Six Sigma methodology is based on a philosophy that strives
towards process perfection, using statistical techniques to measure and
quantify success. 

The symbol for Six Sigma – 6σ
– is the statistical representation of a “perfect” process, but what does this
mean?  A Six Sigma, or “perfect”, process
produces 99.99966% of all its deliverables without defects.  In other words, it sees no more than 3.4
defects per million opportunities.  

Sigma levels can also be seen as a way to show how far a
process or deliverable deviates from perfection.  The closer you get to 6,
reducing process costs and increasing customer satisfaction along the way, the
closer you get to perfection. 

To illustrate Six Sigma better, let’s look at the
improvement cycle and the statistical tools used in Six Sigma.

Six Sigma’s Improvement Cycle – DMAIC

DMAIC (Define, Measure, Analyze, Improve and Control) is the
improvement cycle used to drive Six Sigma projects.

Using
the DMAIC framework, once a defect is located and defined, a measurement plan is put in
place to accumulate the necessary data to analyze the defect.  Once the data is analyzed, the process in
question is adjusted to improve its functionality.  And finally, when this process is fully
corrected, the improvements are institutionalized to become the new
standard in the process — establishing control.

Although customarily used as a part of Six Sigma, DMAIC is not exclusive to the methodology, and it can be used as the framework for other improvement applications.

Statistically – What is Six Sigma?

Rooted in statistics, Six
Sigma improvement makes use of standard deviations to display the statistical data
of a particular process. This method uses a formula to measure a process’
potential, effectiveness and likelihood of straying away from perfection –
shown as yield.

Yield %      Sigma Level
99.6540      4.2
99.5340      4.1
99.3790      4.0

As an example, Company A produces 150,000 items a month, 750 of which have defects. The Yield is 99.5% [(150,000 – 750)/150,000 x 100], which is about 4.1 sigma on the abridged Sigma table.
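For readers who want to reproduce the conversion, the sketch below (Python) computes the yield, the defects per million opportunities, and the sigma level, applying the 1.5-sigma shift that standard Six Sigma tables conventionally include:

    from statistics import NormalDist

    produced = 150_000
    defective = 750

    yield_fraction = (produced - defective) / produced        # 0.995
    dpmo = defective / produced * 1_000_000                   # defects per million opportunities

    # Conventional conversion: z-score of the yield plus the 1.5-sigma shift.
    sigma_level = NormalDist().inv_cdf(yield_fraction) + 1.5  # ~4.1

    print(f"Yield: {yield_fraction:.2%}, DPMO: {dpmo:,.0f}, sigma level: {sigma_level:.1f}")

    # Sanity check: 3.4 DPMO corresponds to the "perfect" 6 sigma process.
    print(round(NormalDist().inv_cdf(1 - 3.4e-6) + 1.5, 1))   # 6.0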

Following
“The Normal (Distribution) Curve” to visualize these calculations, Sigma levels
provide a relative value for components in the process to determine where to
focus efforts to improve the overall operation.

It’s worth noting that the relative values should not be the sole driver of improvement effort decisions, as they do not include cost, timeline, or the value of that change to the customer.

Perfection with Urgency

The use of Six Sigma, like Lean management or any process improvement methodology, depends on an organization’s goal. 

At Trindent, we understand that one size doesn’t fit all.  Our methodology is based on Lean and Six Sigma but recognizes the considerable part people play in driving process improvement to a successful outcome. The Trindent approach incorporates Lean and Six Sigma methods into a comprehensive set of tools that focus on practical ways to improve processes, systems, and behaviors with an existing infrastructure.


“Where Is My Octane?” – Recognizing Benefits in Blending Optimization Refinery Projects

By Anas Dabbakh

Refineries are a crucial step between raw crude oil and getting gasoline into your car. Several blendstocks have to be blended together at the refinery to create gasoline that meets property specifications such as octane, RVP, V/L, and T50.

It is important to understand that oil and gas quality giveaway reduction cannot be successfully sustained without some specific changes to refinery planning, gasoline scheduling and execution processes. For example, a reduction in octane giveaway will increase the refinery octane pool. Without taking actions to re-balance this pool, the refinery will eventually start giving it away again.

Having gone through multiple blending optimization and quality giveaway reduction projects, I have seen how beneficial such projects are at the refinery. But they also raise a very interesting question – where did all the saved octane-barrels go?

There are several strategies the refinery can undertake to “tie up” saved octane-barrels, and make giveaway reduction last:

Change blendstock purchasing from high-octane, high-value blendstocks to lower-octane blendstocks.

This could mean replacing high-octane, high-value purchases with lower-octane natural gasoline or butane. This may also have the additional benefit of increased gasoline production, thereby lowering cost per barrel.
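As a rough, hypothetical illustration of this first strategy (assuming simple linear octane blending, which real blends only approximate), the sketch below estimates how many barrels of a lower-octane blendstock a given pool of saved octane-barrels could absorb:

    # All figures are hypothetical.
    saved_octane_bbl = 50_000        # octane-barrels freed up by reduced giveaway
    pool_octane_target = 89.0        # average gasoline pool octane target (assumed)
    natural_gasoline_octane = 70.0   # assumed octane value of the low-octane blendstock

    # Each added barrel of the low-octane component "consumes"
    # (target - component octane) octane-barrels of surplus, assuming linear blending.
    added_bbl = saved_octane_bbl / (pool_octane_target - natural_gasoline_octane)

    print(f"Roughly {added_bbl:,.0f} bbl of natural gasoline could be absorbed by the pool")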

Lower the refinery’s own production of high-octane or high-value components.

For example, lower Reformer unit severity, or change to a different (less efficient) catalyst, etc.

Consider sales of high octane or high value components in the open market.

If lowering the production of high-value components cannot be achieved, or has limitations, then sales of these components on the open market can be considered. Obviously, market conditions should be favorable to such sales.

Shift production profile

Shifting your production profile towards more premium-grade gasoline will consume more octane.

These strategies can be used in combination. Their successful implementation requires close coordination between production planning, scheduling and lab testing.