Unified pricing system

Published on by Joannes Vermorel.

We are pleased to announce that we are upgrading Lokad to a new pricing system for our Forecasting Services. The details have already been published, check them out.

Although your mileage may vary, simulations indicate that this change will represent, on average, a 20% saving for current customers. Our goal remains to stay far ahead of the competition, both in terms of forecast accuracy and in terms of TCO (Total Cost of Ownership).

Pricing has been simplified ...

It's simple enough to be expressed with a single compact formula: $0.15 * forecasts^(2/3). And, yes, we still have the power 2/3 exponent. For those who don't enjoy mental calculations of cube roots, we do provide a calculator.

Wait! What do you call a forecast?  If you want sales forecasts for a single product for the next 3 months, one value for each month ahead, it counts as 3. If you repeat your forecasts twice during the same month, it counts as 6. If you do the same with 1000 products instead of a single one, it counts as 6,000 forecasts.

Then, the power 2/3 just acts as a large volume discount (see the short sketch after the list):

  • 1k forecasts cost $15
  • 1M forecasts cost $1.5k (instead of $15k)
  • 1B forecasts cost $150k (instead of $15M)
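
To make the arithmetic concrete, here is a minimal sketch of the formula in Python. The calculator on the pricing page remains the reference; the numbers below simply replay the examples above.

def monthly_cost_usd(forecast_count):
    # Pricing formula: $0.15 * forecasts^(2/3)
    return 0.15 * forecast_count ** (2.0 / 3.0)

# Counting forecasts: 1000 products x 3 months ahead x 2 runs = 6,000 forecasts.
print(monthly_cost_usd(6000))        # ~ $49.5

print(monthly_cost_usd(1000))        # ~ $15
print(monthly_cost_usd(1000000))     # ~ $1,500
print(monthly_cost_usd(1000000000))  # ~ $150,000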

Bottom line, it's rather simple. Our inspiration was a mix of the Windows Azure pricing and the Twilio pricing.

In particular, it must be noted that there is no setup fee. Obviously, such a pricing is only made possible because Lokad is powered by cloud computing.

Then, if you happen to use Lokad only once in a while, you won't be charged unless you actually use it. If your account stays idle for a month, you don't get charged at all!

Finally, there is no threshold effect in our pricing (thanks to the power 2/3 approach). The more forecasts you need, the higher the costs, but the higher the volume discount too.

Our old pricing system, which had not been revised for almost 2 years, suffered from one major issue: there were subtleties. Not major ones, but still enough for people to routinely estimate their subscription costs at thousands of dollars when the actual figure was less than a hundred.

We are not going to make the same mistake twice. Our pricing page now includes dedicated simulators for inventory optimization and call center optimization. Any doubt about the new costs? Just type in your number of SKUs or your number of calling queues.

... but it's still variable pricing

Many analysts have expressed concerns about variable pricing in software. How can I get my business plans in order if everything is changing all the time? you might ask.

In our humble opinion, and as far as forecasting is concerned, we believe that variable pricing solves far more problems than it causes. You might wonder: what happens if the subscription cost increases? If your Lokad subscription cost increases, it means that your company is growing and so are your forecasting needs. The last thing you want while undergoing steady growth is to get your forecasts wrong and let improper planning wreak havoc in your business. Then, our volume discount factor (power 2/3) ensures that the more you grow, the more volume discounts you get from Lokad.

But there is the other situation that analysts usually don't bother to consider: what if my business is going down, what if we are downsizing, what if branches get sold?

With Lokad, your subscription costs will go down accordingly. You will not be stuck with an over-sized on-premise solution to maintain. Pay-as-you-go guarantees that the software you buy today will not accelerate the demise of your business if the economy turns out to be really rough.


Scalable Forecasting, a Microsoft case study

Published on by Joannes Vermorel.

Reduced first page of the Microsoft case study on Lokad

Forecasting has been available as a rationalized business practice for more than a century. Yet, we believe that when it comes to scalable forecasting, Lokad is one of a kind. We invested a lot of effort in scaling Lokad up during 2009 to address the needs of our largest customers. In particular, we have migrated our technology toward Windows Azure, the cloud computing platform of Microsoft.

Today, we are at PDC'09, and we are pleased to announce the release of a case study produced by Microsoft about our scalable forecasting technology. Special thanks to the Azure marketing team.


Internet is needed for your forecasts

Published on by Joannes Vermorel.

Do I really need an Internet connection to get your forecasts? is a question frequently asked by prospects taking a look at our forecasting technology.

Well, the answer is YES. With Lokad, there is no work-around. Our forecasting engine does not come as an on-premises solution.

But why should an Internet connection be needed for algorithmic processing such as forecasting?

The answer to this question is one of the core reasons that led to the very existence of Lokad in the first place.

When we started working on the Lokad project - back in 2006 -  we quickly realized that forecasting, despite appearances, was a total misfit for local processing.

1. You can't get your forecasts right without having the data at hand. Researchers have been looking for decades for a universal forecasting model, but the consensus among the community is that there is no free lunch; universal models do not exist, or rather, they tend to perform poorly. This is the primary reason why forecasting toolkits feature so many models (don't click this link, it's a 3,000-page manual for a popular toolkit). With Lokad, the process is much simpler because the data is made available to Lokad. Hence, it no longer matters if thousands of parameters are needed, as the parameters are handled by Lokad directly.

2. Advanced forecasting is quite resource intensive, but the need to forecast is only intermittent. Even a small retailer with 10 points of sale and 10k product references already represents 100k time-series to be forecasted. If we consider a typical throughput of 10k series per hour for a single CPU (which is already quite optimistic for complex models), then computing sales forecasts for the 10 points of sale takes a total of 10h of CPU time. Obviously, retailers prefer not to wait 10h to get their forecasts. Buying an amazingly powerful workstation is possible, but does it make sense to have so much processing power sitting idle 99% of the time when forecasts are made only once a week? Outsourcing the processing power is the obvious cost-effective approach here.
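
As a quick back-of-the-envelope check (the throughput figure is only the order-of-magnitude assumption stated above):

points_of_sale = 10
references_per_pos = 10000                       # 10k product references
series = points_of_sale * references_per_pos     # 100,000 time-series to forecast

series_per_cpu_hour = 10000                      # optimistic for complex models
cpu_hours = series / float(series_per_cpu_hour)  # 10.0 hours on a single CPU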

3. Forecasting is still undergoing fast-paced evolution. Since our launch about 3 years ago, Lokad has been upgraded every month or so. Our forecasting technology is not some indisputable achievement carved in stone; on the contrary, it is still undergoing rapid evolution. Every month, the statistical learning research community moves forward with loads of fresh ideas. In such a context, on-premise solutions undergo a rapid decay until the day the discrepancy between the performance of the current version and the performance of the deployed version is so great that the company has no choice but to rush an upgrade. An aggressively developed SaaS ensures that customers benefit from the latest improvements without even having to worry about it.

In our opinion, going for an on-premise solution for your forecasts is like entering a golf competition with a large handicap. It might make the game more interesting, but it does not maximize your chances. Don't expect your competitors to be fair enough to start with the same handicap just because you do.


Refreshing Min/Max inventory planning

Published on by Joannes Vermorel.

Modeling inventory replenishments

Min/Max inventory planning has been available for decades. Yet, some people argue that Min/Max drives higher costs and that it should be replaced with other methods.

Before jumping to conclusions, let's try to clarify the situation a bit first. For a given SKU (Stock Keeping Unit), the inventory manager needs only two values to specify his inventory management policy:

  • A threshold, named reorder point, which defines if any reorder should be made (Point 3 in the schema).
  • A quantity, named reorder quantity, to be reordered, if any (Point 1 in the schema).

The Min/Max system simply states that:

MIN = ReorderPoint
MAX = ReorderQuantity + InventoryOnHand  + InventoryOnOrder

Thus, as long as you’re not carving your Min & Max values in stone, the Min/Max system is perfectly generic: it can express any reorder policy. As far as inventory optimization is concerned, adopting the Min/Max convention is neutral, as it’s just a way to express your replenishment policy. Contrary to what people seem to believe, Min/Max neither defines nor prevents any inventory optimization strategy.

What about the Safety Stock Calculator and Min/Max?

Let’s see how our Safety Stock Calculator can be plugged into a Min/Max framework. The goal is to update the Min & Max values to optimize the inventory based on the forecasts delivered by Lokad.

The calculator reports reorder points. Thus, handling MIN values is rather straightforward since MIN = ReorderPoint. The calculator even lets you export reorder points directly into any 3rd party database. Yet, MAX values are slightly more complicated. The MAX definition states that:

MAX = ReorderQuantity + InventoryOnHand  + InventoryOnOrder

Let’s start with the ReorderQuantity. The safety stock analysis gives us:

ReorderQuantity = LeadDemand + SafetyStock
                             - InventoryOnHand - InventoryOnOrder

Which could be rewritten as:

ReorderQuantity = ReorderPoint - InventoryOnHand - InventoryOnOrder

where ReorderPoint = LeadDemand + SafetyStock. Thus,

MAX = ReorderQuantity + InventoryOnHand  + InventoryOnOrder

Becomes

MAX = (ReorderPoint - InventoryOnHand - InventoryOnOrder)
    + InventoryOnHand  + InventoryOnOrder

Which simplifies into MAX = ReorderPoint, that is to say, MAX = MIN.

Obviously there is something fishy going on here. Did you spot what’s wrong in our reasoning?

Well, we haven’t defined any cost associated with ordering operations. Consequently, the math ends up telling us something rather obvious: without an extra cost for each new order (beyond the cost of buying the product from the supplier), the optimal planning involves an infinite number of replenishments, where the size of each replenishment tends to zero (or rather tends to 1 if we assume that no fractional product can be ordered).

Getting back to a more reasonable situation, we need to introduce the EOQ (Economic Order Quantity): the minimal amount of inventory that maintains the expected profit margin on the product. Note that our definition differs a bit from the historical EOQ, which is a tradeoff between the fixed cost per order and the holding cost.
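
For reference, the historical EOQ mentioned above is the classic Wilson formula, which trades a fixed cost K incurred per order against a per-unit holding cost H for an annual demand D:

EOQ = sqrt(2 * D * K / H)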

In our experience, the EOQ is a complex product-specific mix:

  • It depends on volume discounts.
  • It depends on product lifetime, and potentially expiration dates.
  • It depends (potentially) on other orders being placed at the same time.
  • ...

Thus, we are not going to define EOQ here, as it would go beyond the scope of this post. Instead, we are just going to assume that this value is known to the retailer (somehow). Introducing the EOQ leads to:

MAX = MIN + EOQ
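
Putting the pieces together, here is a minimal sketch, assuming the lead demand, the safety stock and the EOQ are already known. The function and field names below are purely illustrative, not the actual export format of the Safety Stock Calculator.

def min_max(lead_demand, safety_stock, eoq):
    # MIN = ReorderPoint = LeadDemand + SafetyStock, and MAX = MIN + EOQ
    reorder_point = lead_demand + safety_stock
    return reorder_point, reorder_point + eoq

def reorder_quantity(min_value, max_value, on_hand, on_order):
    # Reorder up to MAX once the inventory position reaches MIN or below.
    position = on_hand + on_order
    if position <= min_value:
        return max_value - position
    return 0

mn, mx = min_max(lead_demand=120, safety_stock=30, eoq=50)  # MIN=150, MAX=200
print(reorder_quantity(mn, mx, on_hand=90, on_order=40))    # 70 units to reorder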

What’s the impact of EOQ on service level?

Let’s have another look at the schema. Point 2 illustrates what happens when the reorder quantity is made larger: the replenishment cycle gets longer too (see Point 4), as it takes more time to reach the reorder point.

Other things being equal, increasing the EOQ also increases the service level, yet in a rather inefficient way, as it leads to a very uniform increase of your inventory levels that is not going to accurately match the demand.

Thus, we suggest taking the smallest EOQ that maintains the desired margin on the products being ordered.
