What do rare earth minerals and wool have in common?

On the surface there is little in common between these two manufacturing inputs. However, there are two commonalities:

First: Australia has plenty of both in raw form.

Second: Australia has, now and into the foreseeable future, little or no chance of being a significant supplier of the value-added end product.

Australia remains a significant contributor to the world’s supply of raw wool. In volume we are now second behind China. In value we are the runaway leader, after 100 years of genetic management leading to a fine and consistent wool staple, ideal for the manufacture of high-end clothing. We do only a tiny, artisan level of processing of the raw wool in this country. Over time we have outsourced this dirty, effluent-heavy process to India and China.

Sadly, the huge value add to wool occurs after the initial processing of the raw clip, and we are not getting any of it, beyond a few scraps.

In the case of rare earth minerals, we have plenty in the ground; very little of it is currently being mined, and very little of what is mined is processed.

These science-fiction-sounding minerals occur at very low concentrations, requiring hundreds if not thousands of tonnes of earth to be mined and processed to deliver very small amounts of the final product. The subsequent processing is capital intensive, uses toxic chemicals, consumes vast amounts of water and energy, and, for neodymium in particular, the critical component of high-performance magnets, emits vast amounts of CO2 during processing. As a result, we have the raw material, but no way to add the value.

China has a stranglehold on the world supply of these minerals, controlling around 90% of processing and around 70% of the volume of mined material destined for subsequent processing. Over 20 years China has invested heavily in generating this chokehold on the critical inputs to a modern economy. That 20-year head start gives it immense price and availability leverage over the industrial activity of the rest of the world, which increasingly requires those science-fiction-sounding rare earth elements in the manufacture of a vast range of products.

In recent days China has changed the rules on the mining, processing, and export of products made with rare earth elements. The technology required to process the raw materials, and the manufacturing technologies necessary to produce end products, are now all subject to licensing by the Chinese government. If nothing else, this should scare the wits out of the loonies in the White House.

While working up a lather about rare earth minerals, we should also remember the dominance China now has in minerals that are not classed as ‘rare earth’: manganese, cobalt, graphite, lithium, and others.

It would take a brave man to predict any change in this situation without decades of effort, hundreds of billions invested, and some politically sensitive choices about the environmental impacts that expanding non-Chinese supply would entail.

The Australian government has announced a ‘Critical Minerals Strategy’ that includes a Critical Minerals Strategic Reserve. This all sounds appealing, but the acid test will come the first time a mining enterprise proposes to mine an area that is the last habitat of some rare insect, and to add CO2 to the atmosphere by establishing a pilot processing plant. The last time the government got involved in supply chain management of a raw material with a view to controlling price and availability was with the wool industry. That ended up as an absolute disaster, and it would have been orders of magnitude easier to get right than bridging the gap with rare earth minerals.

A ‘Critical Minerals Strategy’ sounds like a good idea, and makes an even better sound bite, but it presents a practical hurdle of enormous proportions. However, China’s dominance should be seen as a challenge to be met with the pool of scientific and mining intellect we have in this country. We must find a pathway to making China’s existing lead redundant through the generation and application of scientific understanding, and the subsequent development of the technology to process the stuff in an environmentally sustainable manner.

Focus, competence, and a trip to ROMA

Management attention is an investment.

However, I have never seen a calculation of that investment made without the benefit of hindsight. Considering the return on management attention (ROMA) seems to be a sensible element of investment due diligence.

As a consultant, I am always urging clients to focus their resources, their time, money, expertise, and operational capacity, on a narrow field. This concentration of resources is always superior to a generalised approach for winning in the short term.

Nowhere are military metaphors more appropriate than in a competitive commercial environment. Every general knows that to win a battle, he needs overwhelming force in a specific space.

However, every general also knows that a war is not won in a single battle. To win the war, you also must be able to adjust to changes in the context in which the war is being waged and respond accordingly.

Years ago, while working for Cerebos, I was responsible for Cerola muesli, now departed from supermarket shelves. In those days there were only a few major SKUs in the breakfast cereal aisle: Weet-Bix, Kellogg’s Corn Flakes, Rice Bubbles, and a few other relatively minor SKUs. Muesli was out on the fringes, widely seen as ‘tree hugger food’.

As an extension to Cerola, we created a strategy that straddled the gap between those major cereals and muesli and named it ‘Light & Crunchy’.

We launched it into a test in South Australia. We believed we could build the Cerola brand to be more than just ‘hippie-food’ by creating a new category in the Cereal market. There was an unmet need, a potential gap in the market. That gap could be leveraged (we believed) with a good product and effective marketing programs to generate trial, which would lead to repeat purchase.

The early stages of the test were an enormous success. We easily got retail distribution, consumer trial and repurchase rates that were well above our benchmarks for a successful test.

The significant miscalculation was failing to anticipate the weight of the response from Kellogg’s.

It came very quickly, with a competitive product called ‘Just Right’, a direct copy of Light & Crunchy. Just Right still exists, which validates our identification of the unmet need. Kellogg’s competitive launch was supported by overwhelming advertising, consumer promotions, and in-store promotional support. That massive, focused response simply blew us away, and killed any thoughts of continuing.

Kellogg’s saw our test launch of Light & Crunchy as a significant incursion into their territory. They had previously left us alone in muesli. Research indicated that muesli, as it had been, was not competing for the same consumers who were purchasing Corn Flakes, Rice Bubbles, and Sanitarium’s Weet-Bix.

With Cerola Light & Crunchy we changed that, and Kellogg’s reacted with extreme aggression. I had failed to anticipate the reaction, which was, with the benefit of hindsight, absolutely predictable.

The real lesson was that we did not have what it took to be competent in the breakfast cereal market. While competence is a term most would see as a measure of skill, in this instance it was more than that. It was also a measure of our depth of knowledge of the market, of the competitive drivers at play, and of whether we had pockets deep enough to wage a competitive war on Kellogg’s home turf.

Our attention was too focussed on the opportunity we saw in the market, and substantially lacking when it came to the wider competitive context. We had a skewed focus of attention, and the return on that skew taught us a painful lesson.

‘ROMA’, Return on Management Attention, is always a strategic driver, yet rarely adequately considered.

The fundamental management distinction: Principle or Convention?

My time is spent assisting SMEs to improve their performance: strategic, marketing, and operational. Deliberately, I initially try to downplay financial performance as the primary measure, as financial outcomes are the product of a host of other choices made throughout every business.

It is those choices around focus and resource allocation that need to be examined.

Unfortunately, the financial outcomes are the easiest to measure, so they dominate in every business I have ever seen.

When a business is profitable, even if that profit is less than the cost of capital, management is usually locked into current ways of thinking. Even when a business is marginal or unprofitable, it is hard to drive change in the absence of a real catalyst, such as a creditor threatening to call in the receivers, or a keystone customer going elsewhere.

People are subject to their own experience and biases, and those they see and read about in others.

Convention in a wider context, status quo in their own environment.

Availability bias drives them to put undue weight on the familiar, while dismissing other, and especially contrary, information.

Confirmation bias makes us unconsciously seek information that confirms what we already believe, while obscuring the contrary.

Between them, these two forces of human psychology cement in the status quo, irrespective of how poor that may be.

Distinguishing between convention and principle is tough, as you must set aside the natural biases that exist in all of us. We must reduce everything back to first principles, which is incredibly hard, as we are not ‘wired’ that way.

The late Daniel Kahneman articulated these problems in his book ‘Thinking, Fast and Slow’, based on the data he gathered with colleague Amos Tversky in the seventies. That data interrogated the way we make decisions through experimentation, which enables others to test the conclusions quantitatively, rather than relying on opinion.

That work opened a whole new field of research we now call ‘Behavioural Economics’, and won Kahneman the Nobel Prize. Sadly, however, while many have read about and understand at a macro level the biases we all feel, it remains challenging to make that key distinction between convention, the way we do it, the way it has always been done, and the underlying principles that should drive the choices we make.

As Richard Feynman put it: “The first principle is that you must not fool yourself—and you are the easiest person to fool. So, you have to be very careful about that.”

Killing the Seedlings: Is AI a Crisis in Waiting for Leadership Development?


We are so busy debating whether AI will take our jobs, we’ve missed a more dangerous question: what happens when it takes the jobs that create our leaders?

So far, the brunt of automation has fallen on blue-collar roles. Machines took over factory lines, robots handled dangerous or repetitive manual tasks. But the spotlight is shifting. White-collar work, particularly at the entry level, is squarely in the crosshairs of AI. Roles in sales, marketing, law, accounting, admin support, anything process-driven or rule-based are already being swallowed up by bots, templates, and AI agents that never sleep, strike, or slack off.

In past industrial revolutions, we saw enormous upheaval in labour markets. Steam displaced the weavers. Mass production killed off artisans. Electricity reduced manual labour but turbocharged the rise of middle management. Each wave destroyed jobs but also created new ones. That’s the comforting story we tell ourselves.

But this time, the tempo is different. AI is rolling through industries faster than we can repurpose workers. We may eventually find equilibrium, but it’s likely that the rate of job creation will lag the rate of job destruction. And this time, it’s not just jobs on the line; it’s the culture, resilience, and leadership pipelines of entire organisations.

Most of the white-collar roles under threat are entry-level. These are the proving grounds where future leaders learn the ropes, earn their scars, and get spotted by mentors. Strip away those jobs, and what are we left with? A dangerously thin layer of next-gen talent. No feeders. No bench strength. Just a void.

This matters. Organisations depend on a steady flow of energetic, irreverent, risk-taking young guns to shake things up. These outliers challenge orthodoxy, surface new ideas, and eventually rise to reshape the culture. Remove the ground floor, and over time, the whole building becomes brittle.

We don’t yet know the full consequences. But we do have some clues. History is littered with unintended consequences when change is forced onto complex systems.

Consider China’s one-child policy. Designed as a population control measure, it has led to a demographic cliff. Too few young workers. A rapidly aging population. Long-term consequences no one foresaw.

Or nature: rabbits and cane toads introduced to Australia for pest control. Wolves removed from Yellowstone to protect livestock. In each case, the ecosystem was disrupted. Only decades later did we see the cascading damage, and in the case of Yellowstone, the healing when wolves were reintroduced.

The same pattern may emerge in our workplaces. AI may be brilliant at cutting costs and boosting productivity. But if it wipes out the very roles where human potential is first tested and tempered, we could be sowing the seeds of a cultural and leadership vacuum that won’t show up in KPIs until it’s far too late to fix.


Where is the line between technical innovation and the humanities?

Innovation built on the physical sciences is forging ahead at an accelerating rate.

Remember the speed at which a COVID vaccine was brought to market after the first identification of the virus. Instead of the usual 10 to 15 years, the process was suddenly compressed into 18 months.

And yet there remained those who refused the vaccination for a range of personal and behavioural reasons that many would say are irrational.

Somewhere, the technical innovation involved in the hyper-rapid final-stage development of the vaccine and the humanities driving behaviour crashed into each other.

As the rate of technical innovation accelerates across every domain, it is likely we will continue to stumble over this barrier to adoption, and to see adoption fragment across a range of behavioural parameters.

This is simply another social tension driven by the speed at which the modern world is evolving, way beyond the speed at which our DNA allows behaviour and attitudes to evolve.

The situation in front of us right now is the degree and manner in which AI is accepted and adopted by organisations and by individuals.

We managed this dilemma in the motor industry as it became obvious that incorporating safety into vehicles was profoundly important as a means to save lives. As a result, it became mandatory to design crumple zones into cars and to install seat belts: regulatory intervention and oversight, 60 years after it became obvious that a car could kill its occupants.

Where will the equivalent crumple zone emerge in the arena of AI, and will it be in time?

Lean thinking drives AI prompt development


‘Lean thinking’ is a mindset and toolbox to drive optimisation. Little more, beyond the use of common sense and humanity.

Prominent amongst the tools, and the one I probably use the most, is ‘5 why’.

AI has given us an entirely new use case that leverages the insights a 5 why process, done thoughtfully, can deliver.

Prompt development.

There are now hundreds of prompt templates and mnemonics coming out of the woodwork, many claiming to be ‘the one’.

All I have seen use a variation of the Lean ‘5 why’ tool.

Most AI users look at the first output of a prompt in any of the LLM tools, and it is sub-par: generic recitations of what the trained information base reflects as best practice. The beauty of these data-driven assistants is that you can push back as much as you like without them taking it personally.

You can point out areas of failure, misinformation, gobbledy-gook, or imagined fairy tales. You can ask for specifics, deeper analysis, sources, or give it examples. The output then improves with each iteration.

You can also ask it what you might have forgotten to ask, or what has been missed for some reason, and ask for suggestions. This interrogation of the tool can reveal things you would not have thought of under normal circumstances.

Go through that process five times and, in all likelihood, you will not only have something entirely different from the first response, it will also be infinitely better, and tailored to the need. You will have cleared away the unnecessary, the banal, the insignificant, and the generic, leaving a response that equates to a first-principles answer to your evolved prompting.
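For readers who drive their AI tools through code rather than a chat window, the loop described above can be sketched in a few lines. This is a minimal illustration only: the `ask` function is a hypothetical placeholder standing in for whichever LLM API you use, not a real library call, and the critiques are invented examples.

```python
# A minimal sketch of the iterative '5 why' refinement loop described above.
# ask() is a hypothetical stand-in for whichever LLM API you use; here it
# just returns a canned reply so the loop structure can be demonstrated.

def ask(conversation):
    """Stand-in for an LLM call: returns a reply reflecting how much
    conversation history (prompt, responses, critiques) it has seen."""
    return f"Draft shaped by {len(conversation)} turns of dialogue."

def five_why_refine(initial_prompt, critiques):
    """Send the prompt, then push back once per critique, carrying the
    whole conversation forward each iteration."""
    conversation = [initial_prompt]
    response = ask(conversation)
    for critique in critiques:            # one 'why' per pass
        conversation.append(response)     # keep the model's last answer
        conversation.append(critique)     # point out gaps, ask for depth
        response = ask(conversation)
    return response

final = five_why_refine(
    "Draft a value proposition for an SME metal fabricator.",
    [
        "Why is this generic? Name the specific customer problem.",
        "Why should a buyer believe it? Add evidence or an example.",
        "Why this structure? Lead with the outcome, not the process.",
        "What have I forgotten to ask? Suggest missing angles.",
        "Strip anything banal or generic; keep only the essentials.",
    ],
)
print(final)
```

The point is the shape, not the code: each pass feeds the previous answer plus a pointed critique back in, so the fifth response has the full history of pushback behind it.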

Continuous improvement by AI driven lean thinking.

What a boon!