
A blog by Rudolf de Vetten, Chief Product Officer at Blue Radix.

Two weeks ago, we had an interesting discussion with three growers at our office about autonomous growing with Crop Controller. We discussed how they experience working with AI and what it means to them as growers to trust an algorithm. Are you surrendering to a black box? To me, the term black box has a negative connotation. You don’t want to trust something you don’t understand. You would rather have a white box: a fully transparent technique whose every step you can follow.

The question is: does it work that way? Do you only trust technology once you fully understand how it works? You start your car and before you know it the wheels are turning. Some of this process you understand, but much of it is complex technology you know nothing about. Yet you entrust your life to it.

The basic principle at Blue Radix is that we develop algorithms that people find easy to use and that add real practical value. Our team consists of many specialists with a mathematical background and in-depth knowledge of algorithms. To do their job well, they work closely with growers, crop specialists and colleagues who specialize in developing user-friendly screens. This interaction produces solutions that are not only technically ingenious, but also explainable and usable by growers in practice. This is not an afterthought but lies at the heart of every development we do.

We avoid the black box effect with various interventions: by breaking the grower’s problem down into sub-steps, by providing insight into the data that goes into and comes out of an algorithm, and by giving the grower influence, always taking their own strategy as the starting point. At the same time, we keep looking for further improvements that enable results people could not achieve on their own.

In addition, algorithms offer many opportunities to gain insight into how a decision is made. This often yields surprising insights, including for the grower. As humans, we often think a certain factor is crucial, while data analysis shows it plays only a limited role. Rain is a good example: we recently analyzed the effect of rain on greenhouse temperature. Rain has an effect, but it is very subtle: rain mainly affects the outside temperature, which in turn affects the inside temperature. Intuitively, everyone says that rain has a direct relationship with greenhouse temperature. Your own intuition is certainly not always easy to explain, let alone always correct.
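
To make that indirect effect concrete, here is a minimal sketch in Python on synthetic data. The numbers and the simple model are invented for illustration and are not Blue Radix’s actual analysis; the point is only that once outside temperature is included as a factor, rain’s apparent influence on the inside temperature largely disappears.

```python
# Illustrative sketch only: synthetic data with made-up coefficients.
# It shows how a simple regression can reveal that rain's effect on
# greenhouse temperature is indirect, running through the outside air.
import numpy as np

rng = np.random.default_rng(42)
n = 1000

rain = rng.binomial(1, 0.3, n)                     # 1 = raining (assumed)
outside = 18 - 4 * rain + rng.normal(0, 2, n)      # rain cools the outside air
inside = 21 + 0.5 * outside + rng.normal(0, 1, n)  # inside tracks outside only

def ols(features, y):
    """Ordinary least squares: returns [intercept, coefficients...]."""
    X = np.column_stack([np.ones(len(y)), *features])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Model 1: inside temperature on rain alone -> rain looks influential.
print("rain only:     ", ols([rain], inside))

# Model 2: add outside temperature -> the rain coefficient collapses
# toward zero, because its effect runs through the outside temperature.
print("rain + outside:", ols([rain, outside], inside))
```

In the first model rain appears to lower the inside temperature noticeably; in the second, its coefficient shrinks toward zero while the outside temperature carries the effect. The data changed nothing; only the question asked of it did.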

The question around the explainability of AI models is valid. Simply accepting that AI is a black box is an unnecessary compromise. That was also one of the conclusions in our conversation with the growers. A good balance between explainability, interest and trust based on results is essential. In this way, working with algorithms becomes accessible to every grower. At the same time, it is good to ask yourself: ‘To what extent am I a black box myself?’