The human eye is an incredibly complex organ. It works much like a camera: the cornea, pupil, iris and retina together focus and capture an image that is sent to the brain via the optic nerve, all in around a tenth of a second. It can distinguish an estimated 10 million colours and, under the right conditions, can pick out people more than a mile away. In other words, it’s a very cool piece of ‘kit’.
In the animal kingdom, eyes have even more impressive capabilities. Seeing over distances of more than two miles is not uncommon; many creatures have excellent night vision (dogs and cats, for starters); many insects have compound eyes giving almost 360-degree coverage; the list goes on.
So – inevitably – we humans have studied and leveraged these visual abilities to develop artificial, or computer-driven, vision systems. A (very) brief history:
- 1960s – the first computer vision research begins
- 1970s – first studies of vision algorithms such as optical flow and motion estimation
- 1980s – rigorous mathematical analysis is added to the research ‘mix’
- 1990s – first advanced 3D reconstructions and first facial recognition technology
Over the last 25 years or so progress has been steady, accelerated by the smartphone camera wars, but it is only in the last five years or so that things have taken an exponential leap.
The use of AI in the development of vision systems has driven huge growth across multiple applications, drone technology being perhaps the best example. In agriculture, vision systems are deployed for precision farming: identifying pests, monitoring crop health and so on.
These systems work by using cameras and sensors to capture images, which are then processed with machine learning algorithms to identify patterns and make decisions. Unlike the human eye, AI vision systems can process images across multiple spectrums, such as infrared, providing information beyond what the human eye can see.
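As a small illustration of how the infrared spectrum adds information the eye cannot see, the sketch below computes NDVI (Normalized Difference Vegetation Index), a standard index used in precision farming to monitor crop health: healthy vegetation reflects strongly in near-infrared and absorbs red light. The band values and the 0.3 threshold are illustrative assumptions, not from the article; a real system would read the bands from a multispectral camera.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red), in the range [-1, 1]."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Synthetic 4x4 "image": two spectral bands a multispectral camera
# might capture (values are made up for illustration).
nir_band = np.array([[0.8, 0.8, 0.2, 0.2]] * 4)
red_band = np.array([[0.1, 0.1, 0.3, 0.3]] * 4)

index = ndvi(nir_band, red_band)

# A simple decision rule: low NDVI may indicate stressed crops.
# The 0.3 cut-off is an assumed example threshold.
stressed = index < 0.3
print(f"{stressed.sum()} of {stressed.size} pixels flagged as stressed")
# → 8 of 16 pixels flagged as stressed
```

In practice the same pattern scales up: capture per-pixel spectral bands, derive an index, then feed the result (or the raw bands) into a learned model rather than a fixed threshold.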
As such, the future looks exciting: using AI to augment our existing capabilities should have far-reaching positive effects, and in the world of agriculture it could mean the difference between feeding the world or not.