To a computer, a picture is a set of pixels, each of which has a brightness value and a color value. For the machine to understand what is displayed, special algorithms are needed. First, boundaries between candidate objects are found in the picture. Various methods are used for this, including the difference of Gaussians (DoG) algorithm: the original image is Gaussian-blurred several times, each time with a different filter, and the resulting images are compared to identify the most contrasting regions, which most often turn out to be boundaries between objects. Afterwards, the important locations of the picture are converted into numerical form. The output is a set of numbers called a descriptor.
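The difference-of-Gaussians step described above can be sketched as follows. This is a minimal illustration, not a production detector: the image, filter widths (`sigma_small`, `sigma_large`), and the synthetic test square are all chosen for demonstration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def difference_of_gaussians(image, sigma_small=1.0, sigma_large=2.0):
    """Blur the image twice with Gaussian filters of different widths
    and subtract. Regions where the two blurred versions differ most
    are high-contrast areas, typically edges between objects."""
    blur_small = gaussian_filter(image.astype(float), sigma_small)
    blur_large = gaussian_filter(image.astype(float), sigma_large)
    return blur_small - blur_large

# Synthetic test image: a bright square on a dark background.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0

dog = difference_of_gaussians(img)
# The response is strongest near the square's edges and near zero
# in flat regions (both inside and outside the square).
```

In practice the same image is blurred at many scales, and each pair of adjacent scales is subtracted, so contrasting structures of different sizes can all be picked up.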
In this way a graphical image is recorded in numerical form. By comparing descriptors, a picture can be matched to its analogue. Various algorithms are used to create descriptors for computer vision tasks. Descriptors are represented by long strings of numbers, and if the full representation were used, matching them would require a lot of computing resources. Therefore, to speed up computation, descriptors are grouped into clusters; this operation is called clustering. After clustering, the main work shifts to the cluster level, and this transformation from descriptors to clusters is called quantization.
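The quantization step can be sketched like this: each descriptor vector is replaced by the index of its nearest cluster centre. The 2-D "descriptors" and hand-picked centres below are toy values for illustration; real descriptors are much longer, and the centres would come from a clustering algorithm such as k-means.

```python
import numpy as np

def quantize(descriptors, centers):
    """Replace each descriptor vector by the index of its nearest
    cluster centre."""
    # distances[i, j] = Euclidean distance from descriptor i to centre j
    distances = np.linalg.norm(
        descriptors[:, None, :] - centers[None, :, :], axis=2)
    return distances.argmin(axis=1)

# Toy example: two hand-picked cluster centres in 2-D.
centers = np.array([[0.0, 0.0], [10.0, 10.0]])
descs = np.array([[0.5, -0.2], [9.7, 10.1], [0.1, 0.3]])
print(quantize(descs, centers))  # -> [0 1 0]
```

After this step, an image is no longer a set of long number strings but a short list of cluster indices, which is far cheaper to store and compare.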

The cluster number itself serves as the quantized descriptor. Quantization significantly speeds up data processing. The computer recognizes objects and can compare images by working with the quantized descriptors. In the example above, the machine determines which clusters indicate the presence of a given object (a traffic light). Once these key markers are found, further image recognition poses no problem for computer vision. Duplicates are likewise searched for using the clusters of the loaded images. This approach to image processing is by no means the only one; there are other computer vision techniques that rely on different algorithms to make the system function.
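Comparing images through their quantized descriptors is often done by counting how many times each cluster index occurs and comparing the resulting histograms (the "bag of visual words" idea). The vocabulary size and the index lists below are made-up toy data for the sketch.

```python
import numpy as np

def bow_histogram(word_ids, vocab_size):
    """Normalised histogram of cluster-index ('visual word') counts."""
    h = np.bincount(word_ids, minlength=vocab_size).astype(float)
    return h / h.sum()

def similarity(h1, h2):
    """Cosine similarity between two histograms; ~1.0 means a match."""
    return float(h1 @ h2 / (np.linalg.norm(h1) * np.linalg.norm(h2)))

# Three images whose descriptors were quantized into a 4-word vocabulary.
img_a = np.array([0, 0, 1, 2, 2, 2])
img_b = np.array([0, 2, 2, 1, 0, 2])  # same word counts, different order
img_c = np.array([3, 3, 3, 3, 1, 3])  # mostly different words

ha, hb, hc = (bow_histogram(w, 4) for w in (img_a, img_b, img_c))
# img_a and img_b have identical histograms: similarity is ~1.0,
# so they would be flagged as duplicates; img_c scores much lower.
```

This is how duplicate search over clusters can work in principle: two pictures whose cluster histograms nearly coincide are treated as copies of each other, regardless of the order in which the features were found.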