With the rapid advance of information technologies and the digital world, data are increasingly shared over the Internet. At the same time, unauthorized access to private information has become a serious problem in the virtual world, and security issues are increasingly coming to the forefront. Encryption and watermarking have become the most effective ways to protect against such attacks. However, traditional encryption standards, such as DES and AES, are generally designed for encrypting text with low correlation between symbols, and are therefore considered unsuitable for image and video data. A newer vision of image encryption uses chaotic sequences for encryption key generation, first proposed by Fridrich in 1998. Since images began to be captured and processed in digital form, their applications have grown steadily; they are now used by a wide public, both professional and amateur. However, given the extent of computing resources allowing the free circulation of information and the ease with which confidential data can be transmitted, encryption algorithms must be continually improved to keep such data secure. To resist known attacks, any new encryption system must follow Shannon's recommendations (permutation, confusion, diffusion). The majority of techniques use static permutations such as Arnold's technique or the improved Hill technique. For confusion, the XOR operator is the most widely used. Recently, in order to resist differential attacks, most algorithms vary the encryption from block to block. Given the advances in mathematical theory, all these methods generate their encryption keys from chaotic maps.
Chang'e Dong offers a color image encryption based on the construction of a coupled chaotic map. Xing-Yuan Wang, Sheng-Xian Gu and Ying-Qian Zhang proposed a cryptosystem based on a combination of chaotic maps that yields effective results. All these approaches use a Lyapunov exponent calculation to verify that chaos is established and that the system is sensitive to initial conditions. Most block encryption algorithms use the Feistel scheme with several rounds; RC4, RC6 and DES use more than four rounds. The classical Feistel technique consists in splitting a 2n-bit block into two blocks of n bits each; this classical method is summarized in the scheme of the figure below.
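The classical Feistel construction described above can be sketched as follows. This is a minimal illustrative example: the round function `f` and the round keys are placeholders, not the paper's construction.

```python
# Sketch of a classic Feistel network: a 2n-bit block is split into two
# n-bit halves (left, right) and the same rounds, run in reverse order,
# invert the cipher regardless of the round function used.
def feistel_encrypt(left, right, round_keys, f):
    for k in round_keys:
        left, right = right, left ^ f(right, k)
    return left, right

def feistel_decrypt(left, right, round_keys, f):
    for k in reversed(round_keys):
        right, left = left, right ^ f(left, k)
    return left, right

# Toy 32-bit pseudo-random round function (hypothetical placeholder).
f = lambda x, k: (x * 2654435761 + k) & 0xFFFFFFFF

keys = [0x1234, 0xBEEF, 0xCAFE, 0x5678]
L, R = feistel_encrypt(0xDEADBEEF, 0x01234567, keys, f)
```

Note that decryption never needs the inverse of `f`; this is what makes the Feistel scheme attractive for building bijections from arbitrary pseudo-random functions.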
This figure can be described by the following evaluation function. Let t denote the number of blocks to be encrypted.
(fi) is an n-bit pseudo-random function.
In the absence of diffusion, this method remains exposed to differential attacks. As a result, the scheme was extended by constructing a bijection from purely random functions, producing a new encryption scheme that encapsulates confusion and diffusion. Genetic algorithms are based on the Darwinian evolution of biological populations, in which the strongest individuals are the most likely to survive and produce strong offspring. These algorithms have emerged as optimization tools guided by an evaluation (fitness) function. They are based on the following genetic operations: inversion, crossover, mutation and insertion. Several attempts to apply genetic algorithms to color image encryption have appeared [9-10]; some use DNA sequences for image encryption, while others have used genetic algorithms to improve conventional encryption systems.
II. THE PROPOSED METHOD
Based on chaos, this technique implements one enhanced Feistel round followed by a genetic crossover. This new color image encryption scheme is organized around six main axes. All these steps are shown in a schematic diagram in the following figure.
All the encryption parameters necessary for the execution of our system are generated from three chaotic maps, among the most frequently used in color image encryption. This choice is due to the simplicity of their implementation and configuration, as well as their extreme sensitivity to the initial parameters.
The logistic map is a recurrent sequence described by a simple second-degree polynomial, defined by the following equation.
This map exhibits chaotic behavior when its control parameter lies approximately in the interval ]3.57, 4].
It is a piecewise linear real sequence defined by the equation below.
It is a very simple map to use in color image cryptography. It exhibits chaotic behavior for the control parameter d ∈ [0.5, 1] and the initial condition w0 ∈ ]0, 1[.
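The two chaotic sequences above can be sketched as follows. The parameter values in the usage line are illustrative, and the piecewise linear map is given in one common (skew tent) form, assumed here since the paper's exact equation is not reproduced.

```python
# Logistic map: x_{k+1} = mu * x_k * (1 - x_k), chaotic for mu near 4.
def logistic_sequence(x0, mu, n):
    seq = [x0]
    for _ in range(n - 1):
        seq.append(mu * seq[-1] * (1 - seq[-1]))
    return seq

# Piecewise linear chaotic map (skew tent form, assumed): control
# parameter d and initial condition w0 in ]0, 1[.
def pwlcm_sequence(w0, d, n):
    seq = [w0]
    for _ in range(n - 1):
        w = seq[-1]
        seq.append(w / d if w < d else (1 - w) / (1 - d))
    return seq
```

Both sequences remain in [0, 1], which makes them convenient sources of permutations, shifts and padding bytes after suitable quantization.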
Before the original image is transferred to the encryption stage, it must be prepared; this preparation includes the following steps.
After extracting the three (RGB) color channels and converting them into vectors (Vr), (Vg), (Vb) of size (1, nm) each, a concatenation is performed to generate a vector X(x1, x2, …, x3nm) of size (1, 3nm).
The resulting vector X(x1, x2, …, x3nm) must be divided into 240-bit blocks (30 pixel values), so its size must be adjusted. Let (l) be the new size, computed by the algorithm below.
Next, the vector (X) is transformed into a vector (TX) of size (1, l) by appending 30 − r new chaotic components at the end, by applying the algorithm below.
The vector (TX) is converted to binary and then reshaped into a binary matrix (MC) of size (t, 240), with (t = l/30). This decomposition is illustrated by the following figure.
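The preprocessing pipeline just described can be sketched as follows. The chaotic padding bytes are stubbed with a fixed deterministic generator here; in the paper they come from the chaotic maps.

```python
import numpy as np

# Preprocessing sketch: channel extraction, concatenation into X,
# padding to a multiple of 30 bytes (= 240 bits), and conversion to
# the (t, 240) binary matrix MC with t = l / 30.
def preprocess(img):                       # img: (n, m, 3) uint8 array
    vr = img[..., 0].ravel()
    vg = img[..., 1].ravel()
    vb = img[..., 2].ravel()
    X = np.concatenate([vr, vg, vb])       # size 3*n*m
    r = X.size % 30
    if r:                                  # append 30 - r chaotic bytes (stubbed)
        pad = (np.arange(30 - r) * 97 % 256).astype(np.uint8)
        X = np.concatenate([X, pad])
    MC = np.unpackbits(X).reshape(-1, 240)  # each row is one 240-bit block
    return MC
```

Since l is a multiple of 30 bytes, the bit stream length is a multiple of 240 and the reshape is always exact.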
Finally, the initialization vector (IV) of size (1, 240) is produced by the following algorithm.
To overcome the uniform-image problem (all black, all white, …), the vector (IV) is combined with the chaotic vector (HT) specified by the following algorithm.
The blending of the two vectors is performed by the next algorithm.
The role of this vector is only to modify the value of the first block and to start the confusion-diffusion process.
(NB): In the absence of such an initialization vector, it is very difficult to follow the encryption scheme correctly.
Each block MC(k:) of order (k) is subdivided into four equal 60-bit blocks [MG(k:) GM(k:) MD(k:) DM(k:)] and passed through a first enhanced Feistel round, described by the following figure.
This pattern can be analytically expressed by the following mathematical statement.
Initially, a descending sort on the first 60 values of the logistic sequence generates a permutation (PR) in G60. Secondly, a chaotic vector (DP) is constructed in parallel and serves to build the permutation matrix (MP).
The first row of the permutation matrix (MP) is the permutation (PR), while row (i ≥ 2) is row (i−1) shifted by a step of DP(i). This construction is described by the following algorithm.
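This construction can be sketched as follows. The chaotic inputs are stubbed with simple deterministic values; in the paper they come from the logistic map and the vector DP.

```python
import numpy as np

# Permutation-matrix sketch: row 0 is the ranking permutation PR
# induced by a descending sort of 60 chaotic values; each later row
# is the previous row circularly shifted by DP[i].
def build_mp(chaos, dp, rows):
    pr = np.argsort(-np.asarray(chaos[:60]))      # descending-sort permutation
    mp = np.empty((rows, 60), dtype=int)
    mp[0] = pr
    for i in range(1, rows):
        mp[i] = np.roll(mp[i - 1], dp[i])         # shift of step DP(i)
    return mp
```

Every row of (MP) remains a permutation of {0, …, 59}, since circular shifting only reorders the entries.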
Then, the application of the permutation hk to the block (MD(k:)) is given by the next algorithm.
gk is a chaotic offset applied to the 60 bits of the block MG(k:). This offset is driven by the chaotic vector (DD), obtained by applying the next algorithm.
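A chaotic offset of this kind amounts to a circular shift of the 60-bit block; a minimal sketch, with the shift amount (drawn from DD in the paper) passed in directly:

```python
# Circular left shift of a bit list by `offset` positions; applying a
# shift of len(bits) - offset undoes it, so the operation is invertible.
def rotate_bits(bits, offset):
    offset %= len(bits)
    return bits[offset:] + bits[:offset]
```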
The analytical expression of such a displacement is illustrated by the following algorithm.
The transformation of the block (MC(k:)) by the first-round scheme is given by the algorithm below.
We affirm that the first-round function is a bijection; its inverse is given by the following equation.
A genetic crossover is a pseudo-random function applied to two genes of the same size to form another gene of double size. In our approach, it is a chaotic crossover between two 120-bit vectors that generates a 240-bit block. First, for each block (k), two chaotic vectors (HR) and (RH) are generated by the following algorithm.
The crossover function in our system is defined by the following mathematical formula.
The vector (Q) is obtained by the following formula.
The genetic crossover function (Cr) defined by equation 7 is a bijection. Indeed, we have:
Let (∅) denote the encryption function of the clear image (MC); we have
The encryption process can be illustrated by the following diagram.
This new color image encryption technique follows the steps of the algorithm below.
1. Clear image vectoring
2. Adapt the size of the image vector
3. Split into t 240-bit blocks
4. Extract the initialization vector
5. Confusion with the first block
6. Perform a genetic crossover on the two output blocks to get the block (IS)
7. Set (IV) = (IS)
8. Set k = k + 1
9. If k ≤ t then set MC(k−1:) = MC(k:)
10. Return to 5
11. If k > t then restore the encrypted image
Our approach is a symmetric chaos-based encryption system, so the secret encryption key is also the decryption key. After decomposing the encrypted image into 240-bit blocks and regenerating all encryption parameters, the decryption process starts with the last block by applying the inverse round function up to the second block; then the initialization vector is recalculated to retrieve the first block and restore the original image.
A good cryptosystem must withstand all known attacks. For each statistical metric, 150 images were randomly selected, according to a chaotic vector, from a database of color images of different sizes and formats and tested by our algorithm, and a detailed statistical study was developed.
If the computing precision is 10 decimal digits, then the size of the encryption key space in our approach is 10^60 ≈ 2^180 ≫ 2^100, which is more than enough to protect our method from brute-force attacks.
The high sensitivity of the encryption keys used in our system means that a very slight alteration of the encryption key automatically leads to an image completely different from the original. This is illustrated by the scheme in Figure 12.
We note that a change of 10^−12 in a single encryption parameter makes it impossible to restore the clear image by the same decryption process.
Entropy measures the disorder emitted by a memoryless source. Entropy is therefore maximal for a source whose symbols are all equiprobable, i.e. one presenting a flat histogram. For an image (MC) of size (n, m), setting (t = nm), the entropy is given by
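For an 8-bit channel this is the standard Shannon entropy H = −Σ p_i · log2(p_i), maximal at 8 bits for a flat histogram; a minimal sketch:

```python
import numpy as np

# Shannon entropy of an 8-bit image: estimate the symbol probabilities
# from the histogram, then H = -sum(p * log2(p)) over nonzero bins.
def entropy(img):
    hist = np.bincount(np.asarray(img, dtype=np.uint8).ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```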
The entropy values on the 150 images tested by our method are represented graphically by the following figure.
The entropy values of the images encrypted by our algorithm are around 8, the maximum value for a color image encoded on 8 bits per channel. This confirms the uniformity of the histograms and proves that this approach is safe from entropy attacks.
The entropy values obtained by applying our approach to the 150 images in our database constitute a statistical series whose position, dispersion and concentration parameters have been computed to verify the safety of our approach.
The box-and-whisker plot of the entropy is illustrated in the figure below.
The Yule coefficient measures the asymmetry of the frequency curve of a statistical series. It is given by the following equation.
Under these conditions, Yule has demonstrated that
In our entropy study, we found
We note that s ≈ 0.
We can say that the frequency curve is symmetrical.
Flatness is judged by reference to the density curve of the normal distribution. We say that the frequency curve is more or less flat than the normal model.
The coefficient that quantitatively measures flatness is the kurtosis. Pearson proposed the following coefficient:
Under these conditions, Pearson has demonstrated that
In our entropy study, we found
We note that β2 ≈ 3.
We can say that our distribution is a normal distribution.
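Pearson's coefficient is β2 = μ4/σ⁴, equal to 3 for a normal distribution, which is the benchmark used above; a minimal sketch:

```python
import numpy as np

# Pearson kurtosis beta_2 = mu_4 / sigma^4 (beta_2 = 3 for a normal
# distribution; a uniform distribution gives beta_2 = 1.8).
def kurtosis(values):
    v = np.asarray(values, dtype=float)
    m = v.mean()
    return float(((v - m) ** 4).mean() / (v.var() ** 2))
```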
Correlation is a technique that compares two images to estimate the displacement of pixels in one image relative to a reference image. Adjacent pixels of a typical clear image are strongly correlated. A good image cryptosystem must remove this correlation in order to resist statistical attacks. The correlation expression is defined by the equation below.
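A minimal sketch of the adjacent-pixel correlation test (horizontal direction; the vertical and diagonal variants only change the slicing):

```python
import numpy as np

# Pearson correlation of horizontally adjacent pixel pairs: close to 1
# for a natural image, close to 0 for a well-encrypted one.
def adjacent_correlation(img):
    x = np.asarray(img, dtype=float)[:, :-1].ravel()
    y = np.asarray(img, dtype=float)[:, 1:].ravel()
    return float(np.corrcoef(x, y)[0, 1])
```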
Simulations were performed on 100 color images of identical size, chosen from a large database of images of various sizes and formats; the correlation values are represented graphically in the next figure.
Simulations on 150 images of the database gave the vertical correlation scores displayed in the figure below.
Figure 12 shows that the vertical correlation values of the encrypted images are close to zero. This ensures high security against correlation attacks.
Simulations on 150 images of the database gave the diagonal correlation scores displayed in Figure 13.
Figure 13 shows that the diagonal correlation values of the encrypted images are close to zero. This ensures high security against correlation attacks.
Let (C1) and (C2) be two encrypted images whose corresponding clear images differ by only one pixel. The expressions of the two statistical constants (NPCR) and (UACI) are given by equations 12 and 13, for an image of size (n, m).
The NPCR mathematical analysis of an image is given by the equation below.
The UACI mathematical analysis of an image is given by the equation below.
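The two differential metrics can be sketched directly from their standard definitions: NPCR is the percentage of pixel positions that differ, and UACI the mean absolute intensity difference normalized by 255.

```python
import numpy as np

# NPCR: fraction of positions where the two cipher images differ.
def npcr(c1, c2):
    return float((c1 != c2).mean() * 100)

# UACI: mean |C1 - C2| / 255, expressed as a percentage.
def uaci(c1, c2):
    diff = np.abs(c1.astype(float) - c2.astype(float))
    return float((diff / 255).mean() * 100)
```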
The study of the 150 selected images revealed the following diagram.
All detected values lie inside the confidence interval [99.63, 99.95]. These values are largely sufficient to affirm that our cryptosystem is protected from known differential attacks.
The study of the 150 selected images revealed the following diagram.
All detected values lie inside the confidence interval [33.34, 33.35]. These values are largely sufficient to affirm that our cryptosystem is protected from known differential attacks.
The avalanche effect is a required property in virtually all cryptographic hash functions and block ciphers. It causes progressively larger changes as the data propagates through the structure of the algorithm. Therefore, by flipping a single bit at the input, we obtain a very different output (about 1 bit out of 2 changed), which explains the name of this phenomenon. The avalanche effect makes the function more difficult to invert thanks to its chaotic properties (if well designed).
This constant measures the avalanche impact of the cryptographic structure in place. It is estimated by the next equation.
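A minimal sketch of the avalanche-effect score: the percentage of bits that differ between two cipher images produced from inputs differing by a single bit.

```python
import numpy as np

# AE score: unpack both images to bit level and count differing bits.
def avalanche(c1, c2):
    b1 = np.unpackbits(np.asarray(c1, dtype=np.uint8).ravel())
    b2 = np.unpackbits(np.asarray(c2, dtype=np.uint8).ravel())
    return float((b1 != b2).mean() * 100)
```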
Figure below depicts the evaluation of the AE score for 150 images examined by our approach.
All AE values returned by our method lie in the narrow range [73.96, 74.02]. This guarantees that a one-bit change in the clear image is reflected by a change of roughly 74% of the encrypted image's bits.
Image quality estimation based on pixel changes was performed by computing the PSNR and MSE values. These are the error metrics used to compare the clear image and the cipher image.
Mean Square Error (MSE): the cumulative squared deviation between the original image and the noisy image. A lower MSE means a lower error.
This constant measures the distance between the pixels of the clear image and those of the encrypted image. It is calculated by the next equation.
(P(i,j)) : pixel of the clear image
(C(i,j)) : pixel of the cypher image
The peak signal-to-noise ratio, often abbreviated PSNR, is an engineering term for the ratio between a signal's maximum possible power and the power of the noise that corrupts its representation. Since many signals have a very wide dynamic range, the PSNR is usually expressed on the logarithmic decibel scale. The PSNR mathematical analysis of an image is given by the next equation.
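The two metrics can be sketched directly from their standard definitions for 8-bit images (MAX = 255):

```python
import numpy as np

# MSE: mean squared pixel difference between plain image P and cipher C.
def mse(p, c):
    return float(((p.astype(float) - c.astype(float)) ** 2).mean())

# PSNR in decibels: 10 * log10(MAX^2 / MSE), infinite for identical images.
def psnr(p, c):
    m = mse(p, c)
    return float('inf') if m == 0 else float(10 * np.log10(255 ** 2 / m))
```

For encryption, a low PSNR between plain and cipher image is desirable, since it indicates that the cipher image retains little of the original signal.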
For RGB color images, the definition of PSNR is the same except that the MSE is summed over all squared value differences. Alternatively, the image is transcoded into a different color space and the PSNR is reported for each channel of that space. For encryption, acceptable PSNR values are real numbers in the interval (5, 10). Simulations on over 150 images of various sizes and formats returned the results depicted in Figure 17.
All PSNR values returned by our method lie in the narrow range [8.99, 8.993]. A statistical study of the dispersion of the PSNR values of the 150 images analyzed by our algorithm gives the scores presented in the following Table 1:
This table shows a low dispersion and a high concentration of values within an interval of length 0.001. Moreover, the value of s close to 0 indicates that the dispersion is symmetric, and the value of β2 close to 3 shows that it follows a normal distribution. More than 50% of the values lie within a range of length less than 0.0006.
When the traditional DES and AES encryption algorithms operate in ECB mode, they are vulnerable to statistical attacks and chosen-plaintext attacks. In addition, these two systems impose no chaining between clear and encrypted blocks and are consequently weak in the face of differential attacks. In this sense, we compare the time complexity of our system against these two cryptosystems on reference images. Besides the security parameters, runtime is an important factor in evaluating the performance of an image encryption system. To validate the time performance of our methodology, we selected the "Lena" grayscale image at three different sizes: (256×256), (512×512) and (1024×1024). The results are presented in the table below.
Comparing our results with the two classical algorithms AES and DES, as well as with classical Hill and improved Hill, we can affirm that the execution time is reasonable. The test was performed on other images of different sizes, and we obtained acceptable scores. This is due to the low complexity of the algorithms implemented in our strategy.
The large size of our encryption key ensures that the system is protected against any brute-force attack. At the same time, the randomness of the genetic operator and of the Feistel round functions makes it difficult to break the encryption applied to a given block, increasing the difficulty of statistical attacks. In addition, the high sensitivity of our three chaotic maps to their initial parameters, together with the statistical constants computed in simulation, makes it difficult to reconstruct the encryption key.
Taking security as its primary objective, this paper develops a new encryption framework for color images of arbitrary size. Based on chaos, this technique operates on 240-bit blocks, applying an improved Feistel round accompanied by a genetic crossover, followed by chaining to provide protection against known attacks. Simulations performed on more than 150 images randomly selected from a large database of color images of different sizes and formats confirm the robustness of our system.