DES
The Data Encryption Standard (DES) is a symmetric-key algorithm used to encrypt electronic data. The algorithm was developed in the 1970s at IBM as an improvement of an earlier design by Horst Feistel. The modified version was approved by the National Bureau of Standards in consultation with the National Security Agency (Kumar and Srivastava 38). The original design was strengthened against differential cryptanalysis.
DES is a block cipher that transforms a fixed-length plaintext into a ciphertext bit string under a key, so that only parties who know the key used in encryption can decrypt the result. DES can only be used safely within a mode of operation. The algorithm is exposed to brute-force attacks. However, the level of exposure is minimal since the intruder must know a series of specified chosen plaintexts (Kumar and Srivastava 39).
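Since the paragraph above describes DES only at the level of its block structure, a minimal Python sketch of a generic Feistel network may clarify how a keyed, invertible block transformation arises. This is a toy illustration: the round function and round keys below are placeholders, not the real DES primitives.

```python
# A toy Feistel network illustrating the structure DES instantiates.
# The round function and round keys are illustrative placeholders,
# not the real DES expansion, S-boxes, or key schedule.
HALF_MASK = 0xFFFFFFFF  # 32-bit halves of a 64-bit block

def toy_round_function(half: int, round_key: int) -> int:
    # Placeholder mixing step; real DES applies expansion, S-boxes,
    # and a permutation here.
    return ((half * 2654435761) ^ round_key) & HALF_MASK

def feistel_encrypt(block: int, round_keys: list[int]) -> int:
    left, right = (block >> 32) & HALF_MASK, block & HALF_MASK
    for key in round_keys:
        left, right = right, left ^ toy_round_function(right, key)
    return (left << 32) | right

def feistel_decrypt(block: int, round_keys: list[int]) -> int:
    left, right = (block >> 32) & HALF_MASK, block & HALF_MASK
    for key in reversed(round_keys):
        left, right = right ^ toy_round_function(left, key), left
    return (left << 32) | right

keys = [0x0F0F0F0F, 0x12345678, 0x9ABCDEF0]
message = 0x0123456789ABCDEF
assert feistel_decrypt(feistel_encrypt(message, keys), keys) == message
```

Note that decryption simply runs the rounds with the keys reversed, which is why a Feistel cipher is invertible even when its round function is not.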
DES has certification weaknesses and is exposed to other attacks such as linear cryptanalysis, differential cryptanalysis, and Davies’ attack. DES is relatively insecure due to its small 56-bit key size. In fact, the Electronic Frontier Foundation, in collaboration with distributed.net, managed to break a DES key in less than 24 hours in 1999 (Kumar and Srivastava 40).
Moreover, a series of analytical results have demonstrated theoretical weaknesses in the DES cipher. DES was used as a federal standard for unclassified data (Kumar and Srivastava 39). Its usage has spanned more than 30 years, with the latest variant, Triple DES, approved by the government for sensitive information up to the year 2030 (Kumar and Srivastava 40).
Triple DES
Triple DES was first published in 1998 as an improvement of DES. The cipher has a block size of 64 bits and key sizes of 168, 112, or 56 bits (Barker 9). The structure of Triple DES is a Feistel network with 48 DES-equivalent rounds. The algorithm provides three keying options corresponding to these key lengths (Barker 15). The three encryption layers make Triple DES more secure and stable. However, it can only provide an effective security of 112 bits. Triple DES is generally exposed to known-plaintext and chosen-plaintext attacks. Triple DES is used in the electronic payment sector to develop standards such as EMV. For instance, Microsoft Outlook 2007, Microsoft Configuration Manager 2012, and Microsoft OneNote use this algorithm to password-protect system data and user content (Barker 12).
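The three keying options follow from the encrypt-decrypt-encrypt (EDE) composition of single DES. The following sketch shows that composition on one 8-byte block; it assumes the third-party pycryptodome package and is an illustration of the construction, not a complete implementation.

```python
# Sketch of the Triple DES encrypt-decrypt-encrypt (EDE) composition on a
# single 8-byte block, assuming the third-party pycryptodome package
# (pip install pycryptodome). ECB on one block, for illustration only.
from Crypto.Cipher import DES

def tdes_ede_encrypt(block: bytes, k1: bytes, k2: bytes, k3: bytes) -> bytes:
    step1 = DES.new(k1, DES.MODE_ECB).encrypt(block)   # encrypt under K1
    step2 = DES.new(k2, DES.MODE_ECB).decrypt(step1)   # decrypt under K2
    return DES.new(k3, DES.MODE_ECB).encrypt(step2)    # encrypt under K3

# Keying option 1 uses three independent keys (168 bits of key material);
# option 2 sets k3 = k1 (112 bits); option 3 sets k1 = k2 = k3, which
# collapses EDE to single DES and gives backward compatibility.
```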
RIJNDAEL/AES
AES is a specification used in the encryption of electronic data. The algorithm was adopted in the US in 2001 by the National Institute of Standards and Technology. It is a subset of the Rijndael cipher developed by the cryptographers Joan Daemen and Vincent Rijmen in 1998 (Mahaveerakannan and Gnana 31). The algorithm has been adopted by the US government and accepted in other parts of the world.
AES comes in different packages. It is derived from Square and has NSA, NESSIE, AES winner, and CRYPTREC certifications. The key sizes are 128, 192, and 256 bits, and the block size is 128 bits. The algorithm has a substitution-permutation network structure with 10, 12, or 14 rounds, depending on the key size. AES is exposed to side-channel attacks (Taha 9). However, the 10 to 14 rounds make AES less vulnerable. AES is used by the US government for the protection of non-classified documents (Mahaveerakannan and Gnana 31).
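A short sketch of the key-size-to-rounds relationship, together with a minimal authenticated encryption call, assuming the third-party pycryptodome package; the key and message here are purely illustrative.

```python
# Round counts by key size, and a minimal authenticated encryption call,
# assuming the third-party pycryptodome package. Key and message are
# purely illustrative.
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

AES_ROUNDS = {16: 10, 24: 12, 32: 14}  # key length in bytes -> rounds

key = get_random_bytes(32)             # a 256-bit key, hence 14 rounds
cipher = AES.new(key, AES.MODE_GCM)    # GCM: a mode of operation over AES
ciphertext, tag = cipher.encrypt_and_digest(b"non-classified document")
print(AES_ROUNDS[len(key)], cipher.nonce.hex(), tag.hex())
```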
MARS
Introduced in 1991 by Jerome Friedman, MARS (multivariate adaptive regression splines) is a non-parametric regression technique, often seen as an extension of linear models that can automatically model nonlinearities between variables. Model building with MARS occurs in a forward pass phase and a backward pass phase (Chaudhari et al. 441). This makes it easy to use and enables a user to integrate multiple variables at any time. MARS is also very flexible compared to other linear regression models. However, the user is limited in the number of variables to use in the forward pass phase. Moreover, it permits interactions of only first or second degree. The algorithm is used in regression analysis (Chaudhari et al. 443).
RC5
RC5 is a simple symmetric-key block cipher created in 1994 by Ronald Rivest. Its successors are Akelarre and RC6. RC5’s key sizes range from 0 to 2040 bits, while the block sizes are 32, 64, and 128 bits. The structure of RC5 is a Feistel-like network with 1 to 255 rounds (Ramos 9). This makes the algorithm relatively stronger, since encryption involves many rounds, depending on the level of needed security (Mahaveerakannan and Gnana 38). However, 12-round RC5 is vulnerable to differential attacks using 2^44 chosen plaintexts.
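Because RC5’s strength rests on its data-dependent rotations and variable round count, a minimal Python sketch of the RC5-32 encryption loop may be useful. It assumes the key schedule S has already been expanded to 2r + 2 words; key expansion and decryption are omitted.

```python
# Sketch of the RC5-32/r encryption loop (32-bit words), assuming the key
# schedule S has already been expanded to 2r + 2 words; key expansion and
# decryption are omitted.
WORD_MASK = 0xFFFFFFFF

def rotl(x: int, n: int) -> int:
    n &= 31  # rotation amounts are taken modulo the word size
    return ((x << n) | (x >> (32 - n))) & WORD_MASK

def rc5_encrypt_block(a: int, b: int, s: list[int], rounds: int) -> tuple[int, int]:
    a = (a + s[0]) & WORD_MASK
    b = (b + s[1]) & WORD_MASK
    for i in range(1, rounds + 1):
        # Data-dependent rotations: the shift amount comes from the other half.
        a = (rotl(a ^ b, b) + s[2 * i]) & WORD_MASK
        b = (rotl(b ^ a, a) + s[2 * i + 1]) & WORD_MASK
    return a, b
```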
RC6
First published in 1998, RC6 was designed and developed by Matt Robshaw, Ron Rivest, Yiqun Lisa Yin, and Ray Sidney. This algorithm is derived from RC5 and has AES finalist certification. The cipher’s key sizes are 128, 192, and 256 bits, while its block size is 128 bits (Taha 31). RC6 has 20 rounds and functions on a type-2 Feistel network. RC6 is used in NSA implants (Taha 31). For instance, code revealed in 2016 and attributed to the Equation Group included several network-security implants that use RC6 for communication confidentiality. The multiple layers in each round make this cipher strong and relatively stable (Mahaveerakannan and Gnana 27). However, like its predecessor, RC6 is exposed to differential attacks.
Serpent
First published in 1998, Serpent is a symmetric cipher designed and developed by Lars Knudsen, Eli Biham, and Ross Anderson. Serpent is derived from Square and has AES finalist certification. Its key sizes are 128, 192, or 256 bits, while its block size is 128 bits. The structure of Serpent is a substitution-permutation network with 32 rounds (Mahaveerakannan and Gnana 51). Numerous public attack attempts have not succeeded in penetrating the full 32-round cipher; trial attacks in 2011 only managed to break 11 rounds. This makes the cipher secure and relatively dependable. However, an effective XSL attack might weaken Serpent (Graves and Graves 23). The cipher is available for public use, since there are no encumbrances or user restrictions.
Blowfish
Blowfish is also a symmetric-key block cipher, designed and created by Bruce Schneier in 1993. The cipher is known to provide stable encryption in software. Since its creation, there has never been an effective cryptanalysis of it. Blowfish is a general-purpose algorithm that improves on the weaknesses of DES (Mahaveerakannan and Gnana 13). Since it is unpatented, the cipher is available for use across the globe in the public domain. Its successor is Twofish.
Blowfish’s key sizes range from 32 to 448 bits, and its block size is 64 bits. The cipher has 16 rounds and functions on a Feistel network. This makes the cipher difficult to penetrate. However, the first four rounds of Blowfish are vulnerable to a second-order differential attack (Wang et al. 1272). Moreover, the 64-bit block size makes this cipher susceptible to birthday attacks, especially in HTTPS contexts.
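The birthday-attack concern is quantitative: block collisions become likely after roughly 2^(n/2) blocks encrypted under one key, which for a 64-bit block is only about 32 GiB of traffic. A small illustrative computation:

```python
# Block collisions become likely after about 2^(n/2) blocks under one key,
# so small block sizes cap how much data a single key can safely encrypt.
import math

def birthday_threshold_bytes(block_bits: int) -> float:
    blocks = math.sqrt(2.0 ** block_bits)   # ~2^(n/2) blocks
    return blocks * (block_bits // 8)       # convert blocks to bytes

print(birthday_threshold_bytes(64) / 2**30)   # 64-bit blocks: ~32 GiB
print(birthday_threshold_bytes(128) / 2**30)  # 128-bit blocks: ~2^38 GiB
```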
Twofish
Also designed and created by Bruce Schneier, in 1998, Twofish is a symmetric-key block cipher derived from Square, SAFER, and Blowfish. The cipher is related to Threefish and has AES finalist certification. Its key sizes are 128, 192, or 256 bits, while its block size is 128 bits. Twofish has sixteen rounds and is structured as a Feistel network, which makes it secure, as the multilayer keys are difficult to penetrate (Wang et al. 1271). Its distinctive features include a complex key schedule and the integration of a Maximum Distance Separable matrix. Twofish is available in the public domain. The cipher is exposed to an impossible-differential attack capable of breaking the first six rounds.
Threefish
Threefish was first published in 2008 and developed by Bruce Schneier, Jesse Walker, Doug Whiting, and others. The cipher is related to Blowfish and Twofish. Its key sizes are 256, 512, and 1024 bits, with a block size equal to the key size. Threefish has a speed of 6.1 cpb on Core 2. The cipher has a MIX function, and permutation steps change the positions of words according to a preset constant pattern. This makes Threefish secure. However, it is exposed to a rebound attack, which affects its Skein hash function, as established in 2010 (Graves and Graves 24). Moreover, Threefish is susceptible to a boomerang attack, especially in its reduced 32-round version.
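A minimal sketch of the MIX step on 64-bit words may make the add-rotate-XOR pattern concrete. In the real cipher the rotation constant depends on the round and word position; here it is a parameter, and the permutation and key-injection steps are omitted.

```python
# Sketch of the Threefish MIX step on 64-bit words. In the real cipher the
# rotation constant depends on the round and word position; here it is a
# parameter, and the word permutation and key-injection steps are omitted.
M64 = 0xFFFFFFFFFFFFFFFF

def rotl64(x: int, n: int) -> int:
    return ((x << n) | (x >> (64 - n))) & M64

def mix(x0: int, x1: int, rot: int) -> tuple[int, int]:
    y0 = (x0 + x1) & M64          # addition modulo 2^64
    y1 = rotl64(x1, rot) ^ y0     # rotate, then XOR with the sum
    return y0, y1

def unmix(y0: int, y1: int, rot: int) -> tuple[int, int]:
    x1 = rotl64(y1 ^ y0, 64 - rot)   # invert the XOR, then the rotation
    x0 = (y0 - x1) & M64             # invert the modular addition
    return x0, x1
```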
IDEA
First published in 1991 and designed by James Massey and Xuejia Lai, IDEA is a symmetric-key block cipher created as a replacement for DES. The design was sponsored by the Hasler Foundation, and the cipher is currently freely available for non-commercial usage. IDEA replaced the BassOmatic cipher in Pretty Good Privacy (PGP) and is available as an optional algorithm in the OpenPGP standard. IDEA’s successors are MESH, MMB, IDEA NXT, and Akelarre.
Its key and block sizes are 128 bits and 64 bits, respectively. IDEA has 8.5 rounds and a Lai-Massey scheme structure (Graves and Graves 61). Differential cryptanalysis by its designers suggested that IDEA is immune to algebraic and linear weaknesses. However, a key-recovery attempt in 2013 showed that IDEA is vulnerable to an attack of reduced computational complexity using narrow bicliques. Moreover, IDEA’s simple key schedule gives rise to weak keys.
CAST-128
First published in 1996, CAST-128 was designed by Stafford Tavares and Carlisle Adams as a symmetric-key block cipher. It has been used in different products, notably as the default cipher in PGP and GPG. CAST-128’s successor is CAST-256, and the cipher has 12 or 16 rounds (Lobo and Lakshman 16). The key and block sizes are 40 to 128 bits and 64 bits, respectively. This makes the algorithm secure and easy to integrate. However, it is susceptible to differential and brute-force attacks.
CAST-256
First published in 1998, CAST-256 is a symmetric-key block cipher submitted as a candidate in the AES competition. However, it did not make the final list of preferred algorithms. It is derived from CAST-128 and has 48 rounds (Graves and Graves 62). CAST-256’s structure is a generalized type-1 Feistel network with key sizes of 128, 160, 192, 224, and 256 bits. Its block size is 128 bits (Lobo and Lakshman 13). The many rounds make the cipher safe and secure. However, it is known to be vulnerable to zero-correlation cryptanalysis under a secret key.
Camellia
First published in 2000, Camellia is derived from E2 and MISTY1. The cipher was designed by Mitsubishi Electric in collaboration with NTT and has NESSIE and CRYPTREC certifications. The cipher is used for hardware and software implementations ranging from low-cost smart cards to high-speed network systems. Camellia is integrated into Transport Layer Security to offer communication security in computer networks (Graves and Graves 78).
The cipher has a Feistel network structure with key sizes of 128, 192, or 256 bits and a block size of 128 bits. Camellia has 18 or 24 rounds. The cipher is considered safe, modern, and infeasible to penetrate even by brute-force attack (Zhang et al. 14). To date, there has been no successful attack on the cipher. Camellia is endorsed by the Japanese CRYPTREC project, the EU’s NESSIE, and ISO/IEC (Fouda et al. 586).
DEAL
DEAL is a symmetric-key block cipher designed by Lars Knudsen and published in 1998. The cipher is derived from DES and related to Ladder-DES. Its key sizes are 128, 192, or 256 bits, and its block size is 128 bits (Graves and Graves 35). DEAL has 6 or 8 rounds with a nested Feistel network structure. The cipher’s multiple rounds make it safer. However, it is exposed to brute-force and differential attacks, especially with shorter keys (Wang et al. 28).
LOKI97
Designed by Lawrie Brown, Jennifer Seberry, and Josef Pieprzyk, LOKI97 is a symmetric-key block cipher first published in 1998. The cipher is relatively safe and has multiple usages in securing electronic data (Zhang et al. 14). It is exposed to brute-force and differential attacks, especially with shorter keys, although the multiple layers in each round make the cipher relatively secure (Su et al. 244). The encryption algorithm has low reliability, which limits its usage in software and hardware.
DFC
The Decorrelated Fast Cipher (DFC) is also a symmetric-key block cipher, published in 1998 by a group of researchers drawn from France Telecom, CNRS, and Ecole Normale Superieure. DFC is related to COCONUT98 and has 8 rounds. Its key sizes are 128, 192, or 256 bits, and its block size is 128 bits (Graves and Graves 13). The cipher is exposed to timing, differential, and linear attacks due to its low native capabilities with short keys (Lobo and Lakshman 45). However, with longer keys, DFC is secure and allows for many cipher parameter choices, using modified key schedules to phase out weak keys.
MAGENTA
First published in 1998, MAGENTA is a symmetric-key block cipher designed by Klaus Huber and Michael Jacobson. It has 6 or 8 rounds, with key sizes of 128, 192, or 256 bits and a block size of 128 bits. MAGENTA’s structure is a Feistel network (Albers and Mazur 45). The cipher is used for general encryption and to support network telecommunication applications. However, it is slower with short keys and exposed to differential attacks, though it remains relatively secure under specific protocols.
E2
E2 is a 12-round symmetric-key block cipher that was published in 1998 and designed by NTT (Lobo and Lakshman 21). Its successor is Camellia, and it has key sizes of 128, 192, or 256 bits and a block size of 128 bits. Unlike some ciphers, E2 has input and output transformations that use modular multiplication, which broadens its usage (Zhang et al. 14). However, its round function is limited to S-box lookups and XORs. Most of E2’s components have been integrated into Camellia.
CRYPTON
Designed by Chae Hoon Lim and first published in 1998, CRYPTON is a symmetric-key block cipher submitted as a candidate in the AES competition. The cipher is relatively efficient, especially in hardware implementations (Albers and Mazur 19). For instance, Future Systems Incorporation has successfully used the cipher in its hardware. CRYPTON’s round transformation has four steps: byte-wise substitution, column-wise permutation, column-to-row transposition, and key addition. The cipher uses 12 rounds and has a substitution-permutation network structure. It is derived from Square. CRYPTON has key sizes of 128, 192, or 256 bits and a block size of 128 bits (Lobo and Lakshman 45). However, the cipher is weak with short keys and exposed to brute-force and differential attacks.
Statistical Tests
NIST Tests
This is a statistical package with fifteen tests developed to check the randomness of binary sequences produced by either software- or hardware-based pseudorandom or cryptographic generators (Albers and Mazur 19). Each test focuses on a different variety of non-randomness that might exist within a sequence (Lobo and Lakshman 31). The tests have evolved throughout the years and cover aspects of randomness that are different but complementary in nature (Chai et al. 203).
Among the notable tests are the frequency (monobit) test, frequency test within a block, runs test, test for the longest run of ones in a block, binary matrix rank test, discrete Fourier transform (spectral) test, non-overlapping template matching test, overlapping template matching test, Maurer’s universal statistical test, linear complexity test, serial test, approximate entropy test, cumulative sums (Cusum) test, random excursions test, and random excursions variant test (Zhang et al. 14).
The order of running these tests may vary depending on the output and intention (Lobo and Lakshman 49). However, it is generally suggested that the frequency test should be run first, since it provides basic evidence on the existence or nonexistence of randomness within a sequence. In the event that this test fails, it is almost certain that all other tests will fail (Albers and Mazur 19).
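The frequency (monobit) test is simple enough to state directly: convert the bits to ±1, sum them, normalize by the square root of the length, and derive a p-value with the complementary error function. A minimal Python sketch, following the standard formulation of the test:

```python
# Minimal implementation of the frequency (monobit) test: convert bits to
# +/-1, sum them, normalize, and derive a p-value with erfc. A p-value
# below the significance level (commonly 0.01) rejects randomness.
import math

def monobit_test(bits: str) -> float:
    n = len(bits)
    s = sum(1 if b == "1" else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

print(monobit_test("1011010101" * 100))  # toy sequence, for illustration
```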
Diehard Tests
First published in 1995, the diehard tests consist of several statistical tests used to measure the quality of a random number generator. These tests were created by George Marsaglia over the years (Wang et al. 1273). There are sixteen tests, including birthday spacings, overlapping permutations, ranks of matrices, monkey tests, count-the-1s, the parking lot test, the minimum distance test, the random (3D) spheres test, the squeeze test, overlapping sums, the runs test, the craps test, the binary rank test, the bitstream test, and the DNA, OQSO, and OPSO tests (Albers and Mazur 19). Most of these tests return a p-value, which should be uniform on [0, 1) if and only if the input has independent, random bits (Niu et al. 9).
These p-values are derived as p = F(X), where F denotes the assumed distribution of the sample variable X. In application, the assumed F is often an asymptotic approximation, which means that there are instances when p-values fall close to 0 or 1 (Zhang et al. 14). These tests are used to benchmark and test random number generators. For instance, running all the tests can be instrumental in creating a user-controlled report.
The results may then be used in formatting the test power and multiplier in default number sequences. In their binary mode, the diehard tests write the output in raw binary rather than formatted ASCII (Lobo and Lakshman 41). Moreover, the output flag in the diehard tests permits the selection of fields for inclusion in the final output. This means that each flag may be entered as an independent binary number, turning on a specific header or output field by flag name. In addition, these tests are significant in resolving ambiguity. For instance, a diehard test with weak or undesirable results would pinpoint a problem in the inputs (Wang et al. 1272).
To avoid this, a series of diehard tests could be used to examine infrequent weak returns, since the p-value is uniformly distributed. Running several tests would therefore confirm whether the undesirable results are reproducible or just an extreme value that could be ignored (Liu et al. 112). In the end, these tests eliminate any preexisting bias introduced by the personal judgment of assuming a small and unlikely default threshold of failure.
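The rerun-and-check practice can be sketched directly: collect p-values from repeated runs of one test and check that they are uniform on [0, 1) with a Kolmogorov-Smirnov test. The sketch below assumes scipy; run_diehard_test is a hypothetical stand-in for whichever single test is being examined, not an actual diehard routine.

```python
# Collect p-values from repeated runs of one test and check uniformity on
# [0, 1) with a Kolmogorov-Smirnov test. Assumes scipy; run_diehard_test is
# a hypothetical stand-in for whichever single test is being examined.
import random
from scipy import stats

def run_diehard_test(rng: random.Random) -> float:
    # Stand-in test: chi-square goodness of fit on counts of 4-bit values.
    counts = [0] * 16
    for _ in range(16000):
        counts[rng.randrange(16)] += 1
    chi2 = sum((c - 1000) ** 2 / 1000 for c in counts)
    return 1 - stats.chi2.cdf(chi2, df=15)   # upper-tail p-value

pvals = [run_diehard_test(random.Random(seed)) for seed in range(100)]
statistic, ks_p = stats.kstest(pvals, "uniform")
print(ks_p)  # a very small ks_p flags reproducibly non-uniform p-values
```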
ENT Tests
Developed over the last three decades, ent is a program created to test the sequences of bytes within a file and report the results of these tests. The ent program is useful for evaluating pseudorandom number generators for encryption, statistical sampling, and compression applications (Lobo and Lakshman 45). The program performs a series of tests on its input and writes the results to a standardized output stream (Zhang et al. 14).
The values calculated are entropy, the Chi-square test, the arithmetic mean, the Monte Carlo value for Pi, and the serial correlation coefficient, using options b, c, f, t, and u (Mehler and Romary 58). The entropy examines the density of information in order to judge how far a file can be compressed. Since the Chi-square test is highly sensitive to errors, its use “indicates how frequently a truly random sequence would exceed the value calculated, which is interpreted as the degree to which the sequence tested is suspected of being non-random” (Wang et al. 1272).
The arithmetic mean indicates whether the values run consistently high or low. Lastly, the serial correlation coefficient “measures the extent to which each byte in the file depends upon the previous byte” (Zhang et al. 1058). However, in the event that an input file is not specified, the ent program reads its input from standard input (Lobo and Lakshman 49).
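Three of the quantities ent reports can be sketched with standard formulas: Shannon entropy in bits per byte, the arithmetic mean, and the serial correlation of each byte with its successor. This is a sketch under those standard definitions; ent’s own options and its Monte Carlo Pi computation are omitted.

```python
# Sketch of three quantities ent reports for a byte sequence: Shannon
# entropy in bits per byte, the arithmetic mean, and the serial correlation
# of each byte with its successor. Standard formulas; ent's own options and
# its Monte Carlo Pi computation are omitted.
import math
import os
from collections import Counter

def ent_style_stats(data: bytes) -> tuple[float, float, float]:
    n = len(data)
    counts = Counter(data)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    mean = sum(data) / n  # approaches 127.5 for random bytes
    # Serial correlation with circular pairing of data[i] and data[i+1].
    pairs = list(zip(data, data[1:] + data[:1]))
    sx = sum(x for x, _ in pairs)
    sxx = sum(x * x for x, _ in pairs)
    sxy = sum(x * y for x, y in pairs)
    corr = (n * sxy - sx * sx) / (n * sxx - sx * sx)
    return entropy, mean, corr

print(ent_style_stats(os.urandom(100000)))  # entropy near 8, corr near 0
```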
TestU01 Tests
TestU01 is a software library written in ANSI C. The library offers a collection of utilities for the empirical testing of random number generators. Among the notable TestU01 batteries are SmallCrush (consisting of ten tests), Crush (consisting of ninety-six tests), and BigCrush (consisting of one hundred and sixty tests). The development of TestU01 builds on more than five decades of work, beginning with the initiatives of Donald Knuth in 1969 (Lobo and Lakshman 45).
These tests were improved on by George Marsaglia in 1996 with his proposed 15 tests. TestU01 consists of four modules: implementations of RNGs, specific statistical tests, batteries of tests, and tests for entire RNG families. However, the use and applicability of TestU01 is limited to 32-bit inputs, which are interpreted as values within the range (0, 1). As a result, it is more sensitive to flaws “in the most-significant bits than the least significant bits” (Ye et al. 421).
Works Cited
Albers, Michael, and Mary Mazur, editors. Content and Complexity: Information Design in Technical Communication. Routledge, 2014.
Barker, Elaine. NIST Special Publication 800-57: Recommendation for Key Management, Part 1: General. National Institute of Standards and Technology, 2017.
Chai, Xiyan, et al. “A Novel Chaos-Based Image Encryption Algorithm Using DNA Sequence Operations.” Optics and Lasers in Engineering, vol. 88, 2017, pp. 197–213.
Chaudhari, Sampni, et al. “A Survey of Methods of Cryptography and Data Encryption.” Imperial Journal of Interdisciplinary Research (IJIR), vol. 3, no. 11, 2017, pp. 440–444.
Fouda, Armand, et al. “A Fast Chaotic Block Cipher for Image Encryption.” Communications in Nonlinear Science and Numerical Simulation, vol. 19, no. 3, 2014, pp. 578–588.
Graves, Heather, and Roger Graves. A Strategic Guide to Technical Communication. 2nd ed., Broadview Press, 2014.
Kumar, Sanjay, and Sandeep Srivastava. “Image Encryption Using Simplified Data Encryption Standard (S-DES).” International Journal of Computer Applications, vol. 104, no. 2, 2014, pp. 38–42.
Liu, Yuang, et al. “Cryptanalyzing a RGB Image Encryption Algorithm Based on DNA Encoding and Chaos Map.” Optics and Laser Technology, vol. 60, 2014, pp. 111–115.
Lobo, Lancy, and Umesh Lakshman. CCIE Security v4.0 Quick Reference. 3rd ed., Cisco Press, 2014.
Mahaveerakannan, Renganathan, and Suresh Gnana. Customized RSA Public Key Cryptosystem Using Digital Signature of Secure Data Transfer Natural Number Algorithm. Center for Programming, 2014.
Mehler, Alexander, and Laurent Romary, editors. Handbook of Technical Communication. Walter de Gruyter, 2014.
Niu, Yuan, et al. “Image Encryption Algorithm Based on Hyperchaotic Maps and Nucleotide Sequences Database.” Computational Intelligence and Neuroscience, vol. 5, no. 3, 2017, pp. 1–9.
Ramos, Jose. “Futures Action Model for Policy Wind Tunneling.” Action Foresight, 2017. Web.
Su, Wang, et al. “Security Evaluation of Bilateral-Diffusion Based Image Encryption Algorithm.” Nonlinear Dynamics, vol. 77, no. 1–2, 2014, pp. 243–246.
Taha, Mahmoud M. Reda, editor. International Congress on Polymers in Concrete (ICPIC 2018): Polymers for Resilient and Sustainable Concrete Infrastructure. Springer, 2018.
Wang, Wei, et al. “A Novel Encryption Algorithm Based on DWT and Multichaos Mapping.” Journal of Sensors, vol. 4, no. 7, 2014, pp. 17–34.
Wang, Yuan, et al. “A Novel Image Encryption Scheme Based on 2-D Logistic Map and Sequence Operations.” Nonlinear Dynamics, vol. 82, no. 3, 2015, pp. 1269–1280.
Ye, Gyuang. “A Block Image Encryption Algorithm Based on Wave Transmission and Chaotic Systems.” Nonlinear Dynamics, vol. 75, no. 3, 2014, pp. 417–427.
Zhang, Xhiuan, et al. “Fluorescence Resonance Energy Transfer-Based Photonic Circuits Using Single-Stranded Tile Self-Assembly and DNA Strand Displacement.” Journal of Nanoscience and Nanotechnology, vol. 17, no. 2, 2017, pp. 1053–1060.
Zhang, Xuncai, et al. “Chaotic Image Encryption Algorithm Based on Bit Permutation and Dynamic DNA Encoding.” Computational Intelligence and Neuroscience, vol. 6, no. 12, 2014, pp. 12–18.