We design an auto-encoder that can be split into two parts, each of which works well on its own. The top half is an abstract network, trained by supervised learning, that can be used for classification and regression. The bottom half is a concrete network, realized as the inverse function of the abstract network and trained by self-supervised learning; it generates the input of the abstract network from a concept or label. The design is tested with TensorFlow on the MNIST dataset: the abstract network is similar to LeNet-5, and the concrete network is its inverse. Experiments show that encryption and decryption can be achieved by the abstract and concrete networks augmented with skip connections and negative feedback through an absolute-value function. With binary encoding, the encrypted vector is only four bits, yet the decrypted result has the same quality as with one-hot encoding, thanks to the skip connections and negative feedback. The parameters of the DNN serve as the secondary key and its architecture as the primary key: the secondary key can be shared with everyone, while the primary key is shared only between sender and receiver. The keys are generated by training the DNN. When a large dataset is used for encryption, the number of classes is far larger, and a label may be anything in the world: numbers, words, or attributes represented by floating-point numbers. The label can then use a mix of one-hot and binary encoding, which is harder to attack. Our analysis shows that the scheme is safe in most situations.
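As a minimal sketch of the label-encoding contrast described above (the function names and exact bit layout are illustrative assumptions, not the paper's implementation): the ten MNIST classes can be carried either as a 10-dimensional one-hot vector or as a 4-bit binary vector, which is the compact form the concrete network must decode back into an image.

```python
# Illustrative sketch, not the paper's code: encoding a 10-class MNIST
# label as one-hot (10 dimensions) versus binary (4 bits). The concrete
# network would take such a vector as input and generate an image.

def one_hot(label, num_classes=10):
    """Return the label as a one-hot list of length num_classes."""
    vec = [0] * num_classes
    vec[label] = 1
    return vec

def binary_bits(label, num_bits=4):
    """Return the label as a fixed-width list of bits (MSB first)."""
    return [(label >> i) & 1 for i in reversed(range(num_bits))]

print(one_hot(3))      # -> [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
print(binary_bits(3))  # -> [0, 0, 1, 1]
```

The four-bit form shows why decoding it is harder: nearby labels (e.g. 3 and 7) differ in only one bit, whereas one-hot codes are all equally far apart, which is the gap the skip connections and negative feedback are reported to close.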