Review Article

Deep Learning on Computational-Resource-Limited Platforms: A Survey

Table 2

Details of representative research works.

| Category | Date | Ref. no. | Resource | Representative method | NN architecture or ML topic | Application scenario | Dataset |
|---|---|---|---|---|---|---|---|
| Algorithmic | 2016-4-11 | [53] | Memory capacity, power | Inference phase: SVD-based weight-matrix compression, fine-grained task scheduling to processors | AlexNet [76], 2-hidden-layer DNN for speaker ID, SVHN CNN, 2-hidden-layer DNN for audio scenes | Recognition of objects, human voice, and audio environments | ImageNet [76], Speaker Verification Spoofing and Countermeasures Challenge dataset [77], SVHN dataset [78], Audio Scene dataset [79] |
| Algorithmic | 2016-8-8 | [60] | Power | Training phase: data projection under an energy constraint | 4-layer DNN | Imaging, smart sensing, speech recognition | Hyperspectral Remote Sensing Scenes [80], UCI Daily and Sports Activities [81], UCI ISOLET [82] |
| Algorithmic | 2017-4-17 | [49] | Memory capacity | Training phase: depthwise separable convolution, avoidance of im2col reordering, hyperparameter tuning | 28-layer CNN, PlaNet [87, 88], FaceNet [89, 90] | Large-scale geolocation, fine-grained image recognition, face recognition, object detection | ImageNet, Im2GPS [83], Stanford Dogs [84], YFCC100M [85], COCO [86] |
| Algorithmic | 2017-4-30 | [39] | Memory capacity | Inference phase: weight encoding, weight sharing, factorization of vector-matrix multiplication | 2-hidden-layer DNN | Speech recognition, indoor localization, human activity recognition, handwritten digit recognition | UCI ISOLET, UCI UJIIndoorLoc [87], UCI Daily and Sports Activities, MNIST [88] |
| Algorithmic | 2018-3-19 | [41] | Power | Training phase: hyperparameter tuning, Gaussian-process Bayesian optimization | Variants of AlexNet for MNIST and CIFAR-10 | Handwritten digit recognition, image classification | MNIST, CIFAR-10 [89] |
| Algorithmic | 2019-1-21 | [59] | Memory access latency, power | Training phase: transformation of DNN realization into a Boolean logic optimization problem, Boolean logic minimization | Multilayer perceptron [92], CNN | Handwritten digit recognition | MNIST |
| Algorithmic | 2019-2-28 | [52] | Memory capacity | Training phase: group lasso and intergroup lasso regularization | Fully convolutional network with 7 convolution layers, initialized with pretrained VGG16 | Face recognition | LFW face dataset [93] |
| Algorithmic | 2019-4-12 | [58] | Memory capacity | Training phase: structured sparsity regularization, Alternative Updating with Lagrange Multipliers (AULM) | LeNet [94], AlexNet, VGG-16 [95], ResNet-50 [96], GoogLeNet [97] | Handwritten digit recognition, image classification | MNIST, ImageNet |
| Computational | 2017-6-18 | [40] | Processor | Training and inference phases: enhanced parallelism via computing-load granularity adjustment, network splitting through depth-first traversal, data-dimension reduction using dictionary learning, GPU parallelization | A universal framework for fitting DL networks to specific hardware; AlexNet used as an example | Imaging, smart sensing, speech recognition | Hyperspectral Remote Sensing Scenes, UCI Daily and Sports Activities, UCI ISOLET |
| Computational | 2017-6-19 | [37] | Processor, power | Inference phase: data caching, hardware-specific code fine-tuning, Tucker-2 matrix decomposition | VGG-Verydeep-16 [95], YOLO [98] | Continuous vision applications | ILSVRC2012 train dataset [99], Pascal VOC 2007 train dataset [100], UCF101 dataset [101], LENA dataset [102] |
| Computational | 2017-6-23 | [64] | Processor | Inference phase: fine-granularity code execution, GPU parallelization | LSTM model [103] | Smart sensing | Mobile phone sensor dataset [104] |
| Computational | 2019-2-1 | [68] | Memory capacity | Inference phase: fine-grained memory utilization | VGG, CaffeNet [105], GoogLeNet, AlexNet | Imaging | ILSVRC2012 |
| Hardware | 2018-10-4 | [70] | Computing power, power | Training and inference phases: unified core architecture, binary feedback alignment (BFA), dynamic fixed-point-based run-length compression (RLC), dropout controller | MDNet [106] | Real-time object tracking | Object tracking benchmark (OTB) dataset [107] |
| Hardware | 2018-9-7 | [72] | Computing power | Voltage-charge relationship of electrochemical cells exploited to forget parameters; decision-making described via the motion of ions | Multiarmed bandit problems (MBPs) | Reinforcement learning | — |
| Hardware | 2018-12-7 | [75] | Computing power | Quantum computing: correlations in data modeled with the probability amplitudes of a many-body entangled state | Generative model | Generative modeling | — |
| Hardware | 2019-2-15 | [73] | Computing power | Electrochemical cells | Matrix-vector multiplication based on nonvolatile photonic memory | Basic arithmetic operations for machine learning/AI algorithms | — |
| Hardware | 2019-5-9 | [74] | Processing power, power | Separation of data-memory and processing functions; neurosynaptic system mimicked in an all-optical manner | Neural network of four neurons and sixty synapses (140 optical elements in total) | Letter recognition | — |
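Several of the inference-phase methods above shrink fully connected layers by low-rank factorization; for instance, [53] compresses weight matrices with SVD before deployment. The sketch below illustrates the general idea with NumPy; the layer size (256×512) and retained rank are illustrative assumptions, not values taken from [53].

```python
import numpy as np

# Low-rank weight compression via truncated SVD: replace a dense weight
# matrix W with two thin factors A and B such that W ≈ A @ B, cutting
# both storage and multiply-accumulate cost at inference time.
rng = np.random.default_rng(0)
W = rng.standard_normal((256, 512))   # dense fully connected weight matrix

U, s, Vt = np.linalg.svd(W, full_matrices=False)

r = 64                                # retained rank (accuracy/size trade-off)
A = U[:, :r] * s[:r]                  # 256 x r factor (singular values folded in)
B = Vt[:r, :]                         # r x 512 factor
W_approx = A @ B                      # low-rank reconstruction of W

orig_params = W.size                  # 256 * 512 = 131072
compressed_params = A.size + B.size   # 64 * (256 + 512) = 49152
rel_err = np.linalg.norm(W - W_approx) / np.linalg.norm(W)
```

At inference, the layer computes `x @ A @ B` instead of `x @ W`, so the rank `r` directly controls the compression ratio; in practice the factored network is usually fine-tuned briefly to recover accuracy.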
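The depthwise separable convolution used in [49] replaces one dense convolution with a per-channel (depthwise) filter followed by a 1×1 (pointwise) channel-mixing step. A minimal parameter-count comparison, with illustrative shapes (3×3 kernels, 128→256 channels) that are assumptions rather than values from [49]:

```python
# Parameter counts for a standard vs. depthwise separable convolution.
k, c_in, c_out = 3, 128, 256

standard = k * k * c_in * c_out   # one dense k x k convolution
depthwise = k * k * c_in          # one k x k filter per input channel
pointwise = c_in * c_out          # 1 x 1 convolution mixing channels
separable = depthwise + pointwise

# The savings factor is 1/c_out + 1/k^2, roughly 8-9x for 3x3 kernels.
ratio = separable / standard
```

The same ratio applies to multiply-accumulate operations, which is why the technique suits memory- and compute-constrained platforms.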