Coding theory

Main articles: Information theory and Signal processing
Further information: Coding theory

Error detection and correction
Further information: Error detection and correction

- BCH codes
  - Berlekamp–Massey algorithm
  - Peterson–Gorenstein–Zierler algorithm
  - Reed–Solomon error correction
- BCJR algorithm: decoding of error-correcting codes defined on trellises (principally convolutional codes)
- Forward error correction
- Gray code
- Hamming codes
  - Hamming(7,4): a Hamming code that encodes 4 bits of data into 7 bits by adding 3 parity bits
  - Hamming distance: number of positions at which two strings of equal length differ
  - Hamming weight (population count): find the number of 1 bits in a binary word
- Redundancy checks
  - Adler-32
  - Cyclic redundancy check
  - Damm algorithm
  - Fletcher's checksum
  - Longitudinal redundancy check (LRC)
  - Luhn algorithm: a method of validating identification numbers
  - Luhn mod N algorithm: extension of Luhn to non-numeric characters
  - Parity: simple/fast error detection technique
  - Verhoeff algorithm

Lossless compression algorithms
Main page: Lossless compression algorithms

- Burrows–Wheeler transform: preprocessing useful for improving lossless compression
- Context tree weighting
- Delta encoding: aid to compression of data in which sequential data occurs frequently
- Dynamic Markov compression: compression using predictive arithmetic coding
- Dictionary coders
  - Byte pair encoding (BPE)
  - DEFLATE
  - Lempel–Ziv LZ77 and LZ78
  - Lempel–Ziv Jeff Bonwick (LZJB)
  - Lempel–Ziv–Markov chain algorithm (LZMA)
  - Lempel–Ziv–Oberhumer (LZO): speed-oriented
  - Lempel–Ziv–Storer–Szymanski (LZSS)
  - Lempel–Ziv–Welch (LZW)
  - LZWL: syllable-based variant
  - LZX
  - Lempel–Ziv Ross Williams (LZRW)
- Entropy encoding: coding scheme that assigns codes to symbols so as to match code lengths with the probabilities of the symbols
  - Arithmetic coding: advanced entropy coding
    - Range encoding: same as arithmetic coding, but looked at in a slightly different way
  - Huffman coding: simple lossless compression taking advantage of relative character frequencies
    - Adaptive Huffman coding: adaptive coding technique based on Huffman coding
    - Package-merge algorithm: optimizes Huffman coding subject to a length restriction on code strings
  - Shannon–Fano coding
  - Shannon–Fano–Elias coding: precursor to arithmetic encoding[1]
- Entropy coding with known entropy characteristics
  - Golomb coding: form of entropy coding that is optimal for alphabets following geometric distributions
  - Rice coding: form of entropy coding that is optimal for alphabets following geometric distributions
  - Truncated binary encoding
  - Unary coding: code that represents a number n with n ones followed by a zero
  - Universal codes: encode positive integers into binary code words
    - Elias delta, gamma, and omega coding
    - Exponential-Golomb coding
    - Fibonacci coding
    - Levenshtein coding
- Fast Efficient & Lossless Image Compression System (FELICS): a lossless image compression algorithm
- Incremental encoding: delta encoding applied to sequences of strings
- Prediction by partial matching (PPM): an adaptive statistical data compression technique based on context modeling and prediction
- Run-length encoding: lossless data compression taking advantage of strings of repeated characters
- SEQUITUR algorithm: lossless compression by incremental grammar inference on a string

Lossy compression algorithms
Main page: Lossy compression algorithms

- 3Dc: a lossy data compression algorithm for normal maps
- Audio and speech compression
  - A-law algorithm: standard companding algorithm
  - Code-excited linear prediction (CELP): low-bit-rate speech compression
  - Linear predictive coding (LPC): lossy compression by representing the spectral envelope of a digital speech signal in compressed form
  - Mu-law algorithm: standard analog signal compression or companding algorithm
  - Warped Linear Predictive Coding (WLPC)
- Image compression
  - Block Truncation Coding (BTC): a type of lossy image compression technique for greyscale images
  - Embedded Zerotree Wavelet (EZW)
  - Fast Cosine Transform algorithms (FCT algorithms): compute the Discrete Cosine Transform (DCT) efficiently
  - Fractal compression: method used to compress images using fractals
  - Set Partitioning in Hierarchical Trees (SPIHT)
  - Wavelet compression: form of data compression well suited for image compression (sometimes also video and audio compression)
- Transform coding: type of data compression for "natural" data like audio signals or photographic images
- Vector quantization: technique often used in lossy data compression

Digital signal processing
Further information: Digital signal processing

- Adaptive-additive algorithm (AA algorithm): find the spatial frequency phase of an observed wave source
- Discrete Fourier transform: determines the frequencies contained in a (segment of a) signal
  - Bluestein's FFT algorithm
  - Bruun's FFT algorithm
  - Cooley–Tukey FFT algorithm
  - Fast Fourier transform
  - Prime-factor FFT algorithm
  - Rader's FFT algorithm
- Fast folding algorithm: an efficient algorithm for the detection of approximately periodic events within time series data
- Gerchberg–Saxton algorithm: phase retrieval algorithm for optical planes
- Goertzel algorithm: identifies a particular frequency component in a signal; can be used for DTMF digit decoding
- Karplus–Strong string synthesis: physical modelling synthesis to simulate the sound of a hammered or plucked string or some types of percussion

Image processing
Further information: Image processing

- Contrast enhancement
  - Histogram equalization: use the image histogram to improve contrast
  - Adaptive histogram equalization: histogram equalization which adapts to local changes in contrast
- Connected-component labeling: find and label disjoint regions
- Dithering and half-toning
  - Error diffusion
  - Floyd–Steinberg dithering
  - Ordered dithering
  - Riemersma dithering
- Elser difference-map algorithm: a search algorithm for general constraint satisfaction problems; originally used for X-ray diffraction microscopy
- Feature detection
  - Canny edge detector: detect a wide range of edges in images
  - Generalised Hough transform
  - Hough transform
  - Marr–Hildreth algorithm: an early edge detection algorithm
  - SIFT (Scale-invariant feature transform): an algorithm to detect and describe local features in images
  - SURF (Speeded Up Robust Features): a robust local feature detector, first presented by Herbert Bay et al. in 2006, that can be used in computer vision tasks like object recognition or 3D reconstruction. It is partly inspired by the SIFT descriptor. The standard version of SURF is several times faster than SIFT and claimed by its authors to be more robust against different image transformations than SIFT.[2][3][4]
- Richardson–Lucy deconvolution: image de-blurring algorithm
- Seam carving: content-aware image resizing algorithm
- Segmentation: partition a digital image into two or more regions
  - GrowCut algorithm: an interactive segmentation algorithm
  - Random walker algorithm
  - Region growing
  - Watershed transformation: a class of algorithms based on the watershed analogy
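To make a few of the listed entries concrete, some minimal worked sketches follow; the function names in them are illustrative, not part of any standard library. First, the reflected binary Gray code: the bit trick `n ^ (n >> 1)` produces a sequence in which successive values differ in exactly one bit, and the inverse is a running XOR over the shifted bits.

```python
def to_gray(n: int) -> int:
    """Reflected binary Gray code: consecutive codes differ in one bit."""
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    """Invert the Gray code by XOR-ing together all right shifts of g."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# First few codes: 0, 1, 3, 2 (binary 00, 01, 11, 10)
```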
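The Hamming(7,4) code listed above can be sketched directly: 3 parity bits at positions 1, 2, and 4 cover overlapping subsets of the 4 data bits, and the syndrome computed at the receiver gives the 1-based position of any single flipped bit. This is an illustrative sketch using the conventional bit layout, not a production codec.

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Recompute the parities; the syndrome locates a single-bit error."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based error position, 0 if clean
    if syndrome:
        c[syndrome - 1] ^= 1
    return c
```

Flipping any one of the 7 bits and running `hamming74_correct` recovers the original codeword, which is exactly the single-error-correcting property the (7,4) parameters promise.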
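The Luhn algorithm from the redundancy-check list is short enough to show in full: double every second digit from the right, subtract 9 from any doubled digit above 9 (equivalent to summing its digits), and require the total to be divisible by 10.

```python
def luhn_valid(number: str) -> bool:
    """Luhn check for an identification number given as a digit string."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:        # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9        # same as adding the two digits of d
        total += d
    return total % 10 == 0
```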
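The Burrows–Wheeler transform listed under lossless compression is a reversible permutation, not a compressor itself: sorting all rotations of the input groups similar contexts together, which makes the output far more compressible by a later stage such as run-length or entropy coding. A naive O(n² log n) sketch using a sentinel character:

```python
SENTINEL = "\x00"  # assumed smaller than every character in the input

def bwt(s: str) -> str:
    """Burrows–Wheeler transform: last column of the sorted rotations."""
    s += SENTINEL
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def inverse_bwt(t: str) -> str:
    """Invert by repeatedly prepending t to the sorted table of columns."""
    table = [""] * len(t)
    for _ in range(len(t)):
        table = sorted(t[i] + table[i] for i in range(len(t)))
    row = next(r for r in table if r.endswith(SENTINEL))
    return row[:-1]
```

Real implementations build a suffix array instead of materializing rotations, but the round trip shown here is the same transform.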
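Huffman coding, the entropy coder named above, can be sketched with a min-heap: repeatedly merge the two lowest-weight subtrees, prefixing '0' to codes on one side and '1' on the other, so frequent symbols end up with short codewords. This compact version tracks each subtree as a symbol-to-code dictionary rather than explicit tree nodes.

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict:
    """Build a prefix-free Huffman code from symbol frequencies in text."""
    freq = Counter(text)
    # Heap entries: (weight, unique tiebreak, {symbol: code-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate one-symbol alphabet
        ((_, _, codes),) = heap
        return {sym: "0" for sym in codes}
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]
```

For "aaaabbc" the most frequent symbol 'a' gets a 1-bit code and 'b' and 'c' get 2-bit codes, so the whole string encodes in 10 bits.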
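Two of the fixed codes above are one-liners worth seeing side by side: unary coding writes n as n ones followed by a zero, and Elias gamma (one of the universal codes) prefixes the binary form of a positive integer with enough zeros to announce its length.

```python
def unary(n: int) -> str:
    """Unary code: n ones followed by a terminating zero."""
    return "1" * n + "0"

def elias_gamma(n: int) -> str:
    """Elias gamma code for n >= 1: (len-1) zeros, then n in binary."""
    b = bin(n)[2:]                  # binary representation, MSB first
    return "0" * (len(b) - 1) + b
```

A decoder reads zeros until the first one, which tells it how many more bits to consume, so codewords are self-delimiting without a length field.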
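Run-length encoding, the simplest lossless scheme in the list, replaces each maximal run of a repeated character with a (character, count) pair; `itertools.groupby` does the run detection directly.

```python
from itertools import groupby

def rle_encode(s: str):
    """Collapse each run of repeated characters into (char, run length)."""
    return [(ch, len(list(run))) for ch, run in groupby(s)]

def rle_decode(pairs) -> str:
    """Expand (char, run length) pairs back into the original string."""
    return "".join(ch * n for ch, n in pairs)
```

Note that RLE only pays off on inputs with long runs; on data without repeats the pair list is larger than the input.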
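The Goertzel algorithm from the DSP section evaluates the power of a single DFT bin with one second-order recurrence per sample, which is why it suits DTMF detection where only a handful of frequencies matter. A sketch of the standard power form:

```python
import math

def goertzel_power(samples, k, n):
    """Power |X[k]|^2 of DFT bin k over n samples, without a full FFT."""
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples[:n]:
        s = x + coeff * s_prev - s_prev2   # second-order resonator update
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2
```

For a unit-amplitude cosine at exactly bin k0, the power at k0 comes out as (n/2)^2 while other integer bins are near zero, mirroring the DFT of a pure tone.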
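Finally, histogram equalization from the image-processing section: pixel values are remapped through the normalized cumulative histogram, spreading a narrow intensity range across the full scale. This sketch uses the common CDF-minimum normalization on a flat list of grayscale values; real pipelines operate on 2-D arrays and often work per tile (as in adaptive histogram equalization).

```python
def equalize(pixels, levels=256):
    """Histogram-equalize a flat list of grayscale values in [0, levels)."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:                      # cumulative distribution function
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # Look-up table: map each level through the normalized CDF.
    lut = [round((cdf[v] - cdf_min) / max(n - cdf_min, 1) * (levels - 1))
           for v in range(levels)]
    return [lut[p] for p in pixels]
```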