Please use this identifier to cite or link to this item: http://dspace.unimap.edu.my:80/xmlui/handle/123456789/69271
Full metadata record
DC Field: Value (Language)

dc.contributor.author: Taha Basheer, Taha
dc.contributor.author: Phaklen, Ehkan
dc.contributor.author: Ruzelita, Ngadiran
dc.date.accessioned: 2021-01-07T09:46:12Z
dc.date.available: 2021-01-07T09:46:12Z
dc.date.issued: 2017
dc.identifier.citation: MATEC Web of Conferences, vol. 140, 2017, 5 pages (en_US)
dc.identifier.issn: 2261-236X (online)
dc.identifier.uri: http://dspace.unimap.edu.my:80/xmlui/handle/123456789/69271
dc.description: Link to publisher's homepage at https://www.matec-conferences.org/ (en_US)
dc.description.abstract: Perceptual mapping approaches have been widely used in visual information processing in multimedia and Internet of Things (IoT) applications. Accumulative Lifting Difference (ALD) is proposed in this paper as a texture mapping model based on the low-complexity lifting wavelet transform, combined with luminance masking to create an efficient perceptual mapping model for estimating Just Noticeable Distortion (JND) in digital images. In addition to its low-complexity operations, experimental results show that the proposed model can tolerate much more JND noise than previously proposed models. (en_US)
(An illustrative code sketch of the lifting-difference idea appears after this metadata record.)
dc.language.iso: en (en_US)
dc.publisher: EDP Sciences (en_US)
dc.relation.ispartofseries: 2017 International Conference on Emerging Electronic Solutions for IoT (ICEESI 2017)
dc.subject: JND (en_US)
dc.subject: Accumulative Lifting Difference (ALD) (en_US)
dc.subject: Lifting Wavelet Transform (LWT) (en_US)
dc.subject: Perceptual Mapping (en_US)
dc.title: A new perceptual mapping model using Lifting Wavelet Transform (en_US)
dc.type: Article (en_US)
dc.identifier.doi: https://doi.org/10.1051/matecconf/201714001036
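
The abstract above outlines the ALD idea: prediction differences from a lifting wavelet step act as a texture measure and are combined with luminance masking to estimate a per-pixel JND threshold. The paper's exact formulation is not reproduced in this record, so the following Python sketch is only illustrative: the Haar-style predict step, the row/column accumulation, the luminance-masking curve, and the scaling constant k are all assumptions made here, not the authors' model.

    # Hedged sketch, NOT the published ALD model: shows how lifting-step
    # prediction differences can serve as a texture measure and be combined
    # with a simple luminance-masking term into a JND-style map.
    # All weights and curves below are illustrative assumptions.
    import numpy as np

    def lifting_differences(signal):
        """One Haar-style lifting predict step on a 1-D signal.

        Splits the signal into even/odd samples and returns the prediction
        differences (odd minus even neighbour), which are large in textured
        regions and small in smooth ones.
        """
        even = signal[0::2].astype(np.float64)
        odd = signal[1::2].astype(np.float64)
        n = min(len(even), len(odd))
        return odd[:n] - even[:n]

    def ald_texture_map(img):
        """Accumulate absolute lifting differences along rows and columns."""
        h, w = img.shape
        tex = np.zeros((h, w), dtype=np.float64)
        for y in range(h):  # horizontal differences, spread back to pixels
            d = np.abs(lifting_differences(img[y]))
            tex[y, : 2 * len(d) : 2] += d
            tex[y, 1 : 2 * len(d) + 1 : 2] += d
        for x in range(w):  # vertical differences
            d = np.abs(lifting_differences(img[:, x]))
            tex[: 2 * len(d) : 2, x] += d
            tex[1 : 2 * len(d) + 1 : 2, x] += d
        return tex

    def luminance_masking(img):
        """Illustrative background-luminance masking term: the eye tolerates
        more distortion in very dark and very bright regions, so the factor
        grows from 1.0 at mid-grey to 2.0 at the extremes."""
        bg = img.astype(np.float64)
        return 1.0 + np.abs(bg - 128.0) / 128.0

    def jnd_map(img, k=0.5):
        """Combine texture and luminance masking into a per-pixel JND estimate;
        k is an assumed scaling constant, not taken from the paper."""
        return k * ald_texture_map(img) * luminance_masking(img)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        image = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
        print(jnd_map(image).mean())  # mean tolerable-distortion estimate

In a full lifting wavelet transform the predict step is followed by an update step and the outputs feed further decomposition levels; this sketch accumulates a single predict pass in both directions only to show why lifting differences are cheap to compute (additions and subtractions on sample pairs) and why they respond to texture.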
Appears in Collections:
Phaklen Ehkan, Assoc. Prof. Dr.
School of Computer and Communication Engineering (Articles)

Files in This Item:
File: A New Perceptual Mapping Model Using Lifting Wavelet Transform.pdf
Description: Main article
Size: 644.29 kB
Format: Adobe PDF


Items in UniMAP Library Digital Repository are protected by copyright, with all rights reserved, unless otherwise indicated.