Please use this identifier to cite or link to this item: http://dspace.unimap.edu.my:80/xmlui/handle/123456789/75091
Title: Machine learning-based queueing time analysis in XGPON
Authors: N. A., Ismail
S. M., Idrus
F., Iqbal
A.M., Zin
F., Atan
N., Ali
Issue Date: Dec-2021
Publisher: Universiti Malaysia Perlis (UniMAP)
Citation: International Journal of Nanoelectronics and Materials, vol.14 (Special Issue), 2021, pages 157-163
Abstract: Machine learning has become a popular approach for predicting future demand. In optical access networks, machine learning can predict bandwidth demand and thereby reduce delay. This paper presents a machine learning approach to learning the queueing time in XGPON given the traffic load, number of frames, and packet size. Queueing time contributes to the upstream delay, so predicting it accurately helps improve network performance. The regression value R obtained from the trained ANN is close to 1, and the mean squared error (MSE) is significantly low, showing that machine learning-based queueing time analysis offers another dimension of delay analysis on top of numerical analysis.
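The abstract describes training an ANN on traffic load, number of frames, and packet size to predict queueing time, evaluated by the regression value R and the MSE. The sketch below illustrates that workflow only in outline; it assumes scikit-learn's MLPRegressor, a small hidden layer, and synthetic placeholder data, none of which come from the paper itself.

# Minimal sketch of ANN-based queueing time regression (assumptions: scikit-learn,
# synthetic data, toy target formula; not the authors' dataset or architecture).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 2000
load = rng.uniform(0.1, 0.9, n)                  # offered traffic load (fraction of capacity)
frames = rng.integers(1, 50, n).astype(float)    # number of frames queued
pkt_size = rng.uniform(64, 1518, n)              # packet size in bytes
X = np.column_stack([load, frames, pkt_size])

# Toy target in microseconds: queueing time grows with load and queued data
# (2.5 Gb/s is used here only as an illustrative XGPON upstream line rate).
y = frames * pkt_size * 8 / 2.5e9 / (1.0 - load) * 1e6

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Small feedforward ANN; features are standardized before training.
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
ann.fit(X_train, y_train)

pred = ann.predict(X_test)
mse = mean_squared_error(y_test, pred)
r = np.corrcoef(y_test, pred)[0, 1]              # regression value R between targets and predictions
print(f"MSE = {mse:.3e}, R = {r:.4f}")

In the paper the queueing time targets would come from XGPON upstream measurements or numerical delay analysis rather than a closed-form toy expression like the one above.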
Description: Link to publisher's homepage at http://ijneam.unimap.edu.my
URI: http://dspace.unimap.edu.my:80/xmlui/handle/123456789/75091
ISSN: 1985-5761 (Printed)
1997-4434 (Online)
Appears in Collections: International Journal of Nanoelectronics and Materials (IJNeaM)

Files in This Item:
File: Machine Learning-Based Queueing Time Analysis in XGPON.pdf
Description: Main article
Size: 1.17 MB
Format: Adobe PDF


Items in UniMAP Library Digital Repository are protected by copyright, with all rights reserved, unless otherwise indicated.