Machine Learning-Based Queueing Time Analysis in XGPON
Ismail, N. A.¹, Idrus, S. M.², Iqbal, F.³, Zin, A. M.⁴, Atan, F.⁵, Ali, N.⁶
Machine learning has become a popular approach for predicting future demand. In optical access networks, machine learning can predict bandwidth demand and thereby reduce delay. This paper presents a machine learning approach to learning the queueing time in XGPON given the traffic load, number of frames, and packet size. Queueing time contributes to upstream delay, so predicting it accurately would improve network performance. The regression value R obtained from the trained ANN is close to 1, and the mean squared error (MSE) is significantly low, showing that machine learning-based queueing time analysis offers another dimension of delay analysis on top of numerical analysis.
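To make the setup concrete, the sketch below trains a small feed-forward ANN on the abstract's three inputs (traffic load, number of frames, packet size) and reports MSE and R. The synthetic data generator, the nominal 2.5 Gb/s XGPON upstream rate, the feature ranges, and the hidden-layer size are illustrative assumptions, not the paper's dataset or hyperparameters; scikit-learn's MLPRegressor stands in for whatever ANN toolchain the authors used.

```python
# A minimal sketch of the queueing-time prediction described in the abstract,
# assuming scikit-learn. The synthetic data, 2.5 Gb/s upstream rate, and
# hidden-layer size are illustrative assumptions, not the paper's setup.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 2000

# The abstract's three inputs: traffic load, number of frames, packet size.
load = rng.uniform(0.1, 0.95, n)               # offered load (fraction of capacity)
frames = rng.integers(1, 50, n).astype(float)  # queued upstream frames
pkt_size = rng.uniform(64.0, 1518.0, n)        # packet size in bytes
X = np.column_stack([load, frames, pkt_size])

# Toy queueing-time target in microseconds: service time of the queued bytes
# at a nominal 2.5 Gb/s XGPON upstream rate, inflated as load approaches 1
# (an M/M/1-style stand-in, plus measurement noise).
service_us = frames * pkt_size * 8 / 2.5e9 * 1e6
y = service_us / (1.0 - load) + rng.normal(0.0, 1.0, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Small feed-forward ANN; inputs are standardised so training converges.
ann = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
ann.fit(X_train, y_train)

pred = ann.predict(X_test)
mse = mean_squared_error(y_test, pred)
r = np.corrcoef(y_test, pred)[0, 1]  # regression R, judged against 1 as in the abstract
print(f"MSE: {mse:.3f} us^2, R: {r:.4f}")
```

On this toy data the same success criteria as the paper apply: an R near 1 and a low MSE indicate that the ANN has learned the mapping from load, frame count, and packet size to queueing time.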
Affiliation:
- ¹⁻⁵ Universiti Teknologi Malaysia (UTM), Malaysia
- ⁶ Universiti Malaysia Perlis, Kampus Pauh Putra, 02600 Arau, Perlis, Malaysia
Indexation

Indexed by MyJurnal (2021):
- H-Index: 2
- Immediacy Index: 0.000
- Rank: 0

Indexed by Scopus (2020):
- CiteScore: 1.3
- Rank: Q3 (Electrical and Electronic Engineering), Q4 (Electronic, Optical and Magnetic Materials)
- SJR: 0.298