|
Submit Paper / Call for Papers
The journal receives papers in continuous flow, and we consider articles from a
wide range of Information Technology disciplines, from basic research to the
most innovative technologies. Please submit your papers electronically through
our submission system at http://jatit.org/submit_paper.php in MS Word, PDF, or a
compatible format so that they may be evaluated for publication in the upcoming
issue. This journal uses a blinded review process; please include all your
personally identifiable information in the manuscript before submitting it for
review, and we will redact the necessary information on our side. Submissions to
JATIT should be full research / review papers (properly indicated below the main
title).
|
Journal of Theoretical and Applied Information Technology
November 2019 | Vol. 97 No. 21 |
Title: |
ARABIC TEXT CLUSTERING BASED ON K-MEANS ALGORITHM WITH SEMANTIC WORD EMBEDDING |
Author: |
HASNAA R. H. SOLIMAN, MOHAMED GRIDA, MOHAMED HASSAN |
Abstract: |
With the massive growth of Arabic content on the web, clustering Arabic textual
data into a small number of meaningful groups becomes an essential component in
various information retrieval applications, such as recommender systems,
sentiment analysis, question answering systems, and search engines. Clustering
methods that are traditionally based on the bag-of-words (BOW) model for text
representation do not consider the order relationships between terms and may
result in unsatisfactory clusters, especially with complex languages such as
Arabic. This study introduces a model for enhancing the accuracy of Arabic
document clusters by integrating the K-means clustering algorithm with embedding
approaches, including Word to Vector (Word2Vec), as a representational basis
instead of BOW to capture the semantic information between individual terms. The
model's performance was investigated on a news clustering dataset utilized in
previous similar studies. Accordingly, it was concluded that combining embedding
techniques with the K-means algorithm improves various clustering evaluation
measures such as purity, F-measure, and accuracy. |
Keywords: |
Arabic Text Clustering, Document Embeddings, Word Embeddings, Doc2vec, Word2Vec. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
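As a hedged sketch of the pipeline this abstract describes: each document is represented as the average of its word vectors, and the document vectors are then clustered with K-means. The two-dimensional toy embeddings and documents below are invented for illustration; a real system would use Word2Vec vectors trained on an Arabic corpus.

```python
import math
import random

# Hypothetical toy word embeddings; a real system would load Word2Vec
# vectors trained on an Arabic corpus instead.
EMBEDDINGS = {
    "economy": [0.9, 0.1], "market": [0.8, 0.2], "trade": [0.85, 0.15],
    "football": [0.1, 0.9], "match": [0.2, 0.8], "goal": [0.15, 0.85],
}

def doc_vector(tokens):
    """Average the word vectors of a document's tokens (Word2Vec-style)."""
    dim = len(next(iter(EMBEDDINGS.values())))
    vecs = [EMBEDDINGS[t] for t in tokens if t in EMBEDDINGS]
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def kmeans(points, k, iters=20, seed=0):
    """Plain K-means with Euclidean distance; returns a cluster label per point."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[j].append(p)
        # Recompute each centroid as its cluster mean (keep old one if empty).
        centroids = [
            [sum(p[i] for p in cl) / len(cl) for i in range(len(cl[0]))]
            if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return [min(range(k), key=lambda c: math.dist(p, centroids[c])) for p in points]

docs = [["economy", "market"], ["trade", "market"],
        ["football", "goal"], ["match", "goal"]]
labels = kmeans([doc_vector(d) for d in docs], k=2)
```

With these well-separated toy vectors, the two economics documents land in one cluster and the two sports documents in the other.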
|
Title: |
THE IMPACT OF TECHNOLOGY, ORGANIZATIONAL, AND TRUST FACTORS ON SOCIAL COMMERCE
ADOPTION |
Author: |
WALID ABDULLA ALI, MURIATI MUKHTAR, IBRAHIM MOHAMED |
Abstract: |
The utilization of Web 2.0 applications to support people's online interactions,
where users' contributions enable firms to lower costs, extend market reach, and
improve efficiency, is termed social commerce. Businesses are able to set up an
online community and provide a safe haven for consumers to share experiences,
knowledge, and data about their services and products. Yet, small and
medium-sized enterprises in developing countries are unable to quickly adopt
social commerce. Thus, it is imperative to analyze the scenario and identify the
factors that influence the implementation of social commerce. As such, via a
thorough literature review, this study's objective is to define the
organizational, technological, and trust factors that influence the adoption of
social commerce (SC), and the way in which the implementation of social commerce
influences an organization's performance. A conceptual framework is also
suggested, drawing on the strengths of three models: the Unified Theory of
Acceptance and Use of Technology (UTAUT), the Information Systems Success Model,
and Technology, Organization and Environment (TOE). The suggested framework may
assist in deciding policies and actions that encourage the adoption of social
commerce. |
Keywords: |
Electronic Commerce; Social Commerce (SC); Small and Medium-Sized Enterprises
(SMEs); Conceptual Framework. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
THE CIPP-SAW EVALUATION MODEL DESIGN IN MEASUREMENT THE EFFECTIVENESS OF
E-LEARNING AT HEALTH UNIVERSITIES IN BALI |
Author: |
I PUTU WISNA ARIAWAN, MADE KURNIA WIDIASTUTI GIRI, DEWA GEDE HENDRA DIVAYANA |
Abstract: |
This research aims to provide information about the design of an evaluation
model suitable for evaluating the effectiveness of e-learning implementation at
health universities in Bali. The intended evaluation model is called CIPP-SAW, a
combination of the CIPP (Context-Input-Process-Product) evaluation model with
the SAW (Simple Additive Weighting) method. The CIPP-SAW model is capable of
presenting accurate calculation results in determining evaluation aspects, which
are classified based on preference values ranging from the lowest to the highest
value. This research was carried out with a development approach using the Borg
and Gall method, focused on three stages: 1) design development; 2) initial
trial; and 3) initial trial revision. The subjects involved in the design
development and initial trial revision stages were three people (all
researchers). The subjects involved in the initial trial stage were 34
respondents (4 knowledge experts and 30 lecturers). The instrument used for data
collection was a questionnaire consisting of 10 question items. The analysis
technique used was quantitative descriptive analysis, and the results showed
that the evaluation model design was classified as good, with an effectiveness
percentage of 88.00%. |
Keywords: |
CIPP, SAW, Evaluation Model, E-Learning, Health Universities |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
A REVIEW OF ACADEMIC PERFORMANCE FACTORS IN THE CONTEXT OF E-LEARNING: THEORIES
AND EMPIRICAL STUDIES |
Author: |
MBANGATA L., EYONO OBONO S.D. |
Abstract: |
There are currently more than 500 commercial e-Learning software packages and
300 educational e-Learning software packages, and their benefits to their users
are well documented. For instance, with e-Learning, the rigidity of teaching and
learning timetables can be overcome. Shockingly, higher education still
witnesses intolerable levels of academic failure even in this e-Learning era.
This study therefore attempts to examine the impact of e-Learning on academic
performance based on the perceptions of academics. It is a literature review of
thirty-four (34) studies. Its findings reveal that, according to the perceptions
of academics, the impact of e-Learning on academic performance depends on: the
demographics of the learners; their intensity of use of e-Learning; their
self-efficacy and learning approach; their sense of community and interactivity;
their perceptions of the suitability of e-Learning; and their motivation and
pride. This study also recommends more research on the validation of its
proposed theoretical framework and its identified factors, because the reviewed
literature is not unanimous on their perceived effect on the impact of
e-Learning on academic performance. The main contribution of this study is to
broaden the scope of academic performance factors in the context of e-Learning
compared to similar past reviews, which had a limited pre-defined scope of
academic performance factors. |
Keywords: |
e-Learning, Academic Performance, Learning Theories, Theoretical Model,
Literature Review |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
AN EFFICIENT DEEP LEARNING FRAMEWORK FOR PEDESTRIAN DETECTION |
Author: |
HOANH NGUYEN |
Abstract: |
Vision-based pedestrian detection has achieved great success with the fast
development of deep convolutional neural networks (CNNs). However, due to
difficult conditions such as large-scale variation, heavy occlusion, and small
pedestrian scale, recent deep CNN-based approaches for pedestrian detection
still do not achieve very good accuracy on public benchmark datasets. In this
paper, an efficient framework based on a deep CNN is proposed for pedestrian
detection, especially for small-scale pedestrians. The proposed method first
uses a reduced ResNet-34 architecture to generate convolutional feature maps.
Then, deconvolutional modules are used after the base convolution layers to
bring in additional context information, which is more effective for detecting
small-scale pedestrians. In the region-of-interest pooling process, feature maps
at different scales are used to produce high-quality region proposals.
Furthermore, a modified loss function based on cross-entropy loss is used to
increase the loss contribution from hard-to-detect small-sized pedestrians.
Experimental results demonstrate the effectiveness of the proposed approach,
with good detection performance on the Caltech pedestrian dataset. |
Keywords: |
Pedestrian Detection, Convolutional Neural Network, Intelligent Transportation
Systems, Object Detection, Deep Learning |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
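The loss modification this abstract mentions, increasing the contribution of hard small-scale examples, resembles a per-example weighted cross-entropy; the following minimal numeric sketch uses invented probabilities and weights, not the paper's actual loss.

```python
import math

def weighted_cross_entropy(p_true, weight):
    """Cross-entropy of the true class's predicted probability,
    scaled by a per-example weight."""
    return -weight * math.log(p_true)

# Invented predictions for two pedestrians: an easy large-scale one that the
# detector scores confidently, and a hard small-scale one that it does not.
# Up-weighting the hard example increases its share of the total loss, which
# is the effect the abstract describes.
easy = weighted_cross_entropy(0.9, weight=1.0)
hard = weighted_cross_entropy(0.6, weight=2.0)
total_loss = easy + hard
```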
|
Title: |
QUALITY OF SERVICE-BASED RESOURCE ALLOCATION FOR WEB CONTENT DELIVERY ON CLOUD
COMPUTING INFRASTRUCTURE |
Author: |
OMOTUNDE, AYOKUNLE A.; IDOWU, SUNDAY; KUYORO, 'SHADE; AYANKOYA, FOLASADE Y.;
JOSHUA, JONAH V.; ADEGBENJO, ADERONKE; ABEL, SAMUEL B. |
Abstract: |
Demand for web content continues to increase at exponential rates, and this has
intensified the challenge of satisfying customers' Quality of Service (QoS)
requirements. Several techniques for web content delivery vis-à-vis resource
allocation have been proposed, one of which is the use of Content Distribution
Networks. However, in recent times, cloud computing has become a driving force
in Information Technology, where service providers' limited resources are shared
among numerous users with different QoS requirements. This work focuses on
developing a model for the allocation of resources on cloud computing
infrastructure in order to improve the delivery of web content and optimize
service cost. An analytical approach was adopted and expressed as an
optimization problem subject to the QoS metrics delay, throughput, and
bandwidth. The optimization problem was formulated as an Integer Linear
Programming problem in which each decision variable takes the value 0 or 1. A
single Infrastructure-as-a-Service cloud with Virtual Machine (VM) instances
running on Physical Machines (PMs) was assumed. The model was evaluated for
different values of delay, throughput, and bandwidth for each VM to obtain the
minimum cost of delivering web content to users. An algorithm was developed, and
sample data were collected from the Amazon Elastic Compute Cloud/storage pricing
model to obtain optimal results. The algorithm was implemented using 'A
Mathematical Programming Language / Modular In-core Nonlinear Optimization
System' (AMPL/MINOS). |
Keywords: |
Quality of Service, Resource allocation, Web content, Web content delivery,
cloud computing |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
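The 0/1 formulation described in this abstract can be illustrated with a toy program: each binary decision variable selects a VM instance, and the objective is the cheapest selection whose delay, throughput, and bandwidth satisfy the QoS constraints. The VM figures and constraint values below are made up for illustration; the paper solves its real model with AMPL/MINOS over Amazon EC2 pricing data.

```python
from itertools import product

# Hypothetical VM instances: hourly cost, delay (ms), throughput and
# bandwidth (Mbps). Real inputs would come from EC2 pricing data.
vms = [
    {"cost": 0.10, "delay": 40, "throughput": 100, "bandwidth": 50},
    {"cost": 0.20, "delay": 25, "throughput": 250, "bandwidth": 120},
    {"cost": 0.45, "delay": 10, "throughput": 600, "bandwidth": 300},
]

def min_cost_selection(vms, max_delay, min_throughput, min_bandwidth):
    """Brute-force the 0/1 decision vector x (x[i] = 1 selects VM i),
    keeping the cheapest feasible selection. A real solver replaces this
    enumeration, but the feasibility/objective logic is the same."""
    best, best_cost = None, float("inf")
    for x in product([0, 1], repeat=len(vms)):
        chosen = [v for v, xi in zip(vms, x) if xi]
        if not chosen:
            continue
        feasible = (max(v["delay"] for v in chosen) <= max_delay
                    and sum(v["throughput"] for v in chosen) >= min_throughput
                    and sum(v["bandwidth"] for v in chosen) >= min_bandwidth)
        if feasible:
            cost = sum(v["cost"] for v in chosen)
            if cost < best_cost:
                best, best_cost = x, cost
    return best, best_cost

x, cost = min_cost_selection(vms, max_delay=30, min_throughput=300,
                             min_bandwidth=150)
```

Under these invented constraints, the slow VM is excluded by the delay bound and the single large VM is cheaper than any combination that meets the throughput floor.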
|
Title: |
EFFECT OF CLUSTERING DATA IN IMPROVING MACHINE LEARNING MODEL ACCURACY |
Author: |
SAMIH M. MOSTAFA, HIROFUMI AMANO |
Abstract: |
Supervised machine learning algorithms consider the relationship between
dependent and independent variables rather than the relationship between
instances. Machine learning algorithms try to learn the relationship between
input and output from historical data in order to attain precise predictions
about unseen future data. Conventional forecasting algorithms are usually based
on a model learned and trained from historical data. The instances in the
historical data may vary in their characteristics. This variation may result
from differences in an instance's degree of pertinence to some cases compared to
others. However, the problem with such machine learning algorithms is that they
deal with the whole data set without considering this variation. This paper
presents a novel technique for improving the prediction accuracy of the trained
model. The proposed method clusters the data using the K-means clustering
algorithm and then applies the prediction algorithm to every cluster. The value
of K that gives the highest accuracy is selected. The authors performed a
comparative study of the proposed technique and popular prediction methods,
namely Linear Regression, Ridge, Lasso, and Elastic Net. In an analysis of five
datasets with different sizes and different numbers of clusters, it was observed
that the accuracy of the proposed technique is better in terms of Root Mean
Square Error (RMSE) and the coefficient of determination (R^2). |
Keywords: |
Prediction accuracy, K-means, clustering, regression, machine learning
algorithms. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
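The cluster-then-regress idea in this abstract can be sketched as follows: partition the data, fit one regression model per partition, and compare against a single global fit. The piecewise-linear data are invented, and a simple threshold split stands in for K-means (on one-dimensional well-separated data K-means converges to the same split); this is an illustration of why per-cluster models can beat one global model, not the paper's implementation.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def rmse(preds, ys):
    """Root mean square error between predictions and targets."""
    return (sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(ys)) ** 0.5

# Invented piecewise data: slope 1 for x < 10, slope 5 afterwards.
xs = list(range(20))
ys = [x if x < 10 else 5 * x - 40 for x in xs]

# One global linear model fit to all instances.
a, b = fit_line(xs, ys)
global_rmse = rmse([a * x + b for x in xs], ys)

# Threshold split standing in for K-means with K = 2, then one fit per cluster.
clusters = [[i for i in xs if xs[i] < 10], [i for i in xs if xs[i] >= 10]]
preds = [0.0] * len(xs)
for idx in clusters:
    ca, cb = fit_line([xs[i] for i in idx], [ys[i] for i in idx])
    for i in idx:
        preds[i] = ca * xs[i] + cb
clustered_rmse = rmse(preds, ys)
```

Each cluster is exactly linear, so the per-cluster fits are exact while the single global line is not.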
|
Title: |
UNCERTAIN INPUT SELECTION MODEL FOR NEURON |
Author: |
ZULFIAN AZMI, MUHAMMAD ZARLIS, HERMAN MAWENGKANG, SYAHRIL EFENDI |
Abstract: |
Artificial Neural Network models do not provide optimal results when learning
with a large quantity of inputs in real time. Inputs stated as a matrix with a
large number of entries slow down the pattern recognition process. A model is
required to minimize the inputs used for training and to recognize patterns
faster. Input recognition is required to identify the special characteristics of
inputs that may represent all inputs in a model. In addition, input recognition
is necessary to handle inputs whose values are not only the binary values 1 and
0 but may also lie in between. Uncertain inputs are identified by determining
the degree of membership of each variable, and input selection is performed by
declaring each input as a row vector and calculating the Euclidean distance
between the row vectors. The selected inputs may then represent all inputs for
training. Training is carried out with input variables consisting of dissolved
oxygen, water pH, salinity, and water temperature to determine the quality of
water. The model algorithm, called the Uncertain Input Selection Model for
neurons, helps accelerate training in a system that determines whether a water
wheel can rotate. |
Keywords: |
Selection, Input, Uncertain, Euclidean |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
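The row-vector selection step this abstract describes can be sketched directly: treat each input as a row vector, compute Euclidean distances between rows, and keep one representative from each group of near-duplicate rows. The water-quality readings and the distance threshold below are hypothetical (the four variables match the paper's: dissolved oxygen, pH, salinity, temperature); the paper's own selection rule may differ.

```python
import math

# Hypothetical readings: [dissolved oxygen, pH, salinity, temperature].
rows = [
    [6.5, 7.2, 30.0, 27.0],
    [6.6, 7.1, 30.2, 27.1],   # near-duplicate of row 0
    [4.0, 6.0, 35.0, 31.0],
    [6.4, 7.3, 29.9, 26.9],   # near-duplicate of row 0
]

def select_representatives(rows, threshold=1.0):
    """Keep a row only if its Euclidean distance to every already-selected
    row exceeds the threshold, so near-duplicate inputs are dropped and the
    survivors represent the full input set for training."""
    selected = []
    for r in rows:
        if all(math.dist(r, s) > threshold for s in selected):
            selected.append(r)
    return selected

reps = select_representatives(rows)
```

Here the two near-duplicates of the first reading are dropped, leaving two representative training inputs instead of four.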
|
Title: |
AN EVOLUTIONARY APPROACH FOR SOFTWARE DEFECT PREDICTION ON HIGH DIMENSIONAL DATA
USING SUBSPACE CLUSTERING AND MACHINE LEARNING |
Author: |
SUMANGALA PATIL, A.NAGARAJA RAO, C. SHOBA BINDU |
Abstract: |
Over the last decade, due to increasing demand, a huge amount of software has
been developed, and data-intensive applications have increased the complexity of
these types of systems. During the development process, software bugs may
severely impact the growth of industries. Hence, the development of bug-free
software applications is highly recommended for real-time systems. Several
approaches developed recently are based on manual inspection, but those
techniques are not recommended for large-scale software development scenarios
due to the high chance of error during manual inspection. Thus, machine learning
based data mining techniques have gained huge attention from researchers due to
their ability to analyze and efficiently detect defects by learning from
different attributes. In this work, we present a machine learning based approach
for software defect prediction. However, software defect datasets suffer from
high dimensionality issues; thus, we present a novel subspace clustering
approach using evolutionary computation based optimal solution identification
for dimension reduction. Later, a Support Vector Machine classification scheme
is implemented to obtain the defect prediction performance. The proposed
approach is implemented using the MATLAB simulation tool on the NASA software
defect dataset. A comparative study is presented which shows that the proposed
approach achieves better performance when compared with existing techniques. |
Keywords: |
Software Defect Prediction, Evolutionary Computation, Subspace Clustering, High
Dimensional Data, Machine Learning |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
A REVIEW OF CONSTRUCTIVE INTERFERENCE FLOODING IN WIRELESS SENSOR NETWORKS |
Author: |
HUDA A. H. ALHALABI, TAT-CHEE WAN |
Abstract: |
Constructive interference is a promising concurrent transmission technique in
which multiple senders concurrently transmit the same packet in wireless sensor
networks (WSNs). It enables reliable and fast network flooding in order to
decrease the scheduling overhead of MAC protocols, improve the link quality of
lossy links, achieve accurate time synchronization, and realize efficient data
collection. This paper discusses the concept of constructive interference, its
importance, its pre-conditions, and the open issues and challenges that should
be considered by researchers. As a literature review, it aims to provide deeper
insight into this emerging technique. |
Keywords: |
Constructive Interference, Wireless Sensor Networks, Flooding, Concurrent
Transmissions, Synchronization |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
MODEL FOR ESTIMATING BUS ARRIVAL TIMES BY COMPARING VARIOUS CLASSIFICATIONS |
Author: |
HAEKAL MOZZIA PUTRA, MOHAMMAD FAIDZUL NASRUDIN |
Abstract: |
The availability of reliable and precise bus information such as bus arrival
times renders public transportation more attractive. It helps passengers plan by
reducing their waiting times. This paper aims to develop an estimated time of
arrival (ETA) model that is based on a comparison of various classifications and
groupings of real-time bus tracking data. The empirical analysis results
demonstrate that the prediction accuracy differs across methods, even using the
same dataset. The methodology consists of three stages: literature review and
identification of existing problems; development of an ETA model; and testing
and comparison of models. Data are obtained from a mobile bus tracking
application, namely, BasKita, in Universiti Kebangsaan Malaysia (UKM). Data such
as the route ID, bus stop ID, distance, day, time interval and log time are used
as features. Groupings of data are suggested, such as daily data, data by the
path, and the complete data set. In this paper, linear regression (LR),
artificial neural network (ANN), and sequential minimal optimization regression
(SMOreg) are used to develop the model. The performances of the ETA models are
compared via the correlation coefficient (CC) method and in terms of the root
mean square error (RMSE) and the mean absolute error (MAE). This work uses
moving average (MA) technical analysis on the data to reduce the estimation
error. The results that are obtained using the ANN method with daily grouping
and using MA as a feature are the most accurate. The results of this study
contribute to the development of an ETA model that can achieve satisfactory
accuracy to increase the quality of the bus service. |
Keywords: |
Estimated Time of Arrival, Machine Learning, UKM |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
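One ingredient of the ETA model above, the moving-average (MA) feature used to reduce estimation error, can be sketched on its own; the travel-time log below is invented, and in the paper the smoothed series would be one of the features fed to the LR/ANN/SMOreg models.

```python
def moving_average(values, window=3):
    """Simple moving average over a sliding window; early points average the
    samples seen so far, so the output has the same length as the input."""
    out = []
    for i in range(len(values)):
        span = values[max(0, i - window + 1): i + 1]
        out.append(sum(span) / len(span))
    return out

# Invented log of travel times (minutes) between two stops on one route;
# the spike at index 4 might be a traffic jam that smoothing dampens.
travel_times = [12, 15, 11, 14, 30, 13, 12]
smoothed = moving_average(travel_times)
```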
|
Title: |
VICTIM LOCALIZATION USING DARWINIAN IMPERIALIST COMPETITIVE ALGORITHM |
Author: |
AHMAD AL_QEREM |
Abstract: |
Grid search for a victim in unknown environments using a swarm of robots tends
to be impractical. Much research has been done using evolutionary algorithms for
victim localization, where the search problem is modelled as an optimization
problem. Convergence to the optimal solution is one of the key factors in such a
problem. Due to the harsh and unpredictable nature of the environment,
cooperative navigation and victim identification in these environments are
incredibly challenging to achieve. Obstacle avoidance, robot collision handling,
and intra-swarm communication are still needed to assist the victim localization
process in unpredictable situations. In this study, the Robotics Darwinian
Imperialist Competitive Algorithm (RDICA) has been used for victim localization;
to avoid the local optima problem, an exploration facility using the
do-revolution feature has been employed. Multiple experiments conducted under
different scenarios revealed distinctive results. The average number of crashed
robots was significantly reduced. Moreover, improvements of 7.3% and 6.93% were
achieved over the well-known state-of-the-art DPSO and RDPSO algorithms,
respectively, while preserving a comparable number of iterations to obtain these
results. |
Keywords: |
Imperialist Competitive Algorithm, Victim Localization, Swarm Robotics,
Do-Revolution |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
SYSTEMATIC REVIEW OF DATA QUALITY RESEARCH |
Author: |
M. IZHAM JAYA, FATIMAH SIDI, LILLY SURIANI AFFENDEY, MARZANAH A. JABAR, ISKANDAR
ISHAK |
Abstract: |
Data quality draws major concern when dealing with data, especially when
insightful outputs are needed. Research in data quality has emerged across
various topics, and diversification in the known knowledge and the approaches
used is inevitable. In this paper, we apply a systematic review to explain the
landscape of data quality and to identify available research gaps by using
categorization and mapping. Our search scope is limited to research articles
from journals, conference proceedings, and magazines published between 2010 and
2016. We defined three types of main categorization to map the selected research
articles and to answer our research questions. These categorizations focus on
research topic, research type, and contribution type. On average, fifty-four
research articles related to data quality were published every year. This number
shows the importance of data quality research across topics such as online
users, databases, web information, sensors, and big data. This study also
indicates that almost half of the selected articles proposed a novel solution or
an essential extension of an existing data quality technique. Moreover, most of
the selected research articles belong to the model type in the contribution
category. Our mapping also suggests an obvious disparity between contributions
of the metric type and the model type. |
Keywords: |
Data Quality, Information Quality, Systematic Review |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
MALWARE VISUALIZATION TECHNIQUE: A SYSTEMATIC REVIEW |
Author: |
ABDALRAHMAN ALFAGI, AZIZAH ABD MANAF, AZIDA ZAINOL, ALAA ABDULSALAM ALAROOD |
Abstract: |
Recently, there has been a massive increase in the number of malware types,
which poses a severe threat to smart devices and to internet security. Thus,
different techniques have been applied to detect, classify, and identify
malware. Among those techniques, visualization has become the most attractive
and popular. Visualization techniques have been applied to view static data,
monitor network traffic, or manage networks to detect and visualize the behavior
of malware. Addressing malware visualization techniques is of prime importance
for protecting smart devices, monitoring network traffic, and securing internet
and digital resources. Although there are some literature review papers on
malware detection techniques, none of them is a Systematic Literature Review
(SLR) that details a range of related work and provides a systematic and
rigorous approach to illustrate the current trend of malware detection
techniques. In contrast, this paper follows general guidelines for conducting an
SLR to illustrate malware visualization techniques and their applications,
statistically showing the most common malware types and the extracted features
that are used to identify malware. In this paper, an advanced search was
performed in the most relevant digital libraries to obtain potentially relevant
articles published until the end of 2016. About 80 primary studies (PSs) were
identified based on inclusion and exclusion criteria. The analytical study is
mainly based on the PSs to achieve the paper's objectives. The results
illustrate the importance of visualization techniques, identify the most common
malware, and highlight the most useful features. |
Keywords: |
Malware Detection, Malware Visualization, Malware Visualization Technique,
Systematic Literature Review, Malware Classification |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
THE ADVANCES OF INFORMATION TECHNOLOGY GOVERNANCE IN UNIVERSITIES: A SYSTEMATIC
REVIEW |
Author: |
ALEJANDRA OÑATE-ANDINO, DAVID MAURICIO |
Abstract: |
This study analyzes the advances of information technology governance (ITG) in
the university context, through a systematic review of the literature. The
following research questions are addressed: What progress has been made? What
implementation studies have been developed? What factors influence success? What
models have been created? And lastly, what are the relevant case studies? In the
period from 2006-2017, 17 studies reported advances in the five areas of IT
governance, 9 studies referred to aspects related to implementation, zero
studies were identified that analyzed critical success factors, 6 studies
developed ITG models and 34 studies analyzed implementation initiatives. Our
analysis of the relevant studies shows that the advances of ITG in the
university context are still incipient. |
Keywords: |
Systematic Literature Review (SLR), IT Governance, Information Technology
Government, Universities, Information Technologies Government Systems, Higher
Education.
|
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
A REVIEW OF DIGITAL BREAST TOMOSYNTHESIS HARDWARE TECHNIQUES |
Author: |
ABUAJELA ARHUMA GHUMEDH, S. MASHOHOR, W.A.W. ADNAN, ROZI MAHMUD |
Abstract: |
Digital Mammography (DM) is a well-rooted imaging mode for early breast cancer
detection and diagnosis. After the introduction of digital imaging in the field
of radiology, several progressive methodologies have been developed,
specifically tomographic imaging methodologies that are capable of capturing
intricate details. Three-dimensional (3D) Digital Breast Tomosynthesis (DBT) is
one such methodology; it has witnessed extensive penetration in clinics and
possesses the capabilities to replace DM for breast cancer screening in the
future. Many pre-acquisition processes of DBT influence its clinical
performance. Therefore, this research presents a comprehensive review of DBT
system hardware design, X-ray supply geometry optimization, radiation dose to
breast glandular tissue, X-ray image scatter, and breast compression
minimization. |
Keywords: |
Digital Breast Tomosynthesis, Acquisition Geometry, Image Acquisition, Radiation
Dose, X-Ray Scatter. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
SDCM: SECURE DYNAMIC END-TO-END CONGESTION AVOIDANCE PROTOCOL FOR MANETS |
Author: |
RUSHDI A. HAMAMREH |
Abstract: |
Mobile Ad-hoc Networks (MANETs) are vulnerable to various attacks. In addition,
congestion can occur due to limited resources and lead to high packet loss, long
delay, and wasted resource utilization time. The major objective of congestion
control is to best utilize the available network resources by keeping the load
below the capacity. The great demand for capacity places particular emphasis on
congestion management approaches. Recently, researchers have developed many
effective and well-studied algorithms for congestion control within the
Transmission Control Protocol (TCP) to improve its performance in MANET
environments. TCP is designed to be reliable and to ensure end-to-end delivery
in wired networks. However, each existing TCP variant over MANET has its
weaknesses and strengths under changing MANET factors such as node mobility,
traffic load, network size, and wireless channel conditions. In this paper, a
new approach is proposed to decrease packet loss using secure and dynamic path
congestion estimation. The simulation results show an improvement in TCP
performance and security over MANET in different scenarios. |
Keywords: |
TCP-VEGAS; TCP-WESTWOOD; TCP-WELCOME; TCP-DCM; MANET; Congestion; Link Failure;
Signal Loss; RTO. |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
MODIFIED FAST GRAY LEVEL GROUPING APPROACH FOR ENHANCING IMAGE CONTRAST |
Author: |
OSAMA HOSAM, HANI M. IBRAHIM, MAHMOUD ELMEZAIN, SAHAR SHOMAN, ALY MELIGY |
Abstract: |
Images with low contrast may have intensity properties that make it difficult
for traditional algorithms and approaches to enhance their contrast. An image's
histogram may contain high frequency values in a specific area of the image and
low frequencies in the remaining areas. This leads to inconsistency in the
histogram and results in an image with unacceptable contrast. Two algorithms are
proposed to solve this problem, namely the MFGLG and ACEIF/MFGLG approaches. The
first approach is Modified Fast Gray Level Grouping (MFGLG), a modification of
Fast Gray Level Grouping (FGLG). MFGLG uses two sets of gray level bins as
startup bins assigned to the histogram. The proposed approach results in more
efficient contrast enhancement compared to FGLG. The algorithm requires no user
intervention. Moreover, the proposed algorithm can be applied to a vast range of
contrast-perturbed images, including images in which the highest frequency in
the histogram is concentrated in any image location. The second approach is
Automatic Contrast Enhancement Image Fusion based on Modified Fast Gray Level
Grouping (ACEIF/MFGLG). With ACEIF/MFGLG, the output of MFGLG is fused with the
original image to obtain a more detailed image; the original image is used to
provide more accurate contrast and intensity. The two proposed approaches are
applied to low-contrast images and provide high-quality images. The algorithm
can be applied without human intervention at acceptable speed compared to other
methods in the literature. |
Keywords: |
Image Fusion, Histogram, Contrast Enhancement, Image Enhancement |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
Title: |
A METAHEURISTIC APPROACH FOR STATIC SCHEDULING BASED ON CHEMICAL REACTION
OPTIMIZER |
Author: |
Over the past several decades, scheduling has emerged as an area of critical
research, thereby constituting a requisite process for myriad applications in
real life. In this regard, many researchers have experimented and utilized
various optimization algorithms to obtain optimized schedules. It is also
noteworthy that the concepts of some optimization algorithms are essentially
derived from nature. This paper aims to augment a compiler using a chemical
reaction optimizer in order to identify an optimized instructions static
schedule capable of being used within both single and multicore computer
systems. This scheduling algorithm, which is denoted as SS-CRO (static
scheduling using chemical reaction optimizer), is unique in that it provides
alternative schedules involving different costs. Subsequently, SS-CRO tests the
schedules in accordance with different types of instructions dependencies before
making an appropriate selection. SS-CRO demonstrates that it can not only
provide different schedule orders, but also make a competent selection of
accepted solutions, whilst dismissing the inappropriate ones in a reasonable
span of time. So, this paper presents SS-CRO algorithm that is used to obtain an
optimized static scheduling, where SS-CRO has been implemented and evaluated
analytically and experimentally. As analytical results, the number of steps for
the SS-CRO approximately is O(Num_iteration×CROFun), where CROFun is the number
of steps of the selected function. In the experiments results, SS-CRO achieved
better execution time and higher accepted solutions in comparison with other
optimization algorithms such as; SS-DA (static scheduling using duelist
algorithm) and SS-GA (static scheduling using genetic algorithm). Furthermore,
SS-CRO achieved the maximum percentage of number of solutions with respect to
the execution time of all experiments for all proposed input cases, which is
ranged as (10%-30%). |
Abstract: |
OMAYYA MURAD, RIAD JABRI, BASEL A. MAHAFZAH |
Keywords: |
Chemical Reaction Optimizer, Compiler, Instruction Set, Metaheuristic Approach,
Static Scheduling |
Source: |
Journal of Theoretical and Applied Information Technology
15th November 2019 -- Vol. 97. No. 21 -- 2019 |
Full
Text |
|
|
|