Journal of Theoretical and Applied Information Technology
June 2018, Vol. 96, No. 12

Title: FEATURE SELECTION USING MODIFIED ANT COLONY OPTIMIZATION APPROACH (FS-MACO) BASED FIVE LAYERED ARTIFICIAL NEURAL NETWORK FOR CROSS DOMAIN OPINION MINING
Author: DR. E. CHANDRA BLESSIE, S. GNANAPRIYA
Abstract:
Web mining and web usage mining attract many researchers to propose new ideas and models and to deploy machine learning algorithms. Internet usage has expanded to almost all kinds of applications, including e-commerce. E-commerce enables consumers to buy products online, while web analytics helps website administrators identify which products sell best. Opinion mining is key to analytics in many decision-making tasks in the e-commerce arena. This research work proposes feature selection using a modified ant colony optimization approach (FS-MACO) based on a five-layered artificial neural network for cross-domain opinion mining. The dataset consists of reviews of products such as books, DVDs, electronics and kitchen appliances. Features are identified using the modified ACO, and opinion mining is performed using the ANN. Accuracy and F-measure are the metrics chosen for evaluating the performance of the proposed work. A comparison of domain-specific and domain-dependent words is presented. The results show that the proposed work outperforms the existing work in terms of the chosen performance metrics.
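The evaluation metrics named above can be computed directly from a binary confusion matrix; a minimal sketch, with counts that are purely illustrative rather than the paper's results:

```python
def evaluate(tp, fp, fn, tn):
    """Accuracy and F-measure from binary confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_measure = 2 * precision * recall / (precision + recall)
    return accuracy, f_measure

acc, f1 = evaluate(tp=90, fp=10, fn=10, tn=90)   # illustrative counts, not the paper's
```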
Keywords: Opinion Mining, E-Commerce, Optimization, Neural Network, Accuracy, F-Measure
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: NEURAL NETWORK WITH PSO BASED TRAINING TO DETECT SPOOFING WEBSITES
Author: SOZAN ABDULLAH MAHMOOD, NOOR GHAZI M. JAMEEL, SHAIDA JUMA SAYDA
Abstract:
With the advent of the Internet, various online attacks have increased; among the most well-known is the spoofing attack. Web spoofing is a type of spoofing in which fraudsters build fake websites that duplicate real ones. Spoofing websites imitate legitimate websites and lure users into visiting them in order to steal users' sensitive personal information or install malware on their devices. Scammers then use the stolen information for illegal purposes. The specific intention of this paper is to build a new intelligent system that distinguishes between trusted and spoofing websites that mimic trusted sites, since it is very difficult to recognize visually whether a site is spoofed or legitimate. This paper addresses the detection of spoofing websites using a Neural Network (NN) trained with the Particle Swarm Optimization (PSO) algorithm. An information gain algorithm is used for feature selection, a useful step that removes unnecessary features and reduces time. Information gain improves classification accuracy by reducing the number of extracted features, which are then used as input for training the NN with PSO. Training the neural network with PSO requires less training time and achieves higher accuracy (99.18%) than an NN trained with the backpropagation algorithm, which takes more time to train and reaches 98.20% accuracy. The proposed technique is evaluated on a dataset of 2500 spoofing sites and 2500 legitimate sites. The results show that the technique can detect over 99.18% of spoofing sites with the PSO-trained NN.
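As a rough illustration of the training scheme (not the authors' implementation), the sketch below uses a standard PSO loop to fit the weights of a single logistic neuron on a toy, linearly separable dataset; the dataset, hyperparameters and network size are all invented for the example:

```python
import math, random

random.seed(0)

# Toy dataset (invented): two features, label 1 when x0 + x1 > 1.
data = [((x0, x1), 1 if x0 + x1 > 1 else 0)
        for x0 in (0.0, 0.3, 0.6, 0.9, 1.2)
        for x1 in (0.0, 0.4, 0.8, 1.2)]

def predict(w, x0, x1):
    """Single logistic neuron with weights w = (w0, w1, bias)."""
    z = w[0] * x0 + w[1] * x1 + w[2]
    z = max(-60.0, min(60.0, z))   # clamp to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-z))

def loss(w):
    """Mean squared error of the neuron over the toy dataset."""
    return sum((predict(w, x0, x1) - y) ** 2 for (x0, x1), y in data) / len(data)

# Standard PSO: velocities are pulled toward each particle's best and the swarm's best.
n_particles, dim, iters = 20, 3, 200
pos = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]
pbest_loss = [loss(p) for p in pos]
gbest = min(pbest, key=loss)
gbest_loss = loss(gbest)

for _ in range(iters):
    for i in range(n_particles):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        cur = loss(pos[i])
        if cur < pbest_loss[i]:
            pbest[i], pbest_loss[i] = pos[i][:], cur
            if cur < gbest_loss:
                gbest, gbest_loss = pos[i][:], cur

accuracy = sum((predict(gbest, x0, x1) > 0.5) == (y == 1)
               for (x0, x1), y in data) / len(data)
```

The paper's system would replace the toy neuron with a full NN over website features, but the particle update rule is the same.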
Keywords: Web Spoofing, Information Gain, Neural Network, Particle Swarm Optimization
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: FROM THE W3C GOOD SOCIAL ONTOLOGIES TO A UNIFIED SEMANTIC MODEL FOR OSN
Author: ASMAE EL KASSIRI, FATIMA-ZAHRA BELOUADHA
Abstract:
Online Social Networks (OSN) are networks formed by social media users and their interactions. Exploiting semantic web technologies to represent OSN data, this paper proposes using the W3C-recommended ontologies, namely the Friend Of A Friend (FOAF) ontology and the Semantically-Interconnected Online Communities (SIOC) ontology. To meet interoperability, aggregation and analysis needs, we also propose FOAF and SIOC extensions that contribute to engineering a Unified Semantic Model (USM) for OSN. The Unified Semantic Model will permit modelling the most popular social media.
Keywords: FOAF, SIOC, SIOCA, SIOCT, exSIOCInt, exFOAF, exSIOCA, exSIOCT, Unified Semantic Model (USM), OSN
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: EFFECT OF GAMIFICATION ON E-LEARNING TO SUPPORT LEARNING ACHIEVEMENT AND LEARNING MOTIVATION
Author: TOMMY PRASETYO AJI, TOGAR ALAM NAPITUPULU
Abstract:
Gamification is the use of game elements in non-game contexts to motivate and enhance user activity. With the recent rapid adoption of gamification for e-learning, an interesting open question for educators is whether it can motivate students and improve their achievement. The purpose of this study is to measure the effect of gamification in e-learning on learning achievement and learning motivation. This is done by comparing the traditional (classroom) learning method with a gamified e-learning method. The researchers developed a gamified e-learning prototype to support the study, incorporating several game mechanics such as points, levels, challenges, and leaderboards. The data were taken from distributed questionnaires and the study reports of 24 students at a junior high school. Two questionnaires were used, a pre-questionnaire and a post-questionnaire, for the traditional method and the gamified e-learning respectively, while the learning reports were obtained from the students' semester reports and from reports produced during the use of the gamified e-learning. The data were processed with SmartPLS v.3.2.6 and Microsoft Excel 2016 and analyzed using a paired t-test. The results show that gamified e-learning does not positively affect student motivation in behavioral, emotional or cognitive terms. Furthermore, it also does not positively affect students' learning achievement. The role of the teacher and the one-time use of the gamified e-learning are the likely causes. The challenge for further research is how to substitute the teacher's role.
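The paired t-test used in the analysis compares each student's before and after scores; a self-contained sketch using only Python's standard library (not SmartPLS), with invented scores for six students rather than the study's 24:

```python
import math

def paired_t(before, after):
    """Paired t-statistic: mean of per-subject differences over its standard error."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n), n - 1               # t-statistic, degrees of freedom

# Invented before/after scores for 6 students.
before = [70, 65, 80, 72, 60, 75]
after = [74, 66, 82, 75, 61, 77]
t, df = paired_t(before, after)
```

The resulting t is then compared against the t-distribution with `df` degrees of freedom to decide significance.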
Keywords: Gamification, E-Learning, Learning Motivation, Learning Achievement, Game Mechanics
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: TEXT CRYPTOGRAPHY USING MULTIPLE ENCRYPTION ALGORITHMS BASED ON CIRCULAR QUEUE VIA CLOUD COMPUTING ENVIRONMENT
Author: KHALID KADHIM JABBAR, HUSSIN ABD HILAL, RANA SAAD MOHAMMED
Abstract:
With the tremendous development of communication technology, it has become necessary to use cloud computing systems that store data within a virtual structure, and with the increasing volume of important data, securing this data through diverse and complex techniques has become necessary to ensure integrity, confidentiality, and security. This paper presents a method to encrypt messages of different sizes in a cloud computing environment using several encryption algorithms, namely the Advanced Encryption Standard (AES), RSA, and an Advanced Encryption Standard proposal (AESP), to make the method more secure and effective. The method is controlled by a circular queue responsible for scheduling which algorithm is applied, as defined by a secret code generated by a control key; this secret code changes with each message and is later used to infer which algorithm was used in encryption when the message is decoded. It is also possible to generate multiple random keys that vary according to the encryption algorithm used. The experimental results show that the proposed method can encrypt and decrypt text messages of different sizes in a short time by exploiting the properties of several algorithms scheduled on a circular queue inside the cloud computing system; the proposed method consumes less time than each method used alone, while important criteria such as integrity, complexity, usability, and security are taken into consideration to make the method more effective and efficient.
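The circular-queue scheduling idea can be sketched as follows. The "ciphers" here are toy XOR stand-ins for AES/RSA/AESP (deliberately NOT real cryptography), and the whole layout is our assumption for illustration, not the paper's implementation:

```python
from collections import deque

# Toy stand-in ciphers (NOT real AES/RSA/AESP): each XORs with a different key byte.
def make_xor_cipher(key):
    f = lambda msg: bytes(b ^ key for b in msg)
    return f   # XOR is its own inverse, so the same function encrypts and decrypts

ciphers = {"AES": make_xor_cipher(0x5A),
           "RSA": make_xor_cipher(0x3C),
           "AESP": make_xor_cipher(0xA7)}

# Circular queue of algorithm names; rotating it selects the next algorithm.
queue = deque(["AES", "RSA", "AESP"])

def encrypt_next(message):
    """Encrypt with the algorithm at the head of the queue, then rotate.
    The returned name plays the role of the secret code that tells the
    receiver which algorithm to invert."""
    name = queue[0]
    queue.rotate(-1)
    return name, ciphers[name](message)

def decrypt(name, ciphertext):
    return ciphers[name](ciphertext)

code1, ct1 = encrypt_next(b"first message")
code2, ct2 = encrypt_next(b"second message")
```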
Keywords: Integrity, Confidentiality, RSA, AES, AESP
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: ENHANCING THE RETRIEVAL OF SOCIAL WEB-BASED E-LEARNING CONTENT USING SEMANTICS EXTRACTED FROM DBPEDIA AND WORDNET ONTOLOGIES
Author: AMMAR ALNAHHAS, BASSEL ALKHATIB, AHMAD OMAR
Abstract:
As e-learning tools and techniques become more common and compelling, much research has emerged lately that aims at making them more flexible and applicable. Moreover, the content is growing very large nowadays, so it is very important to develop more accurate and robust search techniques that help users find the best learning materials available across the web, especially on social learning websites. In this paper we propose a new method to collect, index and retrieve learning materials. A collection algorithm is presented that can bring together content from various sources. We also present a semantic indexing method that weights the words of a document based on both the DBpedia and WordNet ontologies, and that achieves more accurate results according to the analysis and comparison shown in this paper.
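One way to picture ontology-boosted indexing (our own toy sketch, not the authors' algorithm): start from plain term frequency and boost terms that resolve to a known concept, with a hard-coded set standing in for real DBpedia/WordNet lookups:

```python
from collections import Counter

# Toy stand-in for a DBpedia/WordNet lookup: a set of known concept terms (invented).
known_concepts = {"algorithm", "ontology", "network"}

def semantic_weights(text, boost=2.0):
    """Term frequency, boosted for terms that resolve to a known concept."""
    tf = Counter(text.lower().split())
    return {term: count * (boost if term in known_concepts else 1.0)
            for term, count in tf.items()}

w = semantic_weights("ontology driven search ranks ontology and network terms")
```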
Keywords: E-learning, DBpedia, WordNet, Search Engine, Social Web
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: ENHANCEMENT OF MAXIMAL RATIO COMBINER IN ULTRA-WIDEBAND WIRELESS RAKE RECEIVER
Author: JAAFAR A. ALDHAIBAINI, MOHANAD S. ALKHAZRAJI, NAEL A. AL-SHAREEFI
Abstract:
When several users coexist, considerable internal and external interference, such as inter-symbol interference (ISI), acts on the multipath reception technique used for receiving data at high data rates in ultra-wideband (UWB) technology. For these reasons, the UWB wireless rake receiver was designed to combine many received resolvable paths, normally more than 100. Rake receiver structures in the literature have been proposed and designed at the cost of increased system complexity; these designs aim at better performance in outdoor or indoor multipath reception by reducing the probability of bit error (Pe), or bit error rate (BER). Many recent rake receivers have therefore been designed to decrease the number of resolvable paths that must be estimated and combined by the combiner. To achieve this goal, we propose two combiner structures for the rake receiver: the Signal Separation Maximal Ratio Combiner Partial Rake (SS-MRC-PR) receiver and the Signal Separation Maximal Ratio Combiner Selective Rake (SS-MRC-SR) receiver. These combiners separate the received signals by their signs and compare the number of negative and positive signs. The proposed receivers reduce system complexity with fewer correlators and improve system performance with a very low Pe at increased signal-to-noise ratio (S/N). Before the decision circuit, a comparator compares the negative and positive quantities to decide whether the transmitted bit is 0 or 1. The Pe was simulated in MATLAB for wireless multipath environments using impulse-radio time-hopping binary phase shift keying (TH-BPSK) modulation, and the results were compared with those of the Conventional Maximal Ratio Combiner Partial Rake (C-MRC-PR) and Conventional Maximal Ratio Combiner Selective Rake (C-MRC-SR) receivers. The comparison shows a reduction of Pe versus S/N for the proposed receivers relative to the conventional ones, which supports the proposed designs.
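Maximal ratio combining itself (the conventional baseline above) weights each resolvable path by the conjugate of its channel gain before summing; a minimal complex-baseband sketch with invented gains and no noise:

```python
# Maximal ratio combining (the conventional combiner above): each path's sample
# is weighted by the conjugate of that path's channel gain, so per-path phases
# align and stronger paths count more. Gains and symbol are invented.
gains = [0.9 + 0.1j, 0.4 - 0.3j, 0.2 + 0.2j]   # resolvable multipath gains
symbol = 1.0                                    # transmitted BPSK symbol (+1)

received = [h * symbol for h in gains]          # noiseless per-path samples

combined = sum(h.conjugate() * r for h, r in zip(gains, received))
decision = 1 if combined.real >= 0 else -1      # BPSK decision circuit
```

In the noiseless case the combined value is simply the sum of squared gain magnitudes times the symbol, which is why MRC maximizes the output SNR.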
Keywords: Rake Receiver Structures, SS-MRC-PR and SS-MRC-SR, Sign Separation of Signals, Conventional MRC, Pe and BER
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: NEW MICROSTRIP FILTER FOR MIMO WIRELESS AND COMPUTER SYSTEMS
Author: Aqeel H. Al-fatlawi, Mohammed A. Taha
Abstract:
This paper proposes a new microstrip bandpass filter to suppress image bands as far as possible for MIMO antenna configurations in IEEE 802.11b wireless networks. The insertion loss and return loss of the proposed filter are 0.4 dB and 26.4 dB respectively, which makes it very suitable for implementation with a MIMO antenna configuration. This enhanced MIMO prototype structure can significantly improve a wireless system by exploiting multipath propagation productively. These enhancements can be used to increase the reliability of simultaneously transmitted data streams, improve wireless system capacity (multiplexing gain) and reduce the bit error rate of the wireless system. Along with the filter frequency response, the improvements are investigated in terms of phase response, group delay and bit error rate (BER) for various antenna configurations using QPSK modulation.
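For context, the quoted losses translate to linear S-parameter magnitudes through the standard dB conversion; the sketch below applies it to the paper's 0.4 dB and 26.4 dB figures:

```python
def db_loss_to_magnitude(loss_db):
    """Convert a loss in dB to a linear S-parameter magnitude: |S| = 10**(-dB/20)."""
    return 10 ** (-loss_db / 20)

s21 = db_loss_to_magnitude(0.4)    # insertion loss 0.4 dB -> |S21| about 0.955
s11 = db_loss_to_magnitude(26.4)   # return loss 26.4 dB  -> |S11| about 0.048
```

In other words, about 95.5% of the signal amplitude passes through the filter in-band, while under 5% of the amplitude is reflected at the input.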
Keywords: MIMO, Microstrip Bandpass Filter, Wireless and Computer Networks, IEEE 802.11b, Bit Error Rate
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: IMAGE STEGANOGRAPHY BASED THE BEHAVIOR OF PARTICLE SWARM OPTIMIZATION
Author: INAAM RABAH MOHAMMAD, DR. ZIYAD TARIQ MUSTAFA AL TAI
Abstract:
The growing possibilities of modern communications and the explosive growth of information technology demand special means of security. Image steganography is one class of such security. In the modern world, new techniques such as swarm intelligence appear continuously. This work therefore presents a security enhancement for LSB image steganography using the behavior of the PSO algorithm. Specifically, the SPSO algorithm represents the standard behavior of PSO, QPSO represents its quantum behavior, and HPSO its human behavior. In this paper, three image steganography techniques are proposed using the SPSO, QPSO, and HPSO algorithms. These algorithms are used to determine the best locations among the cover-image pixels in which to embed the secret text message using the LSB technique. The paper thus presents a comparison and evaluation of these types of PSO algorithms in the field of image steganography. Experimental results for hiding text messages of different sizes in four BMP cover images of different sizes show that the proposed image steganography using QPSO performs best: the PSNR of the QPSO stego-covers is 81.709 dB, that of the HPSO stego-covers is 81.143 dB, and that of the SPSO stego-covers is 81.012 dB. Hence, the proposed method using the quantum behavior of PSO yields the best performance from a steganographic point of view.
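The LSB embedding step itself is independent of which PSO variant picks the pixel locations; a minimal sketch, with an invented 8-pixel "image", message, and location list standing in for PSO-selected positions:

```python
def embed_lsb(pixels, bits, locations):
    """Write one message bit into the least significant bit of each chosen pixel."""
    out = list(pixels)
    for bit, loc in zip(bits, locations):
        out[loc] = (out[loc] & ~1) | bit
    return out

def extract_lsb(pixels, locations):
    return [pixels[loc] & 1 for loc in locations]

cover = [120, 53, 201, 88, 14, 77, 250, 33]   # toy 8-pixel "image"
message_bits = [1, 0, 1, 1]
locations = [6, 1, 4, 3]   # stand-in for positions a PSO variant might select

stego = embed_lsb(cover, message_bits, locations)
recovered = extract_lsb(stego, locations)
```

Because each embedded bit changes a pixel by at most 1, the distortion stays small, which is why the reported PSNR values are all above 81 dB.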
Keywords: PSO, SPSO, QPSO, HPSO, LSB
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: SEARCHING OVER ENCRYPTED SHARED DATA VIA CLOUD DATA STORAGE
Author: SAMEEH ABDULGHAFOUR JASSIM, WALEED KAREEM AWAD
Abstract:
Cloud computing has developed from various technologies such as autonomic computing, virtualization, and grid computing, and secure storage is essential for it because it provides virtualized resources over the Internet. A data owner must therefore encrypt his documents locally before uploading them to public cloud storage to prevent unauthorized access to his data. Sometimes the data owner wants to share some of his encrypted documents stored in the cloud with other authorized users; he must then send the secret key of each document to every authorized user, but this approach has many limitations due to the difficulty of key management and key distribution. To overcome this drawback, we propose a system that generates a single key for multiple documents and users, relying on two techniques: asymmetric and symmetric cryptography. The asymmetric part uses the identity-based cryptography (IBC) of the data owner to generate his private key and splits that key into two parts: one part is given to all authorized users, and the other is sent to a Semi-Trusted Third Party (STTP). The symmetric part combines the secret key with the encrypted file's properties, and the result is decrypted with the data owner's public key using asymmetric cryptography (the RSA algorithm). Finally, many results were obtained from implementing the proposed system; among them, the data owner can add or revoke any user without changing the master secret key, and the data owner does not need to share multiple keys with authorized users. The system also overcomes the difficulty of searching over encrypted data through the encryption key in a public cloud.
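The key-splitting step can be illustrated with a simple two-share XOR scheme (our stand-in for exposition, not necessarily the paper's construction): either share alone is uniformly random and reveals nothing, but combining both recovers the private key:

```python
import secrets

def split_key(key: bytes):
    """Split a key into two XOR shares: either share alone is uniformly random,
    but XOR-ing the two shares recovers the key."""
    share1 = secrets.token_bytes(len(key))              # part held by authorized users
    share2 = bytes(k ^ s for k, s in zip(key, share1))  # part held by the STTP
    return share1, share2

def recombine(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

private_key = bytes(range(16))   # toy 16-byte "private key" for illustration
user_part, sttp_part = split_key(private_key)
```

This captures why neither the users nor the STTP alone can decrypt: both parts must cooperate to reconstruct the key.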
Keywords: Cloud Computing, Cryptography, Revocation, Virtualization, Homomorphic
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: THE EFFECT OF ATTRIBUTE DIVERSITY IN THE COVARIANCE MATRIX ON THE MAGNITUDE OF THE RADIUS PARAMETER IN FUZZY SUBTRACTIVE CLUSTERING
Author: MARJI, SAMINGUN HANDOYO, IMAM N. PURWANTO, M. YUYUD ANIZAR
Abstract:
The Fuzzy Subtractive Clustering (FSC) method is applied in many fields because it can produce optimal clusters without requiring prior information on the number of groups, as the k-means method does. However, the FSC method has a radius parameter that plays a vital role in generating optimal clusters. The magnitude of this radius parameter is hypothesized to be influenced by the variability of the dataset's covariance matrix. This study investigates the radius values that yield optimal clusters on three datasets whose covariance matrices exhibit high variability (dataset 1), moderate variability (dataset 2), and low variability (dataset 3). In the clustering process, the squash factor and accept ratio parameters are held constant, while the radius parameter is the variable that determines whether optimal clusters are achieved. Clustering results are considered optimal based on two criteria: each cluster consists of at least 2 members, and the clustering produces the smallest Ctm value. The results of this study recommend that, before clustering with FSC, the covariance matrix of the standardized dataset should be computed first. If the covariance matrix has high variability, the radius value used should be close to 1; for moderate variability, a radius of about 0.5; and for low variability, a radius near 0.
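The role of the radius comes from subtractive clustering's potential (mountain) function, in which each point accumulates Gaussian contributions whose spread is set by the radius; a minimal 1-D sketch on invented data:

```python
import math

def potentials(points, radius):
    """Subtractive-clustering potential of each point:
    P_i = sum_j exp(-4 * (x_i - x_j)**2 / radius**2).
    A larger radius lets distant points contribute, favouring fewer clusters."""
    alpha = 4.0 / radius ** 2
    return [sum(math.exp(-alpha * (xi - xj) ** 2) for xj in points)
            for xi in points]

# Invented 1-D data with two obvious groups; the middle of a group wins.
data = [0.0, 0.1, 0.2, 5.0, 5.1]
p_small = potentials(data, radius=0.5)
first_center = data[max(range(len(data)), key=lambda i: p_small[i])]
```

The point with the highest potential becomes the first cluster center; the full algorithm then subtracts that center's influence and repeats, with the squash factor and accept ratio governing the stopping rule.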
Keywords: Covariance Matrix, Fuzzy Subtractive Clustering, Optimal Cluster, Radius Parameter, Variability
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: AUTOMATIC MACHINE LEARNING TECHNIQUES (AMLT) FOR ARABIC TEXT CLASSIFICATION BASED ON TERM COLLOCATIONS
Author: FEKRY OLAYAH, WASEEM ALROMIMA
Abstract:
Due to the rapid growth in the availability of documents in digital format, retrieving information with the highest accuracy and the lowest error rate is becoming more difficult. Text Classification (TC) has become one of the key techniques for controlling and organizing documents based on their content. Keyword extraction is therefore one of the most important natural language processing applications; it extracts information from a document such as term collocations, which are two or more words that appear together and consistently seem associated. In Arabic, keyword extraction suffers from many problems because of the complexity of Arabic orthography. Moreover, accuracy is affected by the document content and the classification technique used. The need for automatic text classification arises from the large number of electronic documents on the web. This research proposes Automatic Machine Learning Techniques (AMLT) for classifying Arabic documents using term collocations. The collocations are mined from Arabic documents; the extracted term collocations are scored with an association measure and used for feature selection. For this study, we used Arabic documents divided into four categories (economy/business, politics, religion and science). The results of our approach were compared with a full-document approach and a summary-document approach using four classifiers (SVM, NB, J48, and KNN) to determine which classifier is most accurate for Arabic text based on term collocations. The evaluation results show that our proposed approach outperforms the other methods in accuracy.
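A common association measure for scoring bigram collocations is pointwise mutual information (PMI); the abstract does not say which measure the authors used, so the sketch below is only illustrative, with English tokens standing in for Arabic ones and an invented corpus:

```python
import math
from collections import Counter

def pmi_bigrams(tokens, min_count=2):
    """Score adjacent word pairs by pointwise mutual information,
    PMI = log2( P(w1, w2) / (P(w1) * P(w2)) ),
    keeping only pairs seen at least `min_count` times (raw PMI overrates rare pairs)."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n_uni, n_bi = len(tokens), len(tokens) - 1
    return {pair: math.log2((c / n_bi) /
                            ((unigrams[pair[0]] / n_uni) * (unigrams[pair[1]] / n_uni)))
            for pair, c in bigrams.items() if c >= min_count}

tokens = "stock market rises while stock market falls and weather stays calm".split()
scores = pmi_bigrams(tokens)
top = max(scores, key=scores.get)
```

High-scoring pairs such as "stock market" are the collocations that would then serve as classification features.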
Keywords: Arabic Language, Text Classification, Term Collocations, Bi-gram, Machine Learning, Category
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: THE EFFECTS OF INFORMATION QUALITY, SERVICE QUALITY, AND COMPATIBILITY ON CONTINUE USE OF M-LEARNING AMONG STUDENTS IN THE PUBLIC UNIVERSITIES OF IRAQI
Author: SALIH HAJEM GLOOD, RAED MOHAMMED HUSSEIN, WISAM ABDULADHEEM KAMIL
Abstract:
The success of an information system can be determined by whether the technology continues to be used for future benefits. In today's era of technology, Mobile Learning (ML) systems have become an important component of the Information and Communication Technology provided by educational institutions such as universities to facilitate transactions between institutions and their students or lecturers. These systems have gained popularity in developed and developing countries alike. Despite the known benefits of ML to the education community, usage of ML services among students in developing countries, especially in Iraq, is still low. Additionally, there has been no empirical study on the continuous usage of ML services in developing countries such as Iraq. The primary objective of this study is to determine the factors contributing to the continuous usage of ML services among students in Iraqi higher education. The study used a quantitative approach, distributing 600 questionnaires to respondents at the public universities of Iraq. The collected data were analysed using a Structural Equation Model. The findings show that the information quality construct has a larger effect on user satisfaction than the service quality and compatibility constructs.
Keywords: Mobile Learning, Continuous Usage, Evaluation IS Success, Higher Education
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: RECOGNITION OF COMPLEX HUMAN ACTIVITY USING MOBILE PHONES: A SYSTEMATIC LITERATURE REVIEW
Author: MOHAMMED MOBARK, SURIAYATI CHUPRAT, HASLINA SARKAN, MOHD NAZ'RI MAHRIN, NURULHUDA FIRDAUS MOHD AZMI, YAZRIWATI YAHYA
Abstract:
Using mobile phones for Human Activity Recognition (HAR) is very helpful in providing a personalized support system for healthcare management and the general wellbeing of the user. Many published studies have investigated HAR with the help of mobile phones, but these studies mention complex HAR only briefly and do not provide any discussion or comparison of the models used. In our study, we have carried out a systematic review of the models currently used in complex HAR. We performed an automatic search of 4 digital libraries, covering work since 2012, to address the four research questions of our study. We found 11 primary studies after applying the inclusion-exclusion criteria. Further studies need to be carried out in this area, especially to address the trade-off between recognition accuracy and computational load.
Keywords: Complex Activity Recognition, Mobile Phone Devices, Systematic Literature Review, Composite Activity, Interleave Activity, Concurrent Activity
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: DETERMINING THE DOMINANT ATTRIBUTES OF INFORMATION TECHNOLOGY GRADUATES EMPLOYABILITY PREDICTION USING DATA MINING CLASSIFICATION TECHNIQUES
Author: KENO C. PIAD
Abstract:
Recent years show a major increase in studies of prediction and model discovery using various types of educational data mining techniques. Classification is one of several data mining techniques; it has become an interesting topic for researchers because of its accuracy and efficiency in classifying data for knowledge discovery. The purpose of this paper is to predict the employability of IT graduates by determining whether they will land in an IT-related profession or not, based on CMO 53, Series of 2006. The study aims to determine the dominant attributes using supervised data mining algorithms and to compare their accuracy. The classification techniques compared were Naive Bayes, J48, SimpleCart, logistic regression and CHAID. The researcher collected historical data from five years of BSIT graduate profiles (S.Y. 2011 to 2015) from the University Job Placement Office tracer study, combined with the graduates' academic records. The results show 3 significant factors that have a direct effect on IT employability: IT_Core, IT_Professional and Gender.
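To make the comparison concrete, here is a minimal categorical Naive Bayes (one of the algorithms listed above) on an invented toy version of the graduate-profile data; the attribute names, records and labels are all made up for illustration:

```python
import math
from collections import Counter, defaultdict

def train_nb(records, labels):
    """Categorical Naive Bayes with Laplace smoothing; returns a predict function."""
    classes = Counter(labels)
    counts = defaultdict(Counter)   # (attribute index, class) -> value counts
    vocab = defaultdict(set)        # attribute index -> set of seen values
    for rec, y in zip(records, labels):
        for i, v in enumerate(rec):
            counts[(i, y)][v] += 1
            vocab[i].add(v)

    def predict(rec):
        def log_score(y):
            s = math.log(classes[y] / len(labels))   # log prior
            for i, v in enumerate(rec):
                s += math.log((counts[(i, y)][v] + 1) /
                              (classes[y] + len(vocab[i])))   # smoothed likelihood
            return s
        return max(classes, key=log_score)

    return predict

# Invented graduate records: (core-course grade, professional-course grade).
records = [("high", "high"), ("high", "low"), ("low", "low"),
           ("low", "high"), ("high", "high"), ("low", "low")]
labels = ["it", "it", "non-it", "non-it", "it", "non-it"]

predict = train_nb(records, labels)
guess = predict(("high", "high"))
```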
Keywords: Data Mining, Decision Tree, Classification Algorithm, Employability, Prediction, Analytics, Accuracy
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: SENDING IMAGE IN NOISY CHANNEL USING ORTHOGONAL FREQUENCY DIVISION MULTIPLEXING SCHEME
Author: GHASSAN MUSLIM HASSAN, KHAIRUL AZMI ABU BAKAR, MOHD ROSMADI MOKHTAR
Abstract:
Orthogonal frequency division multiplexing (OFDM) is dependable for high-speed data transmission thanks to its robustness to multipath fading, high data rate, and high spectral efficiency. OFDM is a multi-carrier modulation scheme whose major problem is synchronization. In this research, power consumption over a noisy channel was considered when an image was transmitted through an OFDM system. To support this work, several features were tested, such as minimizing complexity with the fast Fourier transform and its inverse (FFT/IFFT). The bandwidth (BW) plays the main role, and its effect on transmission and its relation to the FFT size (Nfft) were studied. The modulation type was also tested to see which is best for image transmission: phase shift keying (PSK) or quadrature amplitude modulation (QAM). In addition, the signal-to-noise ratio (SNR) is one of the performance measures of an OFDM system and one of the factors on which wireless communication depends, together with the bit error rate (BER). Another drawback of the OFDM system is the peak-to-average power ratio (PAPR); therefore, the effect of Nfft on the PAPR was tested in simulations in which additive white Gaussian noise (AWGN) was used in MATLAB.
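The core OFDM step is mapping symbols onto subcarriers with an inverse transform at the transmitter and undoing it with the forward transform at the receiver; a toy noiseless round trip with a naive O(N^2) DFT (MATLAB's fft/ifft play this role in the paper's simulation, and the symbols below are invented):

```python
import cmath

def dft(x, inverse=False):
    """Naive O(N^2) DFT/IDFT; a tiny stand-in for the FFT/IFFT pair."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n))
           for j in range(n)]
    return [v / n for v in out] if inverse else out

# Four QPSK symbols on four subcarriers (invented data); noiseless round trip.
symbols = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
time_signal = dft(symbols, inverse=True)   # transmitter: IDFT maps symbols to a time signal
recovered = dft(time_signal)               # receiver: DFT recovers the symbols
```

A real OFDM chain adds a cyclic prefix, channel noise and equalization on top of this transform pair.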
Keywords: Bit Error Rate, Fast Fourier Transform, Orthogonal Frequency Division Multiplexing, Peak-to-Average Power Ratio, Signal-to-Noise Ratio
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: ANALYSIS AND MODIFICATION OF RICE GOLOMB CODING LOSSLESS COMPRESSION ALGORITHM FOR WIRELESS SENSOR NETWORKS
Author: S. KALAIVANI, DR. C. THARINI
Abstract:
Wireless sensor networks (WSN) are networks constructed from a number of sensor nodes, distributed and connected wirelessly to perform specific applications. These networks are strictly restricted in energy usage as they run on batteries with a finite amount of power. This necessitates data compression at each sensor node in a WSN so as to overcome the resource constraints of the network and increase its lifetime. The functions carried out by each node are sensing, processing, communicating and storing data, among which communication consumes more energy than the others. Data compression is one technique that extends network lifetime by reducing the energy consumed at each node during communication. The proposed work comprises 5 different methods that modify the lossless Rice-Golomb Coding (RGC) compression algorithm with respect to the choice of the tunable parameter, based on preprocessing of the input data. Simulation results for the various modifications of RGC on different datasets in MATLAB are analyzed and compared with respect to root mean square error (RMSE) and saving percentage (SP). The modified lossless compression algorithm EMARGC_D, which achieves a better saving percentage, is applied and executed in real time using NI WSN nodes interfaced with LabVIEW.
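Baseline Rice coding with tunable parameter k (the parameter the proposed modifications choose from the preprocessed data) splits each value into a unary quotient and a k-bit remainder; a self-contained sketch:

```python
def rice_encode(value, k):
    """Rice code: unary quotient (value >> k) followed by k remainder bits."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

def rice_decode(bits, k):
    q = bits.index("0")                     # unary part ends at the first 0
    r = int(bits[q + 1 : q + 1 + k], 2)
    return (q << k) | r

code = rice_encode(19, k=3)   # 19 = 2*8 + 3 -> unary "110" + remainder "011"
```

Choosing k well matters: too small and the unary part explodes for large samples; too large and every sample pays k remainder bits, which is exactly why the tunable parameter drives the saving percentage.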
Keywords: Data Compression, LabVIEW, MATLAB, NI 3202 Sensing Node, NI 9792 Gateway Node, Rice Golomb Coding, Wireless Sensor Networks
Source: Journal of Theoretical and Applied Information Technology, 30th June 2018, Vol. 96, No. 12

Title: ENHANCEMENTS FOR CROWDSOURCED REQUIREMENTS ENGINEERING
Author: SULTAN ALYAHYA, WEJDAN ALOHALI, SAMEERAH AL-BALHARETH
Abstract:
The use of crowdsourcing in software requirements engineering has become common nowadays. The web-based services offered through Crowdsourced Requirements Engineering (CRE) platforms help customers obtain timely and accurate requirements specifications for their proposed tasks and projects. Currently, the literature lacks a critical assessment of the key activities involved in CRE platforms. In this paper, we review the process used in CRE platforms, including identifying the workflow used to manage it. We then take a step further toward improving the current platforms: the review is used to identify a set of limitations in the current process, and these lead us to propose enhancements. The enhancements are evaluated using two techniques: a questionnaire and a workshop. The questionnaire shows that the enhancements are sound and practical to add to CRE platforms, while the workshop evaluation showed that the participants were satisfied with the enhancements but asked for further modifications.
Keywords: |
Crowdsourced Requirements Engineering, Crowdsourcing, Requirements Engineering,
Enhancements, Platforms. |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
|
Title: |
A MEASUREMENT MODEL OF THE FUNCTIONAL SIZE OF SOFTWARE MAINTAINABILITY
REQUIREMENTS |
Author: |
KHALED ALMAKADMEH, KHALID T. AL-SARAYREH, KENZA MERIDJI |
Abstract: |
The European ECSS-E-40 standard for the aerospace industry includes
maintainability as one of sixteen non-functional requirements for embedded and
real-time software. Software maintainability requirements are measured both
internally and externally. According to the ECSS European standards,
maintainability requirements are apportioned to set maintainability requirements
for lower-level products that conform to the maintenance concept and
maintainability requirements of the system, and the maintainability analysis
shall identify the maintainability-critical items. This paper proposes a new
measurement model of the functional size of software maintainability
requirements. The functional size of the maintainability requirements is
measured using the concepts of the ISO 19761 COSMIC standard at an early phase
of the software development life cycle, and is used as one of the primary inputs
for the software effort estimation process. Further, this paper presents the
design of a software standard etalon to help develop software products more
effectively. An experiment is conducted to verify the applicability of the
proposed measurement model by measuring the functional size of the requirements
specification of an online library software system. |
Keywords: |
Maintainability Requirements, ISO19761, ECSS standards, Measurement method |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
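COSMIC (ISO/IEC 19761) sizing, which the paper above applies to maintainability requirements, counts data movements: each Entry (E), Exit (X), Read (R) and Write (W) in a functional process contributes one COSMIC Function Point (CFP). A minimal illustration, with hypothetical functional processes from an online library system:

```python
# Each functional process is a list of its data movements;
# under COSMIC, every E/X/R/W movement is worth 1 CFP.
def cosmic_size(processes):
    return sum(len(movements) for movements in processes.values())

# Hypothetical example: "borrow" has Entry, Read, Write, Exit (4 CFP);
# "search" has Entry, Read, Exit (3 CFP) -> total 7 CFP.
library = {"borrow": ["E", "R", "W", "X"], "search": ["E", "R", "X"]}
```

The paper's contribution lies in deciding which maintainability requirements map to such movements; the counting rule itself is standard COSMIC.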
|
Title: |
DEVELOPING (UTAUT 2) MODEL OF ADOPTION MOBILE HEALTH APPLICATION IN JORDAN
E-GOVERNMENT |
Author: |
MALIK BADER ALAZZAM , YASSER MOHAMMAD AL SHARO , MAJED KAMEL AL AZZAM |
Abstract: |
Mobile Health Applications (MHAs) positively influence the quality of hospital
services and lessen health costs. Lack of education and awareness may hinder the
use of the MHA system. Through the Unified Theory of Acceptance and Use of
Technology (UTAUT) framework, this research scrutinized the elements that
influence MHA acceptance. The UTAUT model is a new tool for evaluating the
integration and adoption of MHAs. Through empirical studies and the use of the
UTAUT2 model, we aim to clarify the behavior of healthcare specialists and their
propensity to use the MHA system. The objective of the research is to test the
trust-in-information elements that influence integration and adoption of the MHA
by healthcare providers. This research focuses on Jordanian hospitals that use
the MHA. The study further aims to create a well-defined view of which factors
influence the adoption and acceptance of the MHA, using questionnaires as a
starting point for in-depth and comprehensive future studies. Methods: surveys
were distributed through Jordanian hospitals to gather data from healthcare
providers, specialists and professionals who were familiar with the MHA system.
A total of 278 responses were obtained from 555 survey forms for data
analysis. |
Keywords: |
MHAs, Healthcare Professionals, UTAUT1, UTAUT2, Acceptance, Trust In
Information, E-Government |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
|
Title: |
MEAT FRESHNESS IDENTIFICATION SYSTEM USING GAS SENSOR ARRAY AND COLOR SENSOR IN
CONJUNCTION WITH NEURAL NETWORK PATTERN RECOGNITION |
Author: |
MUHAMMAD RIVAI, FAJAR BUDIMAN, DJOKO PURWANTO, JOSHWA SIMAMORA |
Abstract: |
Meat freshness level is an important factor in determining meat quality for
consumption. In this research, a sensor system has been designed to identify the
freshness level of meat in a fast, precise and non-destructive manner. The
system is implemented on a Raspberry Pi equipped with gas and color sensors as
the freshness identification tools, replacing human olfaction and vision in
determining meat freshness. Pattern recognition powered by a neural network is
used to identify the meat's freshness. The neural network inputs are the odors
sensed by the gas sensor array of MQ-136, MQ-137 and TGS 2620 sensors, and the
Red, Green and Blue values sensed by a TCS 3200 color sensor. Three levels of
freshness have been tested, namely fresh meat, half-rotten meat, and rotten
meat. Using the three gas sensors and one color sensor, the system is capable
of acquiring a distinct pattern for each of the three categories of freshness.
The freshness identification of the meat achieves a success rate of up to 80%.
The errors are caused by the small difference between the patterns sensed for
half-rotten and rotten meat; fortunately, neither of these two kinds of meat is
consumable. Thus, it may be concluded that the system has a 100% success rate
in distinguishing fresh meat from non-fresh meat. The implementation of the
system is expected to replace traditional measurement by the human senses (i.e.
nose and eyes), since different human examiners obtain different results, and to
eliminate the examiner's exposure to bacteria or viruses from the meat. It may
also replace measurement systems that use chemical substances, so that the
tested meat remains consumable. |
Keywords: |
Color Sensor, Gas Sensor, Meat Freshness, Neural Network, Raspberry Pi. |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
|
Title: |
AN EFFECTIVE INTRUSION DETECTION MODEL BASED ON SVM WITH FEATURE SELECTION AND
PARAMETERS OPTIMIZATION |
Author: |
EL MOSTAPHA CHAKIR, MOHAMED MOUGHIT, YOUNESS IDRISSI KHAMLICHI |
Abstract: |
With the growth of the internet, network attacks have increased substantially
in the last few years. Therefore, Intrusion Detection Systems (IDSs) have
become a necessary addition to the information security of most organizations.
An IDS monitors a network or a single host, looking for suspicious activities
and reporting them. Much intrusion detection research has focused on feature
selection, because some characteristics are irrelevant or redundant, which
results in a lengthy detection process and degrades the performance of the IDS.
For this purpose, we use in this work an algorithm based on the Information
Gain technique. This algorithm selects an optimal number of features from the
NSL-KDD dataset. In addition, we combine the feature selection with a machine
learning technique, the Support Vector Machine (SVM) with a radial basis
function (RBF) kernel, and a Particle Swarm Optimization algorithm to optimize
the parameters of the SVM for effective classification of the dataset. We also
compare the proposed method with other methods. Tests on the NSL-KDD dataset
show that our proposed method can reduce the number of features and obtain good
results in terms of accuracy, attack detection rate and false positive rate,
even for unknown attacks. |
Keywords: |
Intrusion Detection System, NSL-KDD, Feature selection, PSO, SVM, Information
Gain. |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
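The Information Gain ranking used above for feature selection can be sketched as follows. This is the standard textbook definition, IG(class; feature) = H(class) − H(class | feature), not the authors' exact ranking procedure on NSL-KDD:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(feature_values, labels):
    """IG = H(labels) - sum over values v of p(v) * H(labels | feature == v)."""
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [y for f, y in zip(feature_values, labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond
```

Features scoring near zero carry no information about the class (e.g. normal vs. attack) and can be dropped before training the SVM.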
|
Title: |
A GESTURE RECOGNITION SYSTEM FOR GESTURE CONTROL ON INTERNET OF THINGS SERVICES |
Author: |
TALAL H. NOOR |
Abstract: |
The Internet of Things (IoT) is a promising computing model that uses several
enabling technologies to provide new types of smart services, allowing users to
interact with everyday objects in a different way. Most IoT services are
invoked using touch screens and connect to smartphones or tablets to enable
real-time connectivity between users and things. However, only a few IoT
systems consider gesture-based user interaction for their services, which would
allow users to have a better experience with IoT products. In this work, we
present a gesture recognition system for gesture control of IoT services. In
particular, the gesture recognition system is based on alphabet characters and
numbers, and classifies the hand fingertip trajectory using a hidden Markov
model. The presented system is composed of three key stages. First, color
information combined with 3D depth information segments the correct position of
the hands; the fingertip is then detected based on the curvature of the hand
contour. Second, dynamic features in polar coordinates are extracted using the
k-means clustering technique. Finally, the Baum-Welch procedure is used to
carry out gesture learning and the Viterbi algorithm is used to recognize the
gesture. The experiments demonstrate that the presented system is capable of
classifying isolated gestures despite spatio-temporal variability. Precisely,
our experiments provide promising results, with average recognition rates of
98.61% and 93.06% for the training and testing datasets, respectively.
Furthermore, the system provides good performance and low computational cost
when image sequence samples of complex scenes are used. |
Keywords: |
Internet of things' services, gesture recognition, gesture control, color
information, depth map, hidden Markov model. |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
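The Viterbi decoding step of the HMM pipeline above can be sketched generically. The model parameters below (initial distribution pi, transition matrix A, emission matrix B over discrete symbols such as the clustered polar-coordinate features) are hypothetical; this is standard log-domain Viterbi, not the paper's trained models:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for a discrete observation sequence."""
    pi, A, B = map(np.asarray, (pi, A, B))
    T, N = len(obs), len(pi)
    logd = np.log(pi) + np.log(B[:, obs[0]])       # best log-prob per state
    back = np.zeros((T, N), dtype=int)             # backpointers
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)         # scores[i, j]: i -> j
        back[t] = scores.argmax(axis=0)
        logd = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):                  # trace backpointers
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

In a gesture recognizer one typically trains one HMM per gesture class with Baum-Welch and picks the class whose model gives the highest sequence likelihood; Viterbi additionally recovers the state alignment.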
|
Title: |
AN ANALYSIS OF TEXT MINING FACTORS ENHANCING THE IDENTIFICATION OF RELEVANT
STUDIES |
Author: |
MOUAYAD KHASHFEH, MOAMIN A. MAHMOUD, MOHD SHARIFUDDIN AHMAD |
Abstract: |
The development of science and the spread of knowledge coincide with a growing
number of publications, and the volume of online content continues to grow at a
rapid rate. For some submitted queries, search engines may return thousands of
documents of questionable relevancy. In this paper, we analyze the literature
and identify the text mining factors that influence the identification of
relevant studies. Five factors are identified: text typography, paragraph
length, term frequency, coordination, and strict search. Subsequently, we
propose an agent-based text mining model that facilitates the identification of
relevant studies in big databases. The model consists of four components:
interface, search process, parsing process, and storage. The interface provides
a means of communication between a user and his or her counterpart agent
(Personal Agent); in addition, it provides an input tool for the user's search
preferences. The second component is the search process, which is operated by
pattern matching. The third is the parsing process, operated by a text mining
algorithm. The last is the storage, which is managed by a Monitor Agent. The
proposed framework would be useful in providing an alternative means of
searching for highly relevant studies in large databases. |
Keywords: |
Text Mining, Agent-based Model, Relevant Studies |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
|
Title: |
AUTOMATIC QUESTION GENERATION FOR 5W-1H OPEN DOMAIN OF INDONESIAN QUESTIONS BY
USING SYNTACTICAL TEMPLATE-BASED FEATURES FROM ACADEMIC TEXTBOOKS |
Author: |
SETIO BASUKI, SELVIA FERDIANA KUSUMA |
Abstract: |
Education quality in schools can be measured by delivering examinations to
students. Composing examination questions that measure students' achievement in
the school teaching and learning process can be difficult and time consuming.
To solve this problem, this research proposes an Automatic Question Generation
(AQG) method to generate open-domain Indonesian questions using a syntactic
approach. Open-domain questions are questions covering many domains of
knowledge. The challenge of generating the questions is how to identify the
types of declarative sentences that can potentially be transformed into
questions, and how to develop a method for generating questions automatically.
In realizing the method, this research incorporates four stages, namely: the
identification of declarative sentences for 8 coarse-class and 19 fine-class
sentence types; the classification of features for coarse-class sentences and
the classification rules for fine-class sentences; the identification of
question patterns; and the extraction of sentence components together with the
rule-based generation of questions. The coarse-class classification was carried
out using machine learning with syntactic features of the sentence, namely:
Part-of-Speech (POS) tags, the presence of punctuation, the availability of
specific verbs, the sequence of words, etc. The fine-class classification was
carried out based on a set of rules. The experimental findings show that the
accuracy of coarse-class classification reaches 83.26% using the SMO classifier
and the accuracy of the proposed fine-class classification reaches 92%. The
generated questions are categorized into three types, namely: TRUE,
UNDERSTANDABLE, and FALSE. The accuracy of generated TRUE and UNDERSTANDABLE
questions reaches 88.66%. Thus, the obtained results show that the proposed
method is promising for implementation in real settings. |
Keywords: |
Automatic Question Generation (AQG), Coarse-class Classification, Fine-class
Classification, Open Domain Question, Syntactical Approach |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
|
Title: |
ENHANCING DBPEDIA QUALITY USING MARKOV LOGIC NETWORKS |
Author: |
MAHMOUD ALI, MOHAMMED ALCHAITA |
Abstract: |
Linked data on the Web may be incomplete because it is extracted from
semi-structured sources such as Wikipedia, or from unstructured sources such as
text. Various approaches aim to complete the missing data in linked data sets,
including the use of statistical distributions of properties and types for
enhancing the quality of incomplete and noisy Linked Data sets, which has
obtained good results. In this study, we suggest using Markov logic networks to
improve the quality of unstructured Linked Data sets without using any external
knowledge. Markov Logic Networks (MLNs) are among the best-known methods in the
field of Statistical Relational Learning (SRL): an MLN generalizes first-order
logic as a first-order knowledge base in which a weight is attached to each
formula. We rely on RDF(S) and its associated entailment rules, which provide a
data representation model. We carry out reasoning by transforming the
statements and constraints into Markov logic and computing the most probable
consistent state with respect to the defined constraints. Results showed that
the proposed algorithm could infer correct types that the SDType algorithm
could not. The results also improved markedly as the number of steps
increased. |
Keywords: |
DBpedia, Type Completion, Markov Logic Network, Knowledge Graph, First Order
Logic. |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
|
Title: |
IMPLICATIONS OF PRIVACY PRESERVING K-MEANS CLUSTERING OVER OUTSOURCED DATA ON
CLOUD PLATFORM |
Author: |
ANURAG, DEEPAK ARORA, UPENDRA KUMAR |
Abstract: |
Data mining has gained attention nowadays in fields such as sales, marketing,
insurance and healthcare. Organizations aspire to perform mining operations on
their joint datasets to gain trade benefits while hiding their own sensitive
information. Owing to the huge resource consumption involved and their limited
computational power, they often prefer to outsource the entire computation to a
cloud platform. As there is a risk of exposing an organization's sensitive data
to the various untrusted parties involved, privacy becomes one of the major
challenges in cloud computing. The authors propose an algorithm in which the
cloud server applies k-means clustering to encrypted data sets. A trusted party
is assumed for key distribution and management. Computations between the
parties are performed either mutually or via the Trusted Authority, and involve
the exchange of sensitive data between the participating parties. The
complexity of the algorithm has been analyzed and compared with the existing
approach; it was found to depend linearly on various parameter settings, and
the algorithm is therefore a better approach for maintaining authenticity and
data confidentiality between the participating parties during the mining
process. |
Keywords: |
Privacy Preserving Data Mining, Paillier Homomorphic Encryption, K-Means
Clustering, Cloud Platform, Use Case Diagram. |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
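The Paillier cryptosystem named in the keywords above is additively homomorphic: E(a)·E(b) mod n² decrypts to a+b, which is what lets an untrusted cloud server sum encrypted coordinates when updating k-means centroids without seeing plaintext values. A textbook sketch with deliberately tiny primes (illustration only, not the paper's protocol or a secure parameter choice):

```python
from math import gcd
import random

def keygen(p=293, q=433):
    """Toy Paillier keypair with g = n + 1 and lambda = lcm(p-1, q-1)."""
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)
    g = n + 1
    mu = pow((pow(g, lam, n * n) - 1) // n, -1, n)  # inverse of L(g^lam mod n^2)
    return (n, g), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while gcd(r, n) != 1:                            # r must be a unit mod n
        r = random.randrange(1, n)
    return pow(g, m, n * n) * pow(r, n, n * n) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    return (pow(c, lam, n * n) - 1) // n * mu % n    # L(c^lam mod n^2) * mu mod n
```

The homomorphic sum E(a)·E(b) is what each clustering round outsources; real deployments use primes of 1024+ bits.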
|
Title: |
EVALUATION OF EWOM APPLICATION IN MUSEUMS IN JAKARTA |
Author: |
ARTA MORO SUNDJAJA, SEVENPRI CANDRA, AMALIA RUSADI |
Abstract: |
This study aims to evaluate and assess the implementation of electronic word of
mouth (eWOM) in several museums in Jakarta through their Facebook Fan Pages.
The study used a qualitative approach to explore the phenomenon of public
awareness of museums. The population comprises the Facebook Fan Pages of
museums in Jakarta, and the sampling technique used is disproportionate
stratified random sampling. Data were collected through Facebook observation
and analyzed using the eWOM Analytical Framework developed by Andrea Hausmann.
The findings show that Museum Basoeki Abdullah actively manages its Facebook
account and updates it regularly. The content presented is attractive, varied,
and interesting, with broad use of media such as images and video content.
Museum Basoeki Abdullah also stimulates discussion and interaction through
competitions, thus inviting Facebook user engagement and active participation
from followers, which makes the page look more attractive in general. The
conclusions obtained from this study suggest that not all museums utilize their
Facebook Fan Pages to develop electronic word of mouth. There is a need for the
identification and development of best practice in this domain so that museum
management can apply electronic word of mouth. The implementation of best
practice can attract people to visit museums and provides an appropriate
benchmark for assessing and comparing their performance. |
Keywords: |
Electronic Word Of Mouth; Hausmann Analytical Framework; Museum Awareness;
Facebook; Social Media Marketing. |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
|
Title: |
SMS SPAM DETECTION USING ASSOCIATION RULE MINING BASED ON SMS STRUCTURAL
FEATURES |
Author: |
NOOR GHAZI M. JAMEEL |
Abstract: |
The popularity of mobile phones has led to an increase in SMS messaging. SMS
messages are considered a rapid way of communication due to their low cost and
ease of use. As a result, SMS has become a target for many types of threats,
one of which is spamming. SMS spam consists of unwanted messages sent to many
mobile phone users, causing problems such as annoyance and consumption of
mobile network bandwidth, as well as real threats such as scams, theft of
personal information and malware installation. In this paper, a new system is
proposed to detect spam SMSs using the Apriori algorithm. The algorithm is used
to generate association rules, which are applied to new SMSs to classify them
as spam or legitimate. The system uses structural features only, instead of
textual features or tokens. These features are extracted from two publicly
available datasets consisting of spam and legitimate SMSs. The rules are
generated with different minimum support and minimum confidence values and
applied to the test dataset, and the rules that achieve the highest accuracy
are used in the proposed system. The aim of this work is to present a new and
fast approach to detecting spam SMS using structural features only, and to find
out whether structural features are sufficient to detect spam SMSs instead of a
bag of words, which depends on preprocessing and involves many steps such as
parsing, tokenization, stop-word removal and stemming. A good accuracy of
97.65% was achieved using rules generated by the Apriori association rule
mining algorithm with minimum support 0.2 and minimum confidence 0.8, based on
SMS structural features only. |
Keywords: |
SMS, SMS Spam, Association Rule Mining, Apriori Algorithm |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
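The support/confidence rule mining described above can be sketched with a simplified miner. This is not full level-wise Apriori (antecedents are enumerated directly up to size 2, with no candidate pruning), the consequent is restricted to the class label as in spam/ham classification, and the feature names are hypothetical stand-ins for the paper's structural features:

```python
from itertools import combinations

def mine_rules(transactions, min_sup=0.2, min_conf=0.8):
    """Return rules (antecedent, class_label, support) whose support and
    confidence meet the thresholds. Each transaction is a set of
    (feature, value) items, one of which is the ("class", ...) label."""
    n = len(transactions)
    sup = lambda items: sum(items <= t for t in transactions) / n
    all_items = set().union(*transactions)
    labels = {i for i in all_items if i[0] == "class"}
    rules = []
    for size in (1, 2):                       # antecedents of size 1 and 2
        for ante in combinations(sorted(all_items - labels), size):
            for label in labels:
                s = sup(set(ante) | {label})  # support of the whole rule
                if s >= min_sup and s / max(sup(set(ante)), 1e-12) >= min_conf:
                    rules.append((set(ante), label, s))
    return rules
```

At classification time, a new message's structural features are matched against the mined antecedents and the matching rule's class label is assigned.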
|
Title: |
RTDBSTREAM: A REAL-TIME DENSITY-BASED CLUSTERING FOR EVOLVING DATA STREAMS |
Author: |
K. SHYAM SUNDER REDDY, C. SHOBA BINDU |
Abstract: |
Density-based clustering has emerged as a prominent class of methods for
clustering data streams. It has the ability to discover clusters of arbitrary
shape and can handle noise in the data. Recently, several density-based
clustering algorithms have been proposed in the literature for clustering data
streams, but each has its own limitations that render it ineffective and make a
new algorithm necessary for dealing with big data. Existing density-based
clustering algorithms require high computation time and large amounts of memory
for the clustering process. In this paper, we present a novel density-based
clustering algorithm called Real-Time Density-Based Clustering (RTDBStream)
for evolving data streams. This hybrid algorithm integrates the advantages of
density-grid and density micro-clustering algorithms to obtain better results.
The quality of the proposed algorithm is evaluated on various data sets with
distinct characteristics using different quality metrics. |
Keywords: |
Big data, Data stream, Density-based clustering, Grid-based clustering,
Micro-clustering |
Source: |
Journal of Theoretical and Applied Information Technology
30th June 2018 -- Vol. 96. No. 12 -- 2018 |
Full
Text |
|
|
|