|
Submit Paper / Call for Papers
The journal receives papers in continuous flow and considers articles from a
wide range of Information Technology disciplines, encompassing the most basic
research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF, or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please remember to include all your personally identifiable information in the
manuscript before submitting it for review, and we will edit out the necessary
information at our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
|
|
|
Journal of Theoretical and Applied Information Technology
April 2016 | Vol. 86 No. 2 |
Title: |
A DIFFERENTIAL EVOLUTION ALGORITHM PARALLEL IMPLEMENTATION IN A GPU |
Author: |
LAGUNA-SANCHEZ G. A., OLGUIN-CARBAJAL M., CRUZ-CORTES N., BARRON-FERNANDEZ R.,
CADENA MARTINEZ R. |
Abstract: |
The computational power of a Graphics Processing Unit (GPU), relative to a
single CPU, presents a promising alternative for writing parallel code in an
efficient and economical way. The Differential Evolution (DE) algorithm is a
bio-inspired heuristic for global optimization. DE offers good performance and
low computational complexity, and requires few parameters. This article
presents a parallel implementation of this population-based heuristic on an
NVIDIA GPU device with multi-thread support, using CUDA as the parallel
programming model. Our goal is to give some insights into GPU parallel
programming through a simple and almost straightforward parallel code, and to
compare the performance of the DE algorithm running on a multithreading GPU.
This work shows that with a parallel code and an NVIDIA GPU, not only is the
execution time reduced but the convergence behavior toward the global optimum
may also change significantly with respect to the original sequential code. |
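The DE scheme the abstract summarizes can be sketched sequentially; in the GPU
version, the per-individual inner loop is what gets mapped to one CUDA thread
per individual. A minimal sketch of the common DE/rand/1/bin variant, with
illustrative parameters and a toy objective (not the paper's code):

```python
import random

def de_optimize(f, dim=5, pop_size=30, F=0.8, CR=0.9, gens=200, seed=1):
    """Minimal DE/rand/1/bin: mutate with a scaled difference vector,
    binomially cross over, keep the trial if it is no worse."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):          # on a GPU: one thread per i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            tc = f(trial)
            if tc <= cost[i]:              # greedy selection
                pop[i], cost[i] = trial, tc
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

sphere = lambda x: sum(v * v for v in x)   # toy objective
xbest, fbest = de_optimize(sphere)
```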
Keywords: |
Multithreading, Parallel Programming, GPU, Differential Evolution And Fine
Grain. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
OPTIMAL SIZING OF GRID-CONNECTED PV-WIND SYSTEM CASE STUDY: AGRICULTURAL FARM IN
MOROCCO |
Author: |
Mohammed BOUSSETTA, Rachid ELBACHTIRI, Karima ELHAMMOUMI, Maha KHANFARA |
Abstract: |
Faced with the reality of limited fossil fuel reserves and the issue of
greenhouse gases, agriculture, like other economic sectors, must develop
production systems that are efficient in their energy inputs. Within this
context, this paper focuses on the problem of optimizing and sizing a
grid-connected power generation system, built from a combination of renewable
energy sources (RES), to supply power to a 37-hectare agricultural farm and a
cold room for the conservation of summer fruit in the region of Sefrou,
Morocco. An economic study of hybrid electricity generation at the site was
undertaken, with the aim of minimizing the net present cost (NPC), which is the
sum of the initial investment and the net present value of all operation and
maintenance costs over the estimated life of the project, which is 25 years. An
environmental analysis was also carried out; emissions and the renewable energy
generation fraction (RF) are calculated as the main environmental indicators.
Simulation results show that a hybrid energy system (HES) comprising one 100 kW
wind turbine, 15 kW of PV arrays, and 40 kW of power converters connected to
the grid is the best option to meet the load demand considered in this study,
producing 323,815 kWh/yr, of which 81% of the electricity comes from RES and
the remainder is provided by the grid; the HES also reduces emissions by about
51,000 kg per year. In addition, a sensitivity analysis was performed
considering four average values of wind speed. |
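The NPC objective described above reduces to discounting operation-and-
maintenance costs over the 25-year lifetime; a minimal sketch with hypothetical
cost figures (the real values come from the HOMER-style simulation):

```python
def net_present_cost(capital, annual_om, discount_rate, years):
    """NPC = initial investment + present value of O&M over the lifetime."""
    pv_om = sum(annual_om / (1 + discount_rate) ** t
                for t in range(1, years + 1))
    return capital + pv_om

# Hypothetical figures for a 100 kW wind + 15 kW PV + 40 kW converter system.
npc = net_present_cost(capital=250_000, annual_om=6_000,
                       discount_rate=0.06, years=25)
```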
Keywords: |
Grid-Connected PV-Wind System; Net Present Cost; Homer; Optimal System;
Agricultural Farm |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
USING SOCIAL MEDIA AND CRM TECHNOLOGY IN E-RECRUITMENT |
Author: |
RAWAN ALMUSA, DR. WAFI ALBALAWI |
Abstract: |
Online recruitment is one of the modern tools adopted by prospective job
seekers and employers to initiate the employment process. The latest
technologies that facilitate the recruitment process branch out from Web 2.0,
such as social media and social customer relationship management (CRM). It is
evident that companies are shifting from traditional CRM to social CRM in order
to better understand customers’ needs and offer optimal employment solutions.
In this paper, the researchers address the role of social CRM in the online
recruitment process by analyzing intelligent solutions that help recruitment
companies satisfy customers’ needs. |
Keywords: |
Recruitment, Social Media, Customer Relationship Management (CRM), Job Seeker,
Employer. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
IMPROVEMENT KEYS OF ADVANCED ENCRYPTION STANDARD (AES) RIJNDAEL_M |
Author: |
MOHANAAD SHAKIR, ASMIDAR BIT ABUBAKAR, YOUNUS BIN YOUSOFF, MUSTEFA SHEKER |
Abstract: |
Rijndael is a specification for the encryption of electronic data that is
considered a collection of ciphers with distinct block and key sizes. This
study aims to develop the key of the Rijndael cipher in order to enhance its
level of confusion and diffusion. Analysis, design, implementation, testing,
and evaluation were applied using the software system development life cycle
(SDLC) model. The results of the study show that adding keys to Rijndael
increases its security level and promotes its wide use in organizations. |
Keywords: |
Information System Security, Cipher Algorithm, AES |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
AN UNSUPERVISED CLASSIFICATION TECHNIQUE FOR RECOGNITION OF SCRATCHED AND
NON-SCRATCHED WORDS IN PRE-PRINTED DOCUMENTS |
Author: |
N. SHOBHA RANI, VASUDEV T., VINEETH P., DEEPTHA AJITH |
Abstract: |
Pre-processing of document images is the factor that varies most from one type
of document image to another. Document images in general require more intensive
pre-processing procedures than other types of images; one such category is
pre-printed form images. Pre-processing of such documents differs from that of
other images containing simple text and free of graphical components. This
paper proposes a generic pre-processing algorithm adaptable to pre-printed
application form images. The work focuses specifically on the problem of
detecting and removing scratched-out words inherent in the text, since these
elements can be interpreted neither by humans nor by machines. The algorithm
exploits features such as Euler’s number, the number of connected components,
and the area covered by holes within a text block to detect scratched-out text
blocks. The algorithm has yielded reasonably good results, with an overall
efficacy of around 96.5%. |
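The three features the algorithm exploits can be computed on a small binarized
text block; the flood-fill labelling below is an illustrative sketch, not the
paper's implementation:

```python
from collections import deque

def components(grid, value, conn8=False):
    """Label connected components of cells equal to `value` via BFS flood
    fill; returns a list of cell sets."""
    h, w = len(grid), len(grid[0])
    seen, comps = set(), []
    steps = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    if conn8:
        steps += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    for sy in range(h):
        for sx in range(w):
            if grid[sy][sx] == value and (sy, sx) not in seen:
                comp, q = set(), deque([(sy, sx)])
                seen.add((sy, sx))
                while q:
                    y, x = q.popleft()
                    comp.add((y, x))
                    for dy, dx in steps:
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and grid[ny][nx] == value
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            q.append((ny, nx))
                comps.append(comp)
    return comps

def block_features(grid):
    fg = components(grid, 1, conn8=True)   # 8-connected foreground strokes
    bg = components(grid, 0)               # 4-connected background regions
    # A background region is a hole if it never touches the block border.
    holes = [c for c in bg
             if not any(y in (0, len(grid) - 1) or x in (0, len(grid[0]) - 1)
                        for y, x in c)]
    euler = len(fg) - len(holes)           # Euler number = components - holes
    hole_area = sum(len(c) for c in holes)
    return euler, len(fg), hole_area

ring = [[1, 1, 1],
        [1, 0, 1],
        [1, 1, 1]]                          # one stroke enclosing one hole
euler, n_components, hole_area = block_features(ring)
```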
Keywords: |
Irrelevant Information, scratched words, non-scratched words, Morphological
Operations, Pre-printed forms, unsupervised learning. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
REDUCED FILE HASH FOOT PRINTS FOR OPTIMIZED DEDUPLICATION IN CLOUD PLATFORMS |
Author: |
M. JYOTHIRMAI, Dr. K. THIRUPATHI RAO |
Abstract: |
Cloud data storage relieves clients of a huge burden with respect to their
local storage, but introduces new issues with respect to duplicate data in the
cloud. Some earlier approaches dealt with cloud security and performance for
de-duplication by properly characterizing the concerned parties in the cloud
and invoking a file-signature identification procedure using a conventional
hash message authentication code (HMAC). With hash algorithms such as SHA-1 and
MD5, however, the file-integrity values are large, introducing latency into the
de-duplication computation; as a result, the storage array must hold the prior
integrity hash codes, leading to performance problems. In this paper, we
propose a Genetic Programming approach to record deduplication that combines
several distinct pieces of evidence extracted from the data content to derive a
deduplication function able to recognize whether two entries in a repository
are duplicates. As shown by our experiments, our approach outperforms an
existing state-of-the-art method from the literature. Moreover, the proposed
functions are computationally less demanding since they use less evidence.
Furthermore, our genetic programming approach can automatically adapt these
functions to a given fixed duplicate-identification boundary, freeing the user
from the burden of selecting and tuning this parameter. |
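The idea of combining pieces of evidence into a deduplication function with a
decision boundary can be sketched; here a fixed weighted sum of token-set
similarities stands in for the evolved GP function, and the fields, weights,
and boundary are hypothetical:

```python
def jaccard(a, b):
    """Token-set similarity: one piece of 'evidence' per record field."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0

def is_duplicate(rec1, rec2, weights, boundary):
    """A candidate deduplication function. In the GP setting described in
    the abstract, the combination of field evidence (here a weighted sum)
    and the boundary would be evolved rather than hand-tuned."""
    score = sum(w * jaccard(rec1[f], rec2[f]) for f, w in weights.items())
    return score >= boundary

# Hypothetical records and weights for illustration only.
r1 = {"name": "J. Smith", "title": "cloud deduplication with hashing"}
r2 = {"name": "John Smith", "title": "deduplication with hashing in cloud"}
dup = is_duplicate(r1, r2, weights={"name": 0.4, "title": 0.6}, boundary=0.5)
```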
Keywords: |
Hybrid Cloud Computing, Cloud Security, SHA, MD5, Message Authentication Codes,
Genetic Programming, Cross-Over Mutation, Similarity Function, And Checksum. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
TOWARDS A SELF-ADAPTIVE AGENT-BASED SIMULATION MODEL |
Author: |
YIM LING LOO, ALICIA Y.C. TANG, AZHANA AHMAD, AIDA MUSTAPHA |
Abstract: |
Agent-based simulation (ABS) modelling has been a widely applied approach for
simulating domain-specific phenomena. Currently, parameters and environments are
simulated by a domain-specific model that is strictly used proprietarily by the
ABS model developer. This causes inflexibility towards extension of the
developed ABS model, which will further result in difficulties for validation
and verification of the robustness and reliability of the ABS model. To address
this issue, this paper proposes a self-adaptive ABS model that is capable of
modelling cross-domain phenomena by selecting the required parameters based on
the environment. The capability to self-adapt will allow the model to be easily
extended and replicated. The self-adapt capability is enabled by a governing
algorithm within the model and is conceptually illustrated through a case study
of crime report process ABS modelling. |
Keywords: |
Self-adaptive Model, Agent-based Simulation Model, Processes Simulation,
Extensive Model, Model Reuse and Replication |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
ONTOLOGY POPULATION FROM QURANIC TRANSLATION TEXTS BASED ON A COMBINATION OF
LINGUISTIC PATTERNS AND ASSOCIATION RULES |
Author: |
TAHER WEAAM, SAIDAH SAAD |
Abstract: |
With the increasing volume of English translations of Islamic documents
available on the web, there is a need to retrieve and extract important
information in order to fully understand these documents. Understanding the
Quran is a grand challenge for society, for western public education, for
Muslim-world education, for knowledge representation and reasoning, and for
knowledge extraction from text. Ontology learning from the Quran text is a
challenging task due to the nature of the text, which has a scattered
organization of knowledge and unique features. This paper illustrates ontology
learning based on a hybrid method that combines lexico-syntactic patterns and
association rules for the English translation of the meaning of the Quran.
First, this paper designs a new two-layer filtering method that combines
linguistic and statistical methods for concept extraction. Second, this work
designs a new hybrid method based on lexico-syntactic patterns and association
rules for relation extraction. The results show that the two-layer extraction
proves to be an adequate and efficient measure for automatic extraction of
Quranic concepts, with an overall F-measure of 85.3%. In addition, the results
obtained indicate that the methods used are very suitable techniques for
extracting relations, with overall F-measures of 87.3% and 88.3%,
respectively. |
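The association-rule side of the hybrid method rests on standard support and
confidence; a minimal pairwise sketch over toy concept co-occurrences (the
corpus and thresholds are illustrative, not the paper's):

```python
from itertools import combinations

def association_rules(transactions, min_support=0.5, min_conf=0.7):
    """Mine pairwise rules X -> Y from co-occurring concepts using the
    classic support/confidence criteria."""
    n = len(transactions)
    items = {i for t in transactions for i in t}
    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / n
    rules = []
    for x, y in combinations(sorted(items), 2):
        for a, b in ((x, y), (y, x)):      # try both rule directions
            s = support({a, b})
            sa = support({a})
            if s >= min_support and sa and s / sa >= min_conf:
                rules.append((a, b, s, s / sa))
    return rules

# Toy "documents" as sets of extracted concepts (hypothetical).
docs = [{"prayer", "charity"}, {"prayer", "charity"},
        {"prayer", "fasting"}, {"charity"}]
rules = association_rules(docs, min_support=0.5, min_conf=0.6)
```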
Keywords: |
Ontology Learning, Statistical Methods, Pattern Extraction, Association Rules,
Quran |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
IMPACT OF MULTI-OBJECTIVE GENETIC PROGRAMMING TREE REPRESENTATIONS ON FEATURE
EXTRACTION AND CLASSIFICATION |
Author: |
KHALED M. BADRAN |
Abstract: |
In this paper three different genetic programming tree representations are used
for classification. Two of them are employed as direct classifiers while the
last one works as a feature extractor before applying a simple threshold
classifier. Each type of representation is discussed along with the
modifications required for applying the evolutionary operators, including tree
generation, crossover, and mutation. The three GP methods are applied to five
real-world datasets from the UCI machine learning repository to verify the
approaches. The performance of the three approaches is compared to determine
the most suitable tree representation for feature extraction and
classification. |
Keywords: |
Genetic Programming, Feature Extraction, Classification, Tree Representation,
Multi-Objective |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
GOMPERTZ BASED SPRT: MLE |
Author: |
Dr. R. Satya Prasad, V. Suryanarayana, Dr. G. Krishna Mohan |
Abstract: |
Sequential analysis from statistical science can be adopted in order to decide
upon the reliability or unreliability of developed software very quickly. The
procedure adopted for this is the Sequential Probability Ratio Test (SPRT),
which is designed for continuous monitoring. The likelihood-based SPRT proposed
by Wald is very general and can be used with many different probability
distributions. The parameters are estimated using Maximum Likelihood Estimation
(MLE). In the present paper, the Gompertz model is applied to five sets of
existing software reliability data and the results are analyzed. |
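Wald's SPRT decision lines are standard: accept H1 once the cumulative
log-likelihood ratio crosses log((1-beta)/alpha), accept H0 once it falls below
log(beta/(1-alpha)). A generic sketch, with an exponential inter-failure-time
likelihood standing in for the paper's Gompertz-based ratio:

```python
import math

def sprt(samples, llr, alpha=0.05, beta=0.05):
    """Wald's SPRT: accumulate the log-likelihood ratio and stop at the
    decision lines A = log((1-beta)/alpha), B = log(beta/(1-alpha))."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    s = 0.0
    for n, x in enumerate(samples, 1):
        s += llr(x)
        if s >= upper:
            return "reject H0", n      # e.g. software deemed unreliable
        if s <= lower:
            return "accept H0", n
    return "continue", len(samples)

# H0: inter-failure times ~ Exp(l0); H1: Exp(l1) with l1 > l0 (more failures).
l0, l1 = 0.5, 2.0
llr = lambda t: math.log(l1 / l0) - (l1 - l0) * t
decision, n = sprt([0.1, 0.2, 0.1, 0.15, 0.05, 0.1, 0.2, 0.1], llr)
```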
Keywords: |
Gompertz, Sequential Probability Ratio Test, MLE, Decision lines, Software
testing, Software failure data. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
BATIK PRODUCTION PROCESS OPTIMIZATION USING PARTICLE SWARM OPTIMIZATION METHOD |
Author: |
INDAH SOESANTI, RAMADONI SYAHPUTRA |
Abstract: |
This paper presents batik production process optimization using the particle
swarm optimization (PSO) method. The process steps of batik are designing,
‘nyanting’, staining, ‘pelorodan’, and washing. Batik production cost is
determined by the efficiency and the output-to-input ratio of the production
process. In this research, the PSO method is used for production process
optimization in the Yogyakarta batik industry. The main objective of this
optimization is to minimize the cost incurred in the batik production process
in order to obtain maximum benefit. The PSO method has successfully optimized
the production goals by minimizing the use of raw materials and production
time. Optimization results show that in the batik production process there is a
raw-material saving of 14.801% and a production-time saving of 10.345%. |
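A minimal global-best PSO loop shows the mechanics; the quadratic objective
below is a toy stand-in for the batik cost model, and all parameters are
illustrative:

```python
import random

def pso(f, dim=2, swarm=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=3):
    """Minimal global-best PSO: each particle is pulled toward its personal
    best and the swarm's global best."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pcost = [f(p) for p in pos]
    g = min(range(swarm), key=pcost.__getitem__)
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = f(pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return gbest, gcost

cost = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2   # toy cost surface
best, best_cost = pso(cost)
```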
Keywords: |
Batik; Production Process; Optimization; Particle Swarm Optimization |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
IDENTIFICATION OF PRINTING PAPER BASED ON TEXTURE USING GABOR FILTERS AND LOCAL
BINARY PATTERNS |
Author: |
SHIHAB HAMAD KHALEEFAH, MOHAMMAD FAIDZUL NASRUDIN |
Abstract: |
There are many causes of deformation in an image, one of which occurs during
its acquisition as a digital image. The deformation takes different forms or
causes different effects on the acquired image compared with the original,
including poor resolution, shear, noise, and variation in intensity. A paper
sheet scanned by a scanner is a good example of possible deformation in images.
Consequently, paper texture identification, or fingerprinting, is one of the
pattern recognition research fields exposed to the deformation problem;
applications such as document authentication are constrained by it. One of the
well-known methods for image texture extraction is the Local Binary Pattern
(LBP) method. However, the LBP method shows a number of drawbacks in paper
image texture extraction, two of which are neglecting some texture information
of the images and failing under some image deformations due to its local view.
In this paper, combinations of Gabor filters and an LBP operator are proposed
to reduce the effects of these drawbacks in the paper fingerprinting domain. We
use self-collected textures from 102 paper images in the test. The images are
acquired at three resolutions, 50 DPI, 100 DPI, and 150 DPI, in order to
demonstrate robust results. The testing results show that the proposed
combinations improve paper image identification accuracy: applying Gabor
filters prior to the LBP method improves the LBP operator’s description and the
fingerprinting accuracy. |
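The basic LBP operator at the heart of the method thresholds each pixel's eight
neighbours at the centre value and reads the result as a byte; histograms of
these codes are then compared (e.g. by chi-square distance). A plain sketch,
without the Gabor pre-filtering stage:

```python
def lbp_code(img, y, x):
    """Classic 8-neighbour LBP: threshold neighbours at the centre value
    and pack the signs into a byte, clockwise from the top-left."""
    c = img[y][x]
    nbrs = [img[y-1][x-1], img[y-1][x], img[y-1][x+1], img[y][x+1],
            img[y+1][x+1], img[y+1][x], img[y+1][x-1], img[y][x-1]]
    code = 0
    for bit, v in enumerate(nbrs):
        if v >= c:
            code |= 1 << bit
    return code

def lbp_histogram(img):
    """256-bin LBP histogram over interior pixels: the texture descriptor
    compared between paper fingerprints."""
    hist = [0] * 256
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            hist[lbp_code(img, y, x)] += 1
    return hist

flat = [[7] * 4 for _ in range(4)]   # uniform patch: every neighbour >= centre
hist = lbp_histogram(flat)
```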
Keywords: |
Pattern recognition, Paper fingerprinting, Local Binary Pattern (LBP), Gabor
Filters(GF), Gabor Filter Local Binary Pattern (GFLBP), Chi square |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
OUTAGE PROBABILITY ANALYSIS OF AMPLIFY-AND-FORWARD AND DECODE-AND-FORWARD DUAL
HOP RELAYING WITH HARDWARE DEFECTS |
Author: |
C.S.PREETHAM, Dr. M SIVA GANGA PRASAD, N. PRATAP REDDY, L. SASHANKA TEJA |
Abstract: |
Hardware defects create distortions in communication systems during both
transmission and reception, which usually reduce communication system
performance and yield misleading results at both ends. The large number of
contributions in the relaying area neglect the hardware defects of both
transmitter and receiver and thus assume ideal hardware. Such assumptions hold
for low-rate systems but are not applicable to high-rate systems. This paper
derives the performance limitations of both the amplify-and-forward and
decode-and-forward protocols in a dual-hop relaying system. We also derive the
outage probability analysis of the effective end-to-end
signal-to-noise-and-distortion ratio (SNDR). This paper considers the defects
at the source, relay, and destination and derives closed-form expressions for
the exact and asymptotic outage probability. |
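The distortion-induced performance limit can be illustrated by Monte Carlo:
under a commonly used impairment model (assumed here, with aggregate distortion
d = kappa1^2 + kappa2^2 + kappa1^2 * kappa2^2) the end-to-end SNDR of dual-hop
AF is capped at 1/d, so outage persists at high SNR. The paper derives exact
closed forms instead of simulating:

```python
import random

def outage_af(kappa1, kappa2, mean_snr, threshold, trials=50_000, seed=7):
    """Monte Carlo outage estimate for dual-hop AF over Rayleigh fading
    with transceiver impairment levels kappa1, kappa2 (assumed SNDR model)."""
    rng = random.Random(seed)
    d = kappa1**2 + kappa2**2 + (kappa1 * kappa2)**2
    out = 0
    for _ in range(trials):
        g1 = rng.expovariate(1.0) * mean_snr    # per-hop SNR, Rayleigh fading
        g2 = rng.expovariate(1.0) * mean_snr
        sndr = g1 * g2 / (g1 * g2 * d + g1 + g2 + 1)
        if sndr < threshold:
            out += 1
    return out / trials

# With d > 0 the SNDR can never exceed 1/d: an outage floor at high SNR.
p_ideal = outage_af(0.0, 0.0, mean_snr=100.0, threshold=5.0)
p_impaired = outage_af(0.3, 0.3, mean_snr=100.0, threshold=5.0)
```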
Keywords: |
Dual Hop Relaying, Amplify And Forward, Decode And Forward, Outage Probability,
Nakagami-M Fading |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
EVALUATION OF POWER DISTRIBUTION NETWORK SYSTEM BY USING SELF-ORGANIZING MAP (SOM)
AS THE BEST PRACTICE FOR BUSES CHARACTERISTIC CLASSIFICATION |
Author: |
M.F. SULAIMA, M.H. JALI, Z.H. BOHARI , M.F. BAHAROM, N. BAHARIN |
Abstract: |
The Artificial Neural Network is a technique modelled on the way the human
brain acts. Unsupervised training is a self-learning process that performs
classification without outside teaching help. The Self-Organizing Map is an
algorithm in which generated neurons organize themselves. This paper proposes
an analysis of the Self-Organizing Map (SOM) applied to the classification of
the standard IEEE 33-bus and 69-bus distribution data. The distribution bus
data were classified based on four main features: active power (MW), reactive
power (MVAr), apparent power (MVA), and power factor (pf). These features are
the input for the SOM classification. The analysis of the SOM results shows the
capability of this algorithm as a classification method for classifying
distribution bus data. |
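A minimal SOM training loop illustrates the self-organization step; the toy
two-feature bus vectors below are hypothetical stand-ins for the scaled
IEEE 33/69-bus features:

```python
import random, math

def train_som(data, rows=2, cols=2, iters=300, lr0=0.5, seed=5):
    """Minimal SOM: find the best-matching unit (BMU), then pull it and its
    grid neighbours toward the sample with a decaying rate and radius."""
    rng = random.Random(seed)
    dim = len(data[0])
    w = {(r, c): [rng.random() for _ in range(dim)]
         for r in range(rows) for c in range(cols)}
    sigma0 = max(rows, cols) / 2.0
    for t in range(iters):
        x = rng.choice(data)
        bmu = min(w, key=lambda k: sum((w[k][d] - x[d]) ** 2
                                       for d in range(dim)))
        lr = lr0 * math.exp(-t / iters)
        sigma = sigma0 * math.exp(-t / iters)
        for k in w:
            dist2 = (k[0] - bmu[0]) ** 2 + (k[1] - bmu[1]) ** 2
            h = math.exp(-dist2 / (2 * sigma ** 2))   # neighbourhood kernel
            for d in range(dim):
                w[k][d] += lr * h * (x[d] - w[k][d])
    return w

def classify(w, x):
    return min(w, key=lambda k: sum((w[k][d] - x[d]) ** 2
                                    for d in range(len(x))))

# Toy stand-ins for bus feature vectors, scaled to [0, 1] (hypothetical).
buses = [[0.1, 0.9], [0.15, 0.85], [0.9, 0.2], [0.95, 0.15]]
som = train_som(buses)
```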
Keywords: |
Self-Organizing Maps (SOM), IEEE 33-bus Distribution Data, IEEE 69-bus
Distribution Data |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
DISCRETE RISK-MODELS OF THE PROCESS OF THE DEVELOPMENT OF VIRUS EPIDEMICS IN
NON-UNIFORM NETWORKS |
Author: |
VICTORIA VIKTOROVNA ISLAMGULOVA, ALEXANDER GRIGORYEVICH OSTAPENKO, NIKOLAY
MIKHAILOVICH RADKO, RUSLAN KALANDAROVICH BABADZHANOV, OLGA ALEXANDROVNA
OSTAPENKO |
Abstract: |
The presented paper discusses networks with a non-uniform node degree during
the development of a virus epidemic. The study is based on a layerwise
simulation of the process of virus infection of a non-uniform network. That
approach is quite novel and little studied, but its practical application seems
promising. The feature of the proposed model is that all nodes are divided into
layers, which provides a certain hierarchical view of the network in the
context of its infection. It was established that an epidemic can develop
according to various scenarios: upward, downward, along layers and, finally, in
all the mentioned directions. For the calculations we propose to use a matrix
with layerwise internal coherence of the network, which considerably simplifies
the calculations. As a result of the study we obtained analytical expressions
for calculating risk, damage, and epidemic resistance on that basis. The
results allowed us to propose a generalized flowchart of the algorithm of
layerwise simulation of an epidemic process in a network (the "star parade"
algorithm). That algorithm has a number of advantageous properties: the
possibility to consider several sources of infection, heterogeneous node
values, mutations of a virus during an epidemic process, and correction for the
ratio of infected nodes at various stages of analysis. |
Keywords: |
Risk, Networks, Epidemic Resistance, Fractal, Epidemic. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
MULTI-CRITERIA DECISION MAKING FOR SELECTING SEMANTIC WEB SERVICE CONSIDERING
VARIABILITY AND COMPLEXITY TRADE-OFF |
Author: |
NANANG YUDI SETIAWAN, RIYANARTO SARNO |
Abstract: |
Business process decomposition helps to focus on the base business process
rather than on the whole set of business processes with their higher
complexity. Variability in business process decomposition leads to different
complexity levels within a common business process. By considering the
trade-off between business process variability and complexity, we selected the
best solution among semantic web service alternatives with different service
qualities. The selected web service should fulfil most of the service-quality
capabilities defined in the user’s preferences. For our case study we chose
Penerimaan Peserta Didik Baru, an online student admission system that
comprises the registration procedure for multiple school degrees, multiple
selection tracks (regular, inclusion, pre-welfare), and multiple phases for
every educational level. The base process was selected from four basic models,
and we identified the variant models representing the behavior of all possible
requirements. In this paper, semantic service selection is performed by
discovering semantic web services through their service registry. Web services
were annotated with their business process formalization and quality
attributes. However, the variability and complexity of business process
decomposition are contradictory and thus constitute a trade-off. We interpreted
this as a multi-criteria decision problem and proposed a Fuzzy AHP+TOPSIS
method to deliver the optimal solution reflecting the needs and preferences of
the decision maker. This approach proved able to solve the multi-criteria
decision problem of selecting the best semantic web service options considering
the trade-off between the variability and complexity of the business process.
In our study, we tested 153 service requests and obtained a precision of 91.3%
and a recall (sensitivity) of 89.4%, yielding a harmonic mean of precision and
recall of 0.903. Our approach succeeds in delivering the most preferred
business process variant with minimum complexity in accordance with the
acceptable service quality (service cost, capacity, and latency) delivered by
service providers. |
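The TOPSIS half of the proposed method can be sketched in plain (non-fuzzy)
form; in the paper the criterion weights come from Fuzzy AHP, whereas the
matrix, weights, and criteria below are hypothetical:

```python
import math

def topsis(matrix, weights, benefit):
    """Plain TOPSIS: vector-normalize, weight, then rank alternatives by
    relative closeness to the ideal solution."""
    m, n = len(matrix), len(matrix[0])
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m)))
             for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        dp = math.sqrt(sum((x - ideal[j]) ** 2 for j, x in enumerate(row)))
        dm = math.sqrt(sum((x - anti[j]) ** 2 for j, x in enumerate(row)))
        scores.append(dm / (dp + dm))
    return scores

# Hypothetical services scored on (cost, capacity, latency); cost and
# latency are "lower is better" criteria.
services = [[100, 50, 20], [80, 40, 30], [120, 60, 10]]
scores = topsis(services, weights=[0.4, 0.3, 0.3],
                benefit=[False, True, False])
best = max(range(len(scores)), key=scores.__getitem__)
```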
Keywords: |
Business Process, Variability, Complexity, Semantic Web Service, Multi-Criteria
Decision Making |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
Title: |
GELISP: A FRAMEWORK TO REPRESENT MUSICAL CONSTRAINT SATISFACTION PROBLEMS AND
SEARCH STRATEGIES |
Author: |
MAURICIO TORO, CAMILO RUEDA, CARLOS AGÓN, GÉRARD ASSAYAG |
Abstract: |
In this article we present Gelisp, a new library to represent musical Constraint
Satisfaction Problems and search strategies intuitively. Gelisp has two
interfaces, a command-line one for Common Lisp and a graphical one for OpenMusic.
Using Gelisp, we solved a problem of automatic music generation proposed by
composer Michael Jarrell and we found solutions for the All-interval series. |
Keywords: |
Constraint Satisfaction Problems, Openmusic, Automatic Music Generation, Search
Strategies, Visual Programming. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2016 -- Vol. 86. No. 2 -- 2016 |
Full Text |
|
|
|