Journal of Theoretical and Applied Information Technology
April 2017 | Vol. 95 No. 7
Title: |
MEASURING PROCESS INNOVATION ON DOUBLE-FLANKED CONCEPTUAL MODEL FOR KNOWLEDGE
SHARING ON ONLINE LEARNING ENVIRONMENT |
Author: |
S.M.F.D SYED MUSTAPHA, BIJU THERUVIL SAYED, ROSHAYU MOHAMAD |
Abstract: |
Various innovation models have been discussed in the literature, and their
adoption depends on organizational needs, business contexts, vision and
applications. These innovation models require an effective innovation process
framework to be followed. The SECI Model has been chosen here as the knowledge
creation model to facilitate innovation through knowledge sharing and creation.
While the literature shows that the SECI model has been applied in fields such
as management, manufacturing, education and business, very few studies have
considered it as an innovation tool for online learning environments. Knowledge
creation requires a community that has been enculturated with knowledge sharing
as part of its practices. For this purpose, the Community of Practice (CoP) has
been chosen as the essential setting for a prospective innovative community
and, consequently, for making the implementation of the SECI model a success. A
community with CoP values is postulated to provide the right organizational
setting for innovation. It is suggested that the SECI Model and CoP be
integrated into a new conceptual model, regarded as a double-flank strategy,
that synergizes to prepare the right community setting and to facilitate
innovation through knowledge creation. Subsequently, this paper proposes
methods and approaches for measuring innovativeness in an online learning
environment based on the double-flank conceptual model, called DFCMI. |
Keywords: |
Knowledge Management, SECI Model, Community of Practice, Online Learning,
Measuring Innovation |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
QUESTION ANSWERING SYSTEM SUPPORTING VECTOR MACHINE METHOD FOR HADITH DOMAIN |
Author: |
NABEEL NEAMAH, SAIDAH SAAD |
Abstract: |
Retrieving accurate answers for a user's query is the main issue in question
answering systems. Challenges such as analysing the intent of the user's query
and extracting accurate answers from a large corpus increase the difficulty of
developing an effective question answering system. This work aims to enhance
the accuracy of a question answering system for hadiths using a set of
complementary methods. Pre-processing methods such as tokenization and
stop-word removal are used to identify the main concepts of the user's query.
Answer processing methods and techniques such as N-grams, WordNet, cosine
similarity (CS), and longest common subsequence (LCS) are used to update and
enrich the extracted concepts of the user's query based on the formal
representation of the hadith answers or documents. Support Vector Machine (SVM)
and Named Entity Recognition (NER) methods are used to classify hadith
documents by relevant subject and question type in order to reduce the search
scope of the answer documents. Documents in the hadith corpus are classified
according to the proposed question types and related subjects into four main
classes: when for prayer, where for prayer, when for fasting, and where for
fasting. The SVM classification of documents is supported by NER methods that
identify the place (where) and time (when) features included in the documents.
The proposed question answering system is tested using 132 hadith documents
about fasting and prayer selected from the Al-Bukhari source. The findings
reveal that the average answer accuracy is 67% using the CS technique, 66%
using the LCS technique, 70% using the combination of CS and LCS, and 80% using
CS, LCS, and SVM together; SVM thus improves system accuracy by up to 10% over
the other methods without classification. The main contribution of this
research is the use of the SVM method to reduce the search scope of hadith
documents by subject and question type, together with effective analysis of
query intent using NLP methods. SVM provides more accurate answers than
extraction using only similarity techniques such as CS and LCS. |
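Since the abstract names cosine similarity (CS) and longest common subsequence
(LCS) as the ranking measures, a minimal self-contained sketch of combining the
two scores to rank candidate answer documents may help; the tokenization and
the equal weighting below are assumptions, not the authors' settings.

```python
# Hypothetical CS + LCS answer ranking; not the paper's implementation.
import math
from collections import Counter

def cosine_similarity(query_tokens, doc_tokens):
    q, d = Counter(query_tokens), Counter(doc_tokens)
    dot = sum(q[t] * d[t] for t in q)
    norm = (math.sqrt(sum(v * v for v in q.values()))
            * math.sqrt(sum(v * v for v in d.values())))
    return dot / norm if norm else 0.0

def lcs_length(a, b):
    # classic dynamic-programming longest common subsequence over tokens
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            dp[i + 1][j + 1] = dp[i][j] + 1 if x == y else max(dp[i][j + 1], dp[i + 1][j])
    return dp[-1][-1]

def rank_answers(query, documents):
    q = query.lower().split()
    scored = []
    for doc in documents:
        d = doc.lower().split()
        # combine CS with a length-normalized LCS, as in a CS+LCS configuration
        score = cosine_similarity(q, d) + lcs_length(q, d) / max(len(q), 1)
        scored.append((score, doc))
    return sorted(scored, reverse=True)
```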
Keywords: |
Question Answering System, Hadiths, Pre-processing, Answers Processing, SVM, NER. |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
A NOVEL ALGORITHM FOR BIG DATA CLASSIFICATION BASED ON LION OPTIMIZATION |
Author: |
NAVNEET, NASIB SINGH GILL |
Abstract: |
This paper develops a novel big data classification algorithm based on a
nature-inspired meta-heuristic, the lion optimization algorithm. The lion
optimization algorithm is modelled on the hunting and social behaviour of
lions. The developed algorithm uses K-means clustering to generate the pride
and nomad groups; the hunting and migration behaviour of the lions is then
iterated to update the prides and optimize the process. The proposed algorithm
is analysed by adding it to the WEKA library on an Intel i5 @ 2.67 GHz machine
using the Eclipse IDE, and it is examined on datasets having 400 instances with
25 attributes and 32,561 instances with 15 attributes. The algorithm has been
analysed on different datasets using TP rate, FP rate, accuracy, recall and
F-measure as parameters. The result analysis shows the effectiveness of the
optimization. |
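The abstract names K-means as the mechanism for generating the prides and the
nomad group. Below is a hypothetical sketch of that set-up step only; the group
sizes, features and the smallest-cluster nomad rule are invented, as the
paper's exact procedure is not given in the abstract.

```python
# K-means partitioning of a candidate population into prides plus nomads.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
population = rng.random((60, 5))          # 60 candidate solutions, 5 features

k = 4                                     # 3 prides + 1 nomad group (assumed)
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(population)

groups = [population[labels == i] for i in range(k)]
nomads = min(groups, key=len)             # treat the smallest cluster as nomads
prides = [g for g in groups if g is not nomads]
print([len(g) for g in prides], "nomads:", len(nomads))
```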
Keywords: |
Big Data, Lion Optimization Algorithm, Classification, Accuracy, Meta-Heuristic,
Nature Inspired Algorithm |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
KEY EXCHANGE AUTHENTICATION PROTOCOL FOR NFS ENABLED HDFS CLIENT |
Author: |
NAWAB MUHAMMAD FASEEH QURESHI, DONG RYEOL SHIN, ISMA FARAH SIDDIQUI |
Abstract: |
By virtue of its built-in processing capabilities for large datasets, the
Hadoop ecosystem has been utilized to solve many critical problems. The
ecosystem consists of three components: the client, the Namenode and the
Datanode, where the client is a user component that requests cluster operations
through the Namenode and processes data blocks at Datanodes running the Hadoop
Distributed File System (HDFS). Recently, HDFS introduced an add-on that
connects a client through the Network File System (NFS) and can upload and
download sets of data blocks over the Hadoop cluster. In this arrangement the
client is not considered part of HDFS and performs read and write operations
through a contrasting file system. This HDFS NFS gateway has raised many
security concerns, most notably: no reliable authentication for the upload and
download of data blocks, inefficient local and remote client connectivity, and
HDFS mount durability issues over untrusted connections. To stabilize the NFS
gateway strategy, this paper presents a Key Exchange Authentication Protocol
(KEAP) between the NFS-enabled client and the HDFS NFS gateway. The proposed
approach provides cryptographic assurance of authentication between clients and
the gateway, and the protocol enables local and remote clients to reduce the
problem of session lagging over server instances. Moreover, the
KEAP-NFS-enabled client increases durability through stabilized sessions and
increases ordered writes through HDFS trusted authorization. The experimental
evaluation shows that the KEAP-NFS-enabled client improves local and remote
client I/O stability, increases the durability of the HDFS mount, and manages
ordered and unordered writes over the HDFS Hadoop cluster. |
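The abstract does not spell out KEAP's message flow, so the sketch below is
only a generic illustration of the underlying idea rather than KEAP itself: a
Diffie-Hellman exchange establishes a shared session key between client and
gateway, after which each block request carries an HMAC tag. The parameters are
toy values for illustration; real systems use standardized groups and a vetted
protocol.

```python
# Generic DH key agreement + HMAC request authentication (not KEAP's design).
import hashlib, hmac, secrets

P = 2**127 - 1          # small Mersenne prime, for illustration only
G = 3

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

# client and gateway each generate a keypair and exchange public values
c_priv, c_pub = dh_keypair()
g_priv, g_pub = dh_keypair()
session_key = hashlib.sha256(str(pow(g_pub, c_priv, P)).encode()).digest()
assert session_key == hashlib.sha256(str(pow(c_pub, g_priv, P)).encode()).digest()

# every subsequent block upload/download request is authenticated with HMAC
def tag(request: bytes) -> bytes:
    return hmac.new(session_key, request, hashlib.sha256).digest()

print(tag(b"READ block_0001").hex()[:16])
```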
Keywords: |
Hadoop, HDFS, NFS Gateway, Security, Reliability. |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
ARBANTENOTONAN: A LEARNING MEDIA BASE ON AUGMENTED REALITY TRADITIONAL BALINESE
BIRTHDAY CEREMONY EQUIPMENT |
Author: |
A.A.K. OKA SUDANA, I WAYAN MEI SUJANA, NI KADEK DWI RUSJAYANTHI |
Abstract: |
The yadnya ceremony is an important part of life in Bali, and it is worth
preserving through ICT because of its meaning, functions and specific purposes,
which are shaped by the exotic culture and local values of Bali itself. The
Otonan ceremony is an important observance of manusia yadnya, which is part of
yadnya itself. It is a ceremony held to celebrate one's birth date based on the
Balinese Wuku calendar, and it is held once every 210 days, or once every six
months, with the purpose of purifying one's body physically and spiritually.
The production of Banten Otonan is increasingly being forgotten by the younger
generation, owing to limited information as well as the difficulties one may
face during the process. The processes of making Banten Otonan and Sampian are
therefore packaged as an Augmented Reality-based educational application for
Android smartphones, which is expected to help users recognize, appreciate and
understand the whole process of making Banten Otonan and Sampian. The
application uses 3-dimensional animated models as well as 3D animation video,
and it presents, in multimedia form, how to make Banten Otonan and Sampian. |
Keywords: |
Augmented Reality, Banten Otonan, Balinese Birthday Ceremony, Hindu Religion |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
MINIMIZING THE CHANNEL SWITCHING EVENTS FOR QOS-BASED ROUTING IN COGNITIVE RADIO
AD-HOC NETWORK |
Author: |
TAUQEER SAFDAR, HALABI HASBULLAH, TANZILA SABA, AMJAD REHMAN |
Abstract: |
Wireless network connectivity systems have very limited capacity to adapt to
changes caused by spectrum mobility and user interference while maintaining
Quality of Service (QoS) parameters during end-to-end routing in a Cognitive
Radio Ad-Hoc Network (CRAHN). Reconfiguring the network layer parameters of
secondary users is challenging when a primary user suddenly arrives on its
licensed channel or spectrum mobility occurs. Whenever a secondary user senses
primary user activity, referred to as user interference, it has to switch to
another available channel to continue its transmission. Channel switching
increases with user interference and spectrum mobility, which degrades the
average data rate and therefore directly affects QoS-based end-to-end routing
in CRAHN. Adding reinforcement learning techniques to network management can
reduce channel switching events and user interference, thereby improving
QoS-based routing. This paper presents a channel selection algorithm in a
cross-layer approach that minimizes the number of channel switching events for
QoS-based routing in CRAHN. The methodology is based on previously observed
network states of the primary user's channel occupancy, which the secondary
user exploits for future routing decisions. It can be implemented using a
learning agent in a cross-layer approach by modifying some existing routing
parameters of the Ad-Hoc On-Demand Distance Vector (AODV) routing protocol;
this also demonstrates how an existing routing protocol can be adapted for
CRAHN. We provide a self-contained learning method based on reinforcement
learning techniques that can be used to develop QoS-based routing protocols for
CRAHN. We simulated the proposed algorithm using the Cognitive Radio Cognitive
Network (CRCN) simulator based on NS-2, and the results are evaluated against
the existing AODV routing protocol and another routing protocol for CRAHN on
the basis of several QoS parameters. The proposed methodology demonstrates a
basic use of Artificial Intelligence in routing protocols for CRAHN. |
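The abstract leaves the learning agent unspecified, so the following is only a
bandit-style simplification of the reinforcement-learning idea it describes: an
agent learns from observed primary-user activity which channels tend to be
free, so the secondary user switches less often. The channel statistics,
rewards and parameters are synthetic stand-ins.

```python
# Epsilon-greedy value learning over channels (illustrative, not the paper's agent).
import random

N_CHANNELS = 5
busy_prob = [0.8, 0.6, 0.4, 0.2, 0.5]   # hypothetical primary-user activity
Q = [0.0] * N_CHANNELS
alpha, epsilon = 0.1, 0.1

channel, switches = 0, 0
for step in range(10_000):
    best = max(range(N_CHANNELS), key=lambda c: Q[c])
    choice = random.randrange(N_CHANNELS) if random.random() < epsilon else best
    if choice != channel:                # a channel-switching event
        switches += 1
        channel = choice
    # reward +1 when the channel is free of the primary user, -1 otherwise
    reward = -1.0 if random.random() < busy_prob[channel] else 1.0
    Q[channel] += alpha * (reward - Q[channel])

print("learned Q:", [round(q, 2) for q in Q], "switches:", switches)
```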
Keywords: |
Channel Switching; User Interference; Reinforcement Learning; Routing Protocols;
QoS. |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
A STUDY OF GENERATING ABSTRACT TEST FOR REQUIREMENTS VALIDATION AMONG
REQUIREMENTS ENGINEERS |
Author: |
NOR AIZA MOKETAR, MASSILA KAMALRUDIN, MOKHTAR MOHD. YUSOF, SAFIAH SIDEK, MARK
ROBINSON |
Abstract: |
Requirements testing, or requirements-based testing (RBT), is a software
testing technique that has been found effective for testing the completeness
and accuracy of requirements. The technique involves the systematic generation
of test cases from a model of the requirements specification, and it has been
applied in the requirements analysis phase to detect and eliminate requirements
defects before the next stage of a software development project. Although
useful, the technique is tedious and time-consuming when abstract tests are
generated manually from the requirements model. We argue, however, that this
tedious process can be minimised if requirements engineers have good ability
(skill) in generating abstract tests from requirements models for requirements
validation. This paper describes a study of requirements engineers manually
generating abstract tests from a requirements model, the Essential Use Cases
(EUC) model. From the results, we discover that software requirements engineers
are not well equipped with the skills and techniques to generate abstract tests
from requirements models. |
Keywords: |
Requirements Validation, Requirements-Based Testing, Abstract Tests, Test
Requirements, Test Cases |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
FINITE-DIFFERENTIAL SCHEME OF IDENTIFICATION OF OVERALL HEAT EXCHANGE
COEFFICIENT |
Author: |
A. BAIMANKULOV, A. ISMAILOV, T. ZHUASPAYEV |
Abstract: |
This work studies the one-dimensional problem of heat distribution in a
material. Measured values of the soil temperature and the near-ground air
temperature are given. An iterative method is proposed for identifying the
overall heat exchange coefficient of a multilayer material. The method is
realized with the aid of a finite-difference scheme whose solution converges to
the solution of the differential problem. Results of numerical solutions of
test problems are given. |
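The abstract gives no formulas, so the following is not the authors'
identification scheme; it is only a reference sketch of the forward-time,
centred-space finite-difference step for the 1-D heat equation u_t = a * u_xx
on which such schemes are built. The diffusivity, grid and boundary values are
assumed example numbers.

```python
# Explicit finite-difference march for 1-D heat conduction in a soil column.
import numpy as np

a = 1e-6                      # thermal diffusivity (m^2/s), assumed
nx, dx = 51, 0.02             # 51 grid nodes over a 1 m column
dt = 0.4 * dx**2 / a          # satisfies the stability bound dt <= dx^2/(2a)

u = np.full(nx, 283.0)        # initial soil temperature, 283 K everywhere
u_air, u_deep = 293.0, 283.0  # boundary data (near-ground air, deep soil)

for _ in range(1000):
    u[0], u[-1] = u_air, u_deep                      # Dirichlet boundaries
    u[1:-1] += a * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

print(u[:5].round(2))         # temperatures near the surface after 1000 steps
```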
Keywords: |
Heat Emission Coefficient, Finite-Differential Scheme, Iteration, Primal And
Conjugate Problems, Functional, Initial Boundary Conditions |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
INVESTIGATION ON BER SENSITIVITY TO DESIGN SNR OF BHATTACHARYYA BOUNDS BASED
CONSTRUCTION OF POLAR CODES |
Author: |
REDA BENKHOUYA, IDRISS CHANA, YOUSSEF HADI |
Abstract: |
Advanced coding has been widely used to meet the high-performance requirements
of wireless communications, and from the perspective of energy and spectral
efficiency, channel coding remains promising. To address this challenge,
research on linear block error-correcting codes has gained momentum. In this
paper we consider polar codes, which have proven to meet the typical use cases
of the next-generation mobile standard; this work is motivated by the
suitability of polar codes for the coming wireless era. We investigate the
performance of polar codes in terms of bit error rate (BER) for several
codeword lengths and code rates. We first perform a discrete search for the
best design signal-to-noise ratio (SNR) at two different code rates while
varying the blocklength. Our extensive simulations show that the BER becomes
more sensitive to the design SNR as the blocklength and code rate increase.
Finally, we note that increasing the blocklength achieves an SNR gain, while
increasing the code rate shifts the operational SNR region. This trade-off must
be taken into consideration when designing polar codes for high-throughput
applications. |
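As background to the Bhattacharyya-bound construction named in the title, here
is a common textbook sketch: under a BEC-style approximation at a given design
SNR, the Bhattacharyya parameter Z of each synthesized channel follows the
recursion Z_minus = 2Z - Z^2, Z_plus = Z^2, and the K smallest-Z positions
carry information bits. The initial Z heuristic and the index ordering below
are assumptions, not necessarily the paper's exact choices.

```python
# Bhattacharyya-parameter construction of a polar code information set.
import math

def polar_info_set(n_log2, K, design_snr_db):
    snr = 10 ** (design_snr_db / 10)
    z = [math.exp(-snr)]                  # initial Bhattacharyya parameter
    for _ in range(n_log2):               # channel-splitting recursion
        z = [2 * x - x * x for x in z] + [x * x for x in z]
    # indices of the K most reliable (smallest-Z) bit-channels
    return sorted(sorted(range(len(z)), key=lambda i: z[i])[:K])

# rate-1/2 polar code of blocklength 256 designed at 2 dB (example values)
print(polar_info_set(8, 128, 2.0))
```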
Keywords: |
Polar Codes, Bhattacharyya Parameter, Successive Cancellation Decoding, Design
SNR, BER |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
LINKING SOFTWARE ENGINEERING PARADIGMS TO ISLAMIC VALUES |
Author: |
BURHAN UL ISLAM KHAN, BISMA RASOOL, M. MUEEN UL ISLAM MATTOO, AJAZ AHMAD HURRAH,
BINYAMIN ADENIYI AJAYI, RASHIDAH F. OLANREWAJU |
Abstract: |
In general, Muslims all over the world have an innate tendency to hold fast to
Islam's teachings as narrated in the Quran and Hadith. The present study
investigates how this adherence can be utilized to improve the standards of
ethical behavior of Muslim IT professionals, particularly software engineers.
The principal aim of this paper is to develop the importance of ethics among
software engineers in order to make them realize the impact of various immoral
practices in their field, e.g., property violations, general software
upgrading, design methodology, software privacy, etc. The moral values put
forward in codes of conduct have been scrutinized from an Islamic point of view
by studying them in light of verses of the Quran and the Hadith of the Prophet
Muhammad (P.B.U.H.). It is high time for software engineers and developers to
accept the dire need for a paradigm shift in software engineering that
integrates divine revelation with reason. The paper therefore takes an Islamic
but global approach towards software engineering paradigms. |
Keywords: |
Code of Ethics, Ethical Issues, Information Ethics, Islamic Ethics, Software
Engineering Ethics |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
IMPLEMENTATION OF ASSOCIATION RULE METHOD AND TOPSIS METHOD TO DECISION SUPPORT
SYSTEM FOR DETERMINING EPIDEMIC DENGUE BASED ON RISK FACTORS ASSOCIATION |
Author: |
ERMATITA, FATMALINA FEBRY |
Abstract: |
The prevention of endemic outbreaks of dangerous diseases such as dengue fever
must be handled seriously to minimize the risk posed by the disease. Dengue
Hemorrhagic Fever (DHF) is a disease that can make an area endemic, and there
are various risk factors that can lead to endemic dengue. These risk factors
are usually associated with one another, which creates great potential for an
endemic dengue outbreak, so the problem must be addressed in order to protect
the population from dengue fever. This study developed a decision support
system for dengue prevention based on risk factor associations. The decision
support system is modelled using the Association Rule method combined with the
Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). The
result of this research is a recommendation for decision-making in the handling
of endemic dengue based on the associated risk factors, so that prevention of
endemic dengue fever can be addressed quickly. |
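TOPSIS itself is a standard procedure, so a compact sketch may help; the
decision matrix, weights and criterion directions below are invented examples,
not the study's dengue risk factors.

```python
# Standard TOPSIS ranking: closeness to the ideal solution.
import numpy as np

X = np.array([[7., 9., 9.], [8., 7., 8.], [9., 6., 8.], [6., 7., 8.]])
w = np.array([0.5, 0.3, 0.2])            # criterion weights (e.g. from AHP)
benefit = np.array([True, True, False])  # False marks a cost criterion

V = w * X / np.linalg.norm(X, axis=0)    # weighted normalized matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)
print("ranking (best first):", np.argsort(-closeness))
```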
Keywords: |
Decision Support Systems, TOPSIS, Endemic, Dengue Fever |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
TECHNIQUES FOR HANDLING IMBALANCED DATASETS WHEN PRODUCING CLASSIFIER MODELS |
Author: |
ROZIANIWATI YUSOF, KHAIRUL AZHAR KASMIRAN, AIDA MUSTAPHA, NORWATI MUSTAPHA, NOR
ASMA MOHD ZIN |
Abstract: |
Imbalanced datasets are a well-known problem in data mining: a dataset is
composed of two classes, a majority class and a minority class, where the
majority class has many more instances than the minority class. Recent years
have brought increased interest in handling imbalanced datasets, since many of
the datasets produced in practice are naturally imbalanced. Most existing
classification techniques ignore the imbalance and focus on the overall
accuracy of the model produced, which biases the model toward the majority
class while giving poor accuracy on the minority class. Although the minority
class may describe events that rarely happen, in some settings it is the
minority class that matters most for the classifier model. This paper surveys
the techniques for handling imbalanced datasets and compares them with respect
to producing the best classifier model for imbalanced data. The techniques are
categorized into sampling, feature selection and algorithmic approaches,
organised as a taxonomy for handling imbalanced datasets. The strengths and
weaknesses of these approaches are discussed in order to identify an
appropriate technique for improving the performance of the resulting classifier
model. Recent trends in handling imbalanced datasets are also discussed by
domain and by the problems present in the dataset. |
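As a concrete instance of the sampling category in such a taxonomy, here is a
minimal random-oversampling sketch on synthetic data (undersampling and SMOTE
are variations on the same move); the dataset and imbalance ratio are made up.

```python
# Random oversampling of the minority class with plain NumPy.
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 4))
y = (rng.random(1000) < 0.05).astype(int)     # ~5% minority class

minority = np.flatnonzero(y == 1)
majority = np.flatnonzero(y == 0)
# resample minority indices with replacement until the classes are balanced
extra = rng.choice(minority, size=len(majority) - len(minority), replace=True)
idx = np.concatenate([majority, minority, extra])
rng.shuffle(idx)

X_bal, y_bal = X[idx], y[idx]
print("before:", np.bincount(y), "after:", np.bincount(y_bal))
```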
Keywords: |
Imbalanced Data, Sampling, Feature Selection, Cost Sensitive Learning,
Classification |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
ENHANCING THE HIDING CAPACITY OF AUDIO STEGANOGRAPHY BASED ON BLOCK MAPPING |
Author: |
AHMED HUSSAIN ALI, MOHD ROSMADI MOKHTAR, LOAY EDWAR GEORGE |
Abstract: |
With the rapid growth in the exchange of personal and confidential data over
insecure channels such as the Internet, and its exposure to disclosure by
intruders, information security has become a pressing need. As a result, data
hiding, or steganography, has emerged as a vital solution. Audio hiding is the
concept of injecting secret data into an audio carrier. This paper proposes a
scheme, ECA-BM, to improve the performance of audio steganography. ECA-BM
contributes by (1) increasing the hiding capacity, (2) maintaining the
transparency of the carrier and (3) enhancing the security of the proposed
model. To increase the hiding capacity, fractal coding is adopted to create a
mapping between cover and secret blocks, encoding the secret data into a set of
coefficients of minimal size. To maintain the fidelity of the stego file, only
the 1-LSB of each cover sample is used for embedding. To increase the security
of ECA-BM, the cover samples used for embedding are selected in a chaotic
manner. The LSB technique is used for embedding after converting the secret
coefficients into a binary sequence. The objective metrics SNR, HC, and NC are
used to evaluate the performance of ECA-BM. The experimental results show a
significant increase in hiding capacity compared with related studies, while
the fidelity of the stego file and the reconstructed secret file is
preserved. |
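Only the 1-LSB embedding step is concrete enough in the abstract to illustrate;
the fractal-coding and chaotic sample-selection stages are omitted from this
minimal sketch, and the cover signal is synthetic.

```python
# 1-LSB embedding/extraction in 16-bit audio samples (illustrative only).
import numpy as np

def embed_lsb(samples: np.ndarray, bits: np.ndarray) -> np.ndarray:
    stego = samples.copy()
    stego[: len(bits)] = (stego[: len(bits)] & ~1) | bits
    return stego

def extract_lsb(stego: np.ndarray, n_bits: int) -> np.ndarray:
    return (stego[:n_bits] & 1).astype(np.uint8)

cover = np.random.default_rng(1).integers(-2**15, 2**15, 1000, dtype=np.int16)
secret = np.frombuffer(b"hi", dtype=np.uint8)
bits = np.unpackbits(secret)

stego = embed_lsb(cover, bits)
assert bytes(np.packbits(extract_lsb(stego, bits.size))) == b"hi"
print("max sample distortion:", np.abs(stego - cover).max())  # at most 1
```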
Keywords: |
Fractal Coding, Iteration Function System (IFS), Least Significant Bit (LSB),
Steganography, Chaotic map. |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
PROPOSITION OF A MODEL FOR MULTI-PERIOD WORKFORCE ASSIGNMENT PROBLEM CONSIDERING
VERSATILITY |
Author: |
ABDELHAMID ZAKI, MOHAMMED BENBRAHIM, BAHIA BENCHEKROUN, GHASSANE AYAD |
Abstract: |
Workforce assignment becomes more complex when operators have multiple
competencies and their efficiency changes according to the activities they are
assigned to. In this context, little work has considered the learning curve
effect. In this paper, we discuss a multi-period assignment problem that
considers the versatility of operators, which induces a dynamic view of their
competencies and the need to predict changes in individual performance as a
result of successive assignments. We are in a context where the expected
durations and the expected execution quality of activities are no longer
deterministic but result from the performance of the operators selected to
execute them. We present a mathematical model of this problem and a genetic
algorithm approach for solving the multi-period workforce allocation problem. |
Keywords: |
Competence, Multi-Skilled Workforce, Individual Competence Level, Versatility,
Multi-Period Assignment Problem, Performance. |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
PROPOSED HYBRID METHOD TO HIDE INFORMATION IN ARABIC TEXT |
Author: |
SUHAD M. KADHEM, DHURGHAM W. MOHAMMED ALI |
Abstract: |
This paper proposes a new approach to hiding English text (the secret data) in
Arabic text (the cover medium). The secret text passes through several steps
before it is finally embedded in the cover text. In the proposed coding step,
each English character is converted to a binary code through secret tables held
on both sides (sender and receiver), which also compresses the data. The output
of the proposed coding is two parts, which become the input to the next steps.
The next step, Modified RNA Codon (MRNAC), takes the first part of the binary
code and returns a stream of bits ready for the embedding step. After that,
Modified Run-Length Encoding (MRLE) takes the second part produced by the
proposed coding method, which always contains a sequence of ones with few
zeros, and applies RLE to it. The last step is embedding, which uses specific
Arabic Unicode characters and non-printing characters to embed the secret
information, providing complete similarity between the cover text and the stego
text, since these characters do not appear when the text is displayed. |
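The paper's specific character tables are secret by design, so the sketch below
only illustrates the general "non-printing characters" idea with zero-width
Unicode characters, a common approach that is not necessarily the one used in
the paper; the payload placement rule is invented.

```python
# Zero-width-character text steganography (illustrative, hypothetical scheme).
ZW0, ZW1 = "\u200b", "\u200c"   # zero-width space / zero-width non-joiner

def embed(cover: str, secret: bytes) -> str:
    bits = "".join(f"{b:08b}" for b in secret)
    hidden = "".join(ZW1 if bit == "1" else ZW0 for bit in bits)
    return cover[:1] + hidden + cover[1:]   # tuck payload after first letter

def extract(stego: str) -> bytes:
    bits = "".join("1" if c == ZW1 else "0" for c in stego if c in (ZW0, ZW1))
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

stego = embed("السلام عليكم", b"hi")
assert extract(stego) == b"hi"
print(stego)   # renders identically to the cover text
```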
Keywords: |
Security, Steganography, RNA, Codon, Coding |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
JOINT ENCRYPTION AND WATERMARKING TECHNIQUE USING BLOCK CIPHER AND WAVELET |
Author: |
B. SRIDHAR |
Abstract: |
This paper proposes a joint encryption and watermarking technique based on
random block permutation and the DWT, with the motivation of enhancing the
security of multimedia content. The original image is sectioned into blocks,
which are shuffled using a random permutation; the copyright information is
then concealed in the encrypted image. The results show that the permutation of
blocks is effective in significantly reducing correlation, thereby reducing the
level of perceptual information and producing a higher level of security. The
watermarked crypto-image can be distributed freely over the channel with
enhanced security, because the scheme combines encryption with watermarking. |
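The block-permutation stage can be illustrated directly (the DWT watermark
embedding is omitted): the image is cut into blocks and shuffled with a
key-seeded permutation, which destroys inter-block correlation while remaining
exactly invertible. The block size and key below are assumptions.

```python
# Key-seeded random block permutation of an image, with exact inversion.
import numpy as np

def permute_blocks(img: np.ndarray, key: int, bs: int = 8, inverse=False):
    h, w = img.shape
    blocks = (img.reshape(h // bs, bs, w // bs, bs)
                 .swapaxes(1, 2).reshape(-1, bs, bs))
    perm = np.random.default_rng(key).permutation(len(blocks))
    if inverse:
        out = np.empty_like(blocks)
        out[perm] = blocks          # undo the shuffle
    else:
        out = blocks[perm]          # apply the shuffle
    return (out.reshape(h // bs, w // bs, bs, bs)
               .swapaxes(1, 2).reshape(h, w))

img = np.arange(64 * 64, dtype=np.uint8).reshape(64, 64)
enc = permute_blocks(img, key=1234)
assert np.array_equal(permute_blocks(enc, key=1234, inverse=True), img)
```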
Keywords: |
Copyright Protection, Encryption, Random Permutation, Watermarking, Wavelet
Transform |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
GENERATING AND EXPANDING OF AN ENCRYPTION KEY BASED ON KNIGHT TOUR PROBLEM |
Author: |
ALI SHAKIR MAHMOOD, MOHD SHAFRY MOHD RAHIM |
Abstract: |
The encryption key is a vital part of the design of a cryptosystem, and such
keys must be as random as possible. The ability to regenerate the same sequence
from a small initial value remains a major problem facing the designers of
encryption key systems. This paper designs a new random number generation
method with the ability to expand the generated encryption key to fit the
relevant image size. The knight's tour problem is employed as a random number
generator and used for encryption key expansion. The expansion process consists
of two steps: the first generates a random key of size 64 x 64, and the second
takes the boundary numbers from the previous step as initial values for further
knight's tours, continuing until the image size is reached. The random numbers
obtained from the knight's tour generator were subjected to the NIST 800-22
statistical test suite and passed all tests without requiring any additional
processing. These results demonstrate that the proposed system meets the
security requirements and can be used in cryptographic applications.
Furthermore, the knight's tour generator needs only a small initial value, and
it regenerates the same sequence when fed the same initial value. |
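As an illustration of a keyed, reproducible generator of this kind, here is a
knight-move walk sketch. Note that it allows revisited squares, so it is not a
strict knight's tour, and the seeding scheme is invented; the paper's expansion
procedure is more elaborate.

```python
# Keyed knight-move walk: same seed reproduces the same output sequence.
import random

MOVES = [(1, 2), (2, 1), (2, -1), (1, -2), (-1, -2), (-2, -1), (-2, 1), (-1, 2)]

def knight_sequence(seed: int, n: int = 64, length: int = 64):
    rng = random.Random(seed)                # keyed: same seed, same sequence
    r, c = seed % n, (seed // n) % n
    out = []
    while len(out) < length:
        moves = [(r + dr, c + dc) for dr, dc in MOVES
                 if 0 <= r + dr < n and 0 <= c + dc < n]
        r, c = rng.choice(moves)             # next square of the walk
        out.append(r * n + c)                # emit the visited square index
    return out

print(knight_sequence(seed=2017)[:10])
print(knight_sequence(seed=2017)[:10])      # identical: reproducible from seed
```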
Keywords: |
Random Number, Encryption Key, Knight Tour, Key Expansion, (NIST) Randomness
Tests |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
TIME COMPLEXITY COMPARISON BETWEEN AFFINITY PROPAGATION ALGORITHMS |
Author: |
R. REFIANTI, A.B. MUTIARA, S. GUNAWAN |
Abstract: |
Affinity Propagation is a clustering technique that uses iterative message
passing and considers all data points as potential exemplars. It is praised for
providing good clustering results with a low error rate, but it has several
drawbacks, such as quadratic computational cost and vaguely specified
preference values. Much research has tried to address these drawbacks to
improve the speed and quality of Affinity Propagation, yet there has been no
test to determine which Affinity Propagation extension is fastest. This has led
us to compare the performance of several Affinity Propagation extension
algorithms: Adaptive Affinity Propagation, Partition Affinity Propagation,
Landmark Affinity Propagation, and K-AP. Two comparisons are made in this
paper, a theoretical analysis and a running-time test. From both comparisons we
find that Landmark Affinity Propagation has the most efficient computational
cost and the fastest running time, although its clustering result differs
considerably from Affinity Propagation in the number of clusters. |
Keywords: |
Affinity Propagation, Availability, Clustering, Exemplar, Responsibility,
Similarity Matrix |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
PERFORMANCE EVALUATION OF SHADOW DETECTION AND REMOVAL FROM HIGH RESOLUTION
IMAGES USING K-MEANS ALGORITHM AND IOOPL MAPPING ALGORITHM |
Author: |
ARULANANTH T S, ARUL DALTON G |
Abstract: |
Shadow detection and removal is essential for further processing of images.
This paper puts forward an object-oriented shadow detection and removal
technique for high-resolution color remote sensing images. In this method,
shadow features are taken into consideration during image segmentation, and
suspected shadows are extracted according to the statistical features of the
images. Moreover, some dark objects that could be mistaken for shadows are
ruled out according to object properties and the spatial relationships between
objects. We then use Inner-Outer Outline Profile Line (IOOPL) matching for
shadow removal: the IOOPLs are obtained with respect to the boundary lines of
shadows, and shadow removal is performed based on the homogeneous sections
attained through IOOPL similarity matching. To perform the matching, we extract
the inner and outer outline lines of the shadow boundaries; the grayscale
values of the corresponding points of the inner and outer outline lines form
the Inner-Outer Outline Profile Lines (IOOPL). |
Keywords: |
IOOPL, K-Means clustering, Shadow detection, Shadow removal, Reconstruction |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
MULTI CRITERIA SOFTWARE QUALITY ASSESSMENT OF OPEN SOURCE CONTENT MANAGEMENT
SYSTEM |
Author: |
EMA UTAMI, JAMAL |
Abstract: |
The purpose of this study is to compare the software quality of five
open-source e-commerce Content Management System (CMS) web applications, and
thus to recommend the design model with the best quality to web developers,
businesses and beginners building an e-commerce website. The study examines the
web applications Prestashop, Magento, Woocommerce, Oscommerce and Opencart.
Software quality is measured using traditional metrics and the CK metrics
suite, computed with the PHP Depend tool. The quantitative assessments of the
traditional metrics, the CK metrics suite and the software quality factors are
combined to identify the best-quality web application using the Analytical
Hierarchy Process (AHP) together with the Technique for Order Preference by
Similarity to Ideal Solution (TOPSIS). The quality determination proceeds in
two main stages: first, AHP is used to find the weights of the traditional
metrics, the CK metrics suite and the software quality factor parameters;
second, TOPSIS is used to compute the final scores and the ranking. The results
of this study indicate that Prestashop has the best software quality compared
with Woocommerce, Oscommerce, Magento, and Opencart. |
Keywords: |
Traditional Metrics, CK Metrics Suite, Software Quality Factors, AHP, TOPSIS |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
LONG-TERM DEEP LEARNING LOAD FORECASTING BASED ON SOCIAL AND ECONOMIC FACTORS IN
THE KUWAIT REGION |
Author: |
SALMAN ZAKARYA, HALA ABBAS, MOHAMMED BELAL |
Abstract: |
Load forecasting (LF) is a technique used by energy providers to predict the
power that will be needed. LF is of great importance for ensuring sufficient
capacity and for managing the deregulation of the power industry in many
countries, such as the Arab Gulf states; moreover, reducing load forecasting
error lowers costs and could save billions of dollars. Recently, further
improvements have been introduced using more complex models that take into
account dependencies among hidden layers, and many model-based approaches have
been presented, but all of them have limited prediction capabilities. The
purpose of this work is to describe the classes of load forecasting and the
factors affecting its performance, especially in the Kuwait region of the Arab
Gulf, and to present a novel deep learning model that generates more accurate
predictions of the electric load based on a hierarchical learning architecture.
The model integrates data features to discover the most influential factors
affecting electrical load usage. The dataset is the actual long-term load data,
in megawatts, of the Ministry of Electricity in Kuwait for the years 2006 to
2015, on which ARIMA and neural network models are trained. The load forecast
is produced for the year 2016 and validated for accuracy and error rate.
Results indicate that this architecture performs quite well when compared to
traditional approaches and a plain deep neural network. |
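The ARIMA baseline named in the abstract can be sketched with statsmodels; the
series below is synthetic (the ministry data is not reproduced here) and the
order (1, 1, 0) is an assumed example, not the paper's fitted model.

```python
# ARIMA fit and one-step-ahead forecast on a synthetic annual load series.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

years = pd.date_range("2006", periods=10, freq="YS")
load_mw = pd.Series(
    8000 + 450 * np.arange(10) + np.random.default_rng(0).normal(0, 120, 10),
    index=years,
)   # stand-in for annual load in MW, 2006-2015

model = ARIMA(load_mw, order=(1, 1, 0)).fit()
forecast_2016 = model.forecast(steps=1)   # the held-out year
print(forecast_2016)
```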
Keywords: |
Power Electricity, Load Forecasting, ARIMA, Regression, Long-Term, Prediction,
Deep Learning |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
SPECTRAL CLUSTER BASED TEMPORAL FEATURE EXTRACTION AND B-TREE INDEXING FOR VIDEO
RETRIEVAL |
Author: |
RENUKADEVI S., Dr. S. MURUGAPPAN |
Abstract: |
Storage and retrieval of video data may seem a simple and straightforward task,
but it becomes non-trivial when information must be retrieved from within the
video data. Recently, many research works have addressed video indexing and
retrieval, but an effective video indexing and retrieval method is still
needed. To overcome this limitation, the Spectral Cluster based temporal
feature extraction and B-Tree indexing (SC-BT) model is proposed in this paper.
The SC-BT model is designed to achieve a higher video retrieval rate and to
reduce video retrieval time. First, the SC-BT model uses a spectral clustering
algorithm to extract video features from the collection of video frames and to
cluster the video clips in the dataset. Next, the SC-BT model uses B-tree
indexing to index the clustered video clips in N-dimensional space by their
features, with the objective of improving the true positive rate of video
retrieval and reducing the video retrieval time. Finally, the SC-BT model
extracts the video clips most similar to the user query by evaluating the
features observed through the co-visibility graph, with the spectral clustering
recomputed in each iteration. The performance of the SC-BT model is evaluated
on a sports-repository dataset using parameters such as spectral clustering
time, spectral clustering accuracy, true positive rate of video retrieval and
video retrieval time. The experimental results show that the SC-BT model
improves the true positive rate of video retrieval by 12% and reduces video
retrieval time by 45% compared to state-of-the-art works. |
Keywords: |
Video, Indexing, Retrieval, Spectral Clustering, Video Frames, B tree indexing,
User query |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
A HYBRID METHOD OF FEATURE EXTRACTION AND NAIVE BAYES CLASSIFICATION FOR
SPLITTING IDENTIFIERS |
Author: |
NAHLA ALANEE, MASRAH AZRIFAH AZMI MURAD |
Abstract: |
Nowadays, integrating natural language processing techniques into software
systems has caught many researchers' attention. Such integration can be
represented by analyzing the morphology of the source code in order to extract
meaningful information. Feature location is the process of identifying the
portions of source code that implement specific functionality, and some of the
most important information in source code lies in the identifiers (e.g.
Student). Unlike traditional text, identifiers in source code are formed as
multi-words, such as Employee-Name. Such multi-words are not divided by white
space; instead they may be formed using special characters (e.g. Employee_ID),
CamelCase (e.g. EmployeeName) or abbreviations (e.g. EmpNm). This makes the
process of extracting such identifiers more challenging. Several approaches
have been proposed to resolve the problem of splitting multi-word identifiers,
but there is still room for improvement in terms of accuracy. Such improvement
can be achieved by utilizing more robust features with the ability to analyse
the morphology of identifiers. Therefore, this study proposes a hybrid method
of feature extraction and a Naïve Bayes classifier for separating multi-word
identifiers within source code. The dataset used in this study is a benchmark
of annotated data containing a large number of Java programs. Multiple
experiments were conducted to evaluate the proposed features independently and
in combination. The results show that the combination of all features obtains
the best accuracy, achieving an F-measure of 64.7%. This finding implies the
usefulness of the proposed features for discriminating multi-word
identifiers. |
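The rule-based part of identifier splitting is easy to make concrete; the
sketch below handles separators and CamelCase with a regular expression,
leaving abbreviations like EmpNm as the residue that motivates a trained
classifier such as the paper's Naive Bayes. It is an illustration, not the
paper's feature set.

```python
# Separator- and CamelCase-based identifier splitting.
import re

def split_identifier(name: str) -> list[str]:
    # break on separators, then on lower->Upper and acronym boundaries
    parts = re.split(r"[_\-\s]+", name)
    tokens = []
    for part in parts:
        tokens += re.findall(r"[A-Z]+(?=[A-Z][a-z])|[A-Z]?[a-z]+|[A-Z]+|\d+", part)
    return [t for t in tokens if t]

for ident in ["Employee_ID", "EmployeeName", "parseHTTPResponse", "EmpNm"]:
    print(ident, "->", split_identifier(ident))
# EmpNm -> ['Emp', 'Nm']: expanding such abbreviations is where a learned
# classifier takes over from the regular expression.
```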
Keywords: |
Feature Location, Split Identifiers, Feature Extraction, Naive Bayes, Source
Code |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
FPGA-BASED HIGH SPEED BLOWFISH ALGORITHM |
Author: |
SOUFIANE OUKILI, SEDDIK BRI |
Abstract: |
Nowadays, owing to the rapid growth of information and communication
technology, security has become an essential element of all systems and
applications. In this context, high-speed, high-volume secure communication has
been a high-priority and challenging research area in both mathematics and
engineering. In this paper, we present a high-speed hardware architecture for
the Blowfish cryptographic algorithm. We use pipelining to allow parallel
processing and obtain high throughput. In addition, a 5-stage pipelined round
of the Blowfish algorithm is proposed to increase the speed and the maximum
operating frequency. Furthermore, the S-box tables of each round of the
algorithm are implemented in block RAMs to allow parallel data encryption. The
proposed design has been successfully implemented on FPGA devices and improves
data throughput by 104%. |
Keywords: |
Security, Cryptography, Blowfish, Pipeline, High Speed, FPGA |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|
Title: |
SEGMENTATION OF CORONARY ARTERIAL TREE USING LOCALIZED DEFORMABLE MODEL EMBEDDED
WITH AUTOMATED SEEDS |
Author: |
SAMMER ZAI, MUHAMMAD AHSAN, YOUNG SHIK MOON |
Abstract: |
This paper presents a fully automatic approach for isolating the left and right
coronary arteries in CTA images by embedding our improved fast seed detection
method into a localized active contour model. Active contour based methods
usually require a starting point, known as a seed, for their evolution, and
accurate placement of this seed point leads to accurate segmentation; manual
seeding requires expertise and may lead to wrong segmentations. Therefore, in
this paper we combine quantile- and median-thresholded Hessian-based vesselness
with local geometric features of the vessel to detect the coronary seed points
accurately and automatically. The detected seed points are then fed to the
active contour model, which evolves in a localized way to track the coronary
arteries to their distal ends. The obtained seed points, as well as the
segmented left and right coronary arteries, are verified by a radiologist at
each step. The method is evaluated and validated on nine real clinical CTA
datasets and compared with the previous methods of Lankton et al. and Khedmati
et al. Experimental results reveal that the proposed method outperforms the
previous methods both qualitatively and quantitatively. |
Keywords: |
Computed Tomography Angiography, Coronary arteries, Hessian-based vesselness,
Coronary Artery Disease, Deformable Model. |
Source: |
Journal of Theoretical and Applied Information Technology
15th April 2017 -- Vol. 95. No. 7 -- 2017 |
Full Text |
|