|
Submit Paper / Call for Papers
The journal receives papers in a continuous flow and considers articles
from a wide range of Information Technology disciplines, from the most
basic research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF, or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please include all your personally identifiable information in the
manuscript before submitting it for review; we will remove the necessary
information on our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
|
|
|
Journal of
Theoretical and Applied Information Technology
April 2018 | Vol. 96
No. 8 |
Title: |
OFFLINE HANDWRITTEN SIGNATURE RECOGNITION USING HISTOGRAM ORIENTATION GRADIENT
AND SUPPORT VECTOR MACHINE |
Author: |
NIDAA HASAN ABBAS, KHALED N. YASEN, KAMARAN HAMA ALI FARAJ, LWAY FAISAL A.RAZAK,
FAHAD LAYTH MALALLAH |
Abstract: |
Research on human authentication by the offline handwritten signature
biometric has been increasing, especially in the last decade. Verifying an
offline handwritten signature is not a trivial task, because an individual
rarely signs exactly the same signature twice, a phenomenon referred to as
intra-user variability. The objective of this paper is to propose a feature
vector for an offline handwritten signature using an efficient
feature-extraction algorithm, the Histogram of Oriented Gradients (HOG), and
to pass it to a Support Vector Machine (SVM) classifier for recognition. An
experiment was conducted to estimate the accuracy and performance of the
proposed algorithm on the SIGMA database, which contains more than 6,000
genuine and 2,000 forged signature samples taken from 200 individuals. The
results show a success rate of 96.8%, with a False Accept Rate (FAR) of 3%
and a False Reject Rate (FRR) of 3.35%. |
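The pipeline above (HOG features fed to an SVM) hinges on the orientation histogram each HOG cell computes. As a rough illustration only, not the authors' implementation (their cell size, binning, and block normalisation are not stated here), a single-cell histogram can be sketched in plain Python:

```python
import math

def hog_cell_histogram(cell, bins=9):
    """Orientation histogram for one cell of a grayscale image
    (given as a list of rows). Gradients are taken with central
    differences; each interior pixel votes its gradient magnitude
    into one of `bins` unsigned-orientation bins over 0..180 degrees,
    as in a single HOG cell."""
    h, w = len(cell), len(cell[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = cell[y][x + 1] - cell[y][x - 1]
            gy = cell[y + 1][x] - cell[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[min(int(ang / (180.0 / bins)), bins - 1)] += mag
    total = sum(hist) or 1.0
    return [v / total for v in hist]  # L1 normalisation

# A vertical edge: every gradient points horizontally, so all the
# votes land in the first (0-degree) bin.
cell = [[0, 0, 10, 10]] * 4
print(hog_cell_histogram(cell))
```

A real detector would tile the signature image into cells, normalise the histograms over overlapping blocks, concatenate them into the feature vector, and train the SVM on those vectors.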
Keywords: |
Offline Signature, Biometrics, Verification, Histogram of Oriented Gradients
(HOG), Support Vector Machine (SVM) |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
PEDESTRIAN DETECTION USING CATADIOPTRIC SENSOR |
Author: |
BOUI MAROUANE, HADJ-ABDELKADER HICHAM, ABABSA FAKHR-EDDINE, ABOUYAKHF EL
HOUSSINE |
Abstract: |
We investigate the detection of persons in omnidirectional images using a
linear SVM. We implemented HOG-based descriptors for omnidirectional and
spherical images. In this paper we study the influence of each parameter of
our algorithm on the performance of person detection in catadioptric images.
Few studies have addressed the problem of human detection with this type of
camera, so we set up our own test base. Our results show that our detector
can robustly detect people in omnidirectional images, provided the algorithm
is adapted to the distortions introduced by the omnidirectional camera. |
Keywords: |
Omnidirectional Sensor, HOG, Human Detection, Spherical Images. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
HUMAN FACTORS FOR IOT SERVICES UTILIZATION FOR HEALTH INFORMATION EXCHANGE |
Author: |
MOHAMMED AHMED DAUWED; JAMAIAH YAHAYA; ZULKEFLI MANSOR; ABDUL RAZAK HAMDAN |
Abstract: |
Currently, the exchange of patient information remains a challenge. The
growing demand for health care makes it necessary to exchange health-related
information efficiently. Internet of Things (IoT) services make such data
easily available to health professionals, can improve quality of life, and
support health care professionals in their decision-making. Health records
can be exchanged easily through the IoT network: the IoT is a growing
technology that integrates smart devices, resources, and systems so that
drugs, treatments, and patient health records can be discovered in one
network. Despite the advantages of this technology, healthcare organizations
face many challenges in utilizing it, especially in the context of
developing countries. The researchers therefore carried out a literature
survey to investigate the current issues and the factors that affect medical
professionals and IT practitioners in using IoT for health information
exchange. The purpose of this study is to help researchers and practitioners
develop models for utilizing the IoT in health information exchange among
healthcare providers. The review found the main critical factors to be
intention to use, user satisfaction, collaboration environment, trust,
effort, and service quality, all of which must be taken into
consideration. |
Keywords: |
Internet of Things; Health Information Exchange, Healthcare System; Utilization
IoT; Human Factors. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
A NOVEL ALGORITHM TO REDUCE PEAK-TO-AVERAGE POWER RATIO OF ORTHOGONAL FREQUENCY
DIVISION MULTIPLEXING SIGNALS |
Author: |
MANZOOR AHMED HASHMANI, FARZANA RAUF ABRO, MUKHTIAR ALI UNAR, JUNZO WATADA |
Abstract: |
It has been observed over the last several decades that applications'
appetite for bandwidth is never satisfied. Hence, scientists, researchers,
and engineers keep working on new ways of providing higher bandwidth. A
modulation technique called Orthogonal Frequency Division Multiplexing
(OFDM) provides very high data rates. In OFDM the high-frequency input
signal is modulated onto a large number of low-frequency sub-carrier signals
that are orthogonal to each other. This makes it very robust against
efficiency degradation at higher frequencies, which is why OFDM is the
choice for modern high and ultra-high data rate communication systems.
However, it suffers from a high ratio of peak power to average power, called
the Peak-to-Average Power Ratio (PAPR). Reducing PAPR in OFDM is an active
research area, and many schemes attempt it. Some do reduce PAPR, but not
sufficiently to be feasible; others reduce it but increase complexity to an
extent that makes them unrealizable. The literature indicates that Selective
Mapping (SLM) performs better than other methods in terms of computational
complexity at the same performance level. The main motivation behind this
research is to find a mechanism that reduces PAPR for OFDM systems with a
reasonable level of complexity, so that it is realizable. As the outcome of
this research, a novel framework based on Artificial Neural Networks (ANN)
and SLM is proposed. The kernel used by the ANN in the proposed framework is
our modified version of an existing kernel, the Novel Kernel Based Radial
Basis Function (NKB-RBF). We show through simulation results that our
proposed kernel, the Modified NKB-RBF (MNKB-RBF), is more efficient than
NKB-RBF and gives better results in selecting the low-frequency sub-carriers
with the lowest PAPR. |
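PAPR itself has a compact definition: the peak instantaneous power of the time-domain OFDM symbol (the inverse DFT of the sub-carrier symbols) divided by its mean power. A small stdlib-only sketch of that definition, unrelated to the proposed MNKB-RBF kernel:

```python
import cmath
import math

def papr_db(symbols):
    """PAPR of one OFDM symbol: take the inverse DFT of the
    sub-carrier symbols, then return peak instantaneous power over
    mean power, in dB. A plain O(N^2) IDFT is enough for a sketch."""
    n = len(symbols)
    x = [sum(symbols[k] * cmath.exp(2j * cmath.pi * k * t / n)
             for k in range(n)) / n
         for t in range(n)]
    powers = [abs(v) ** 2 for v in x]
    return 10 * math.log10(max(powers) / (sum(powers) / n))

# Worst case: all sub-carriers carry the same symbol, so the IDFT
# concentrates all energy in one sample and PAPR = 10*log10(N).
print(round(papr_db([1] * 8), 3))  # → 9.031
```

SLM exploits exactly this quantity: it generates several phase-rotated versions of the same symbol and transmits the candidate whose PAPR comes out lowest.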
Keywords: |
Orthogonal Frequency Division Multiplexing (OFDM), Peak-to-Average Power
Ratio (PAPR), Selective Mapping (SLM), Artificial Neural Networks (ANN), Radial
Basis Function (RBF) |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
INTRUSION DETECTION SYSTEM USING BOOTSTRAP RESAMPLING APPROACH OF T2 CONTROL
CHART BASED ON SUCCESSIVE DIFFERENCE COVARIANCE MATRIX |
Author: |
MUHAMMAD AHSAN, MUHAMMAD MASHURI, HIDAYATUL KHUSNA |
Abstract: |
The multivariate control chart is an SPC method often used in intrusion
detection. The Hotelling's T2 control chart with the Successive Difference
Covariance Matrix (SDCM) is a robust method that can detect outliers in
process data for individual observations. This method is effective in an
Intrusion Detection System (IDS) because it can detect anomalies or outliers
in the network. A problem arises because the exact distribution of this
statistic has not been determined. Bootstrap, a nonparametric method widely
used to estimate parameters without any distributional assumption, is
applied to overcome this problem. In this research, the Hotelling's T2
control chart is improved using the SDCM, while its control limits are
calculated using the bootstrap resampling method. The proposed method is
applied to an IDS and its performance is compared with other control chart
approaches. The performance evaluation shows that the proposed IDS with
bootstrap control limits performs better than the other control chart
approaches on the testing dataset. Moreover, the proposed IDS outperforms
the other classification methods. |
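The chart's two ingredients, the successive-difference covariance estimate and the bootstrap control limit, can be sketched for the bivariate case. The data, chart constants, and resampling scheme below are illustrative assumptions, not the paper's settings:

```python
import random

def sdcm(data):
    """Successive-difference covariance estimate S = V'V / (2(m-1)),
    where row i of V is x_{i+1} - x_i (bivariate case, plain lists)."""
    m = len(data)
    s = [[0.0, 0.0], [0.0, 0.0]]
    for i in range(m - 1):
        d = [data[i + 1][0] - data[i][0], data[i + 1][1] - data[i][1]]
        for a in range(2):
            for b in range(2):
                s[a][b] += d[a] * d[b]
    return [[s[a][b] / (2 * (m - 1)) for b in range(2)] for a in range(2)]

def t2_values(data):
    """Hotelling's T2 for each observation against the sample mean,
    with the SDCM in place of the usual covariance matrix."""
    m = len(data)
    mean = [sum(r[j] for r in data) / m for j in range(2)]
    s = sdcm(data)
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],
           [-s[1][0] / det, s[0][0] / det]]
    out = []
    for r in data:
        d = [r[0] - mean[0], r[1] - mean[1]]
        out.append(d[0] * (inv[0][0] * d[0] + inv[0][1] * d[1])
                   + d[1] * (inv[1][0] * d[0] + inv[1][1] * d[1]))
    return out

def bootstrap_limit(t2, alpha=0.05, b=2000, seed=1):
    """Control limit: resample the T2 values with replacement, take
    the empirical (1-alpha) quantile of each resample, and use the
    median of those quantiles -- no distributional assumption."""
    rng = random.Random(seed)
    quantiles = []
    for _ in range(b):
        sample = sorted(rng.choice(t2) for _ in t2)
        quantiles.append(sample[int((1 - alpha) * (len(sample) - 1))])
    quantiles.sort()
    return quantiles[len(quantiles) // 2]

rng = random.Random(0)
normal = [[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(100)]
ucl = bootstrap_limit(t2_values(normal))
# A far-out observation (an "attack") lands above the control limit.
print(t2_values(normal + [[8.0, 8.0]])[-1] > ucl)
```

In an IDS the monitored vectors would be per-window traffic features; any observation whose T2 exceeds the bootstrap limit is flagged as anomalous.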
Keywords: |
T2 Control Chart, Successive Difference Covariance Matrix, Kernel Density
Estimation, Bootstrap, Intrusion Detection System |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
MACHINE LEARNING CONFIGURATIONS FOR HUMAN PROTEIN CLASSIFICATION USING SDFES |
Author: |
SUNNY SHARMA, AMRITPAL SINGH, GURVINDER SINGH, RAJINDER SINGH |
Abstract: |
The identification of target proteins for a diseased condition drives the
development of disease-detection recommender systems and drug-discovery
processes whose inhibition can destroy the pathogen. Drug-discovery testing
is done through clinical and pre-clinical observations, first on animals and
then on humans; only then is the discovered drug ready for public use. If
the testing phase does not show suitable results, the entire task must be
repeated, and this repetitive clinical and preclinical experimentation is
very cumbersome. Given the importance of the disease-detection and
drug-discovery phases in protein identification and classification, however,
this task must be done. Advances in computational biology show the value of
computationally predicting protein function, or identifying the target, from
features extracted from the protein sequence. Many approaches have been
proposed to accurately predict human protein functionality, but the task
remains difficult owing to the large and versatile nature of the domain. The
present work addresses it through computational prediction. This paper
develops a model that uses associative rule mining to extract
sequence-derived features on a single platform (SDFES: Sequence-Derived
Feature Extraction Server) from a given human protein sequence; the features
are then critically analyzed with machine learning (ML) approaches using the
data analysis tool WEKA. New sequence-derived features are identified and
incorporated into the data set, and the scope of ML approaches for effective
prediction is examined. Configurations are compared in order to achieve
higher accuracy. In addition to the comparative analysis, the limitations of
the ML approaches are discussed, along with remedies obtained by changing
the configurations. The proposed work helps derive the sequence-extracted
features in a single place and predict the class or function of a protein,
supporting innovation in drug discovery and disease-detection recommender
systems. |
Keywords: |
Protein, Machine Learning, WEKA, Random Forest, Decision Tree. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
AODV CLUSTERING ALGORITHM BASED ON DENSITY AND NODES MOBILITY WITH A MOBILE
REFERENCE |
Author: |
M. SAADOUNE, M. DYABI, A. HAJAMI, H. ALLALI |
Abstract: |
A mobile ad hoc network (MANET) is generally defined as a network with many
free nodes, often composed of mobile devices; the mobility of the nodes
therefore strongly affects the performance of the network. AODV is a routing
protocol for MANETs with low processing and memory overhead and low network
utilization; it works well even in high-mobility situations, but it has
issues in large mobile networks. A clustering architecture provides network
scalability and fault tolerance, and results in more efficient use of
network resources. For these reasons, we propose in this paper two kinds of
clustering algorithms for AODV: a density-based algorithm and a
mobility-based algorithm using a mobile reference. Our objective is to elect
a reduced number of less mobile cluster heads. |
Keywords: |
Ad hoc, Mobility, Localization, Distance, Relative speed, AODV, Metric,
Clustering |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
COMBINATION MATHEMATICAL DISTANCE MEASURE APPROACH FOR SOME IMAGE PROCESSING
APPLICATIONS |
Author: |
SHAHAD ADIL TAHER, HIND RUSTUM MOHAMMED |
Abstract: |
In this paper the researcher presents a combined mathematical distance
measure approach for image encryption, applied to different images. The aim
of the method is to protect images by encrypting them. The proposed method
depends on finding a new distance between image pixels using interpolation.
The study database consists of 40 images of different types (color and
grayscale) and different formats (.jpg, .bmp, .tif, .gif, .png). The
proposed method gave accurate results, stronger encoding of images, and high
encryption ratios, as shown by the performance evaluation factors used to
assess it. MATLAB® 2010 was relied on heavily in this study. |
Keywords: |
Image Processing, Image Encryption, Distance Measure, Performance Analyses Image |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
SYSTEM OF DECISION SUPPORT IN WEAKLY-FORMALIZED PROBLEMS OF TRANSPORT
CYBERSECURITY ENSURING |
Author: |
AKHMETOV B., LAKHNO V. |
Abstract: |
This paper addresses the development of mathematical software for decision
support systems (DSS) for the cyber security of mission-critical information
systems of transport (CRIST), in the poorly structured and
difficult-to-formalize tasks of information security and information risk
assessment. The paper presents a developed decision-support system for
weakly formalized problems of CRIST and for the cyber security of the
industry's informatization objects. The system is based on models describing
information security tasks, risk assessment, and the cyber defense of
transport in conceptual and functional aspects. The article also describes
the process of forming the DSS knowledge base for circumstances related to
the identification of hard-to-explain signs of anomalies and attacks. |
Keywords: |
System Of Support Of Decision-Making, Cyber Security, Poorly Formalized Tasks,
The Interpretation Of The Situation. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
THE IDENTIFICATION OF DATA ANOMALIES FROM INFORMATION SENSORS BASED ON THE
ESTIMATION OF THE CORRELATION DIMENSION OF THE TIME SERIES ATTRACTOR IN
SITUATIONAL MANAGEMENT SYSTEMS |
Author: |
FARIZA BILYALOVNA TEBUEVA, VLADIMIR VYACHESLAVOVICH KOPYTOV, VIACHESLAV
IVANOVICH PETRENKO, ANDREY OLEGOVICH SHULGIN, NIKITA GEORGIEVICH DEMIRTCHEV |
Abstract: |
Purpose: The goal is the timely detection of uncharacteristic behavior of the
observed processes in the systems of situational management, leading to the
development or occurrence of emergency situations. Methodological approach:
In the article, it is proposed to analyze the change dynamics in the correlation
dimension of the attractor in order to detect anomalies in the behavior of the
observed process. A sharp change in the correlation dimension is a reflection of
the uncharacteristic (anomalous) nature of the data of the observed processes.
This anomaly is a consequence of external influences on the generating system
and requires an analysis of the causes of its occurrence. Uniqueness/value:
The uniqueness of the proposed approach consists in the fact that an abrupt
change in the correlation dimension of the attractor is the information about
the occurrence of uncharacteristic behavior of the observed system. The value of
the study is determined by the relevance of the problem of modeling the
development and occurrence of emergency situations in situational management
systems based on the analysis of the time series of observed processes.
Summary: The proposed approach is designed to identify the critical states
of generating dynamical systems from their time series. A timely response to
the monitored system's transition to a critical state makes it possible to
prevent critical consequences. |
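The correlation dimension the authors monitor is commonly estimated via the Grassberger-Procaccia correlation sum; a minimal sketch (the embedding dimension, delay, and radii below are arbitrary choices for illustration, not the paper's settings):

```python
import math

def correlation_sum(series, r, dim=2, delay=1):
    """Grassberger-Procaccia correlation sum C(r): the fraction of
    point pairs, in a delay embedding of the scalar series, that lie
    closer than r to each other."""
    pts = [series[i:i + dim * delay:delay]
           for i in range(len(series) - (dim - 1) * delay)]
    n = len(pts)
    close = 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(pts[i], pts[j]) < r:
                close += 1
    return 2.0 * close / (n * (n - 1))

def correlation_dimension(series, r1, r2):
    """Slope of log C(r) vs log r between two radii. A sharp change
    in this estimate over a sliding window is the anomaly signal."""
    c1, c2 = correlation_sum(series, r1), correlation_sum(series, r2)
    return (math.log(c2) - math.log(c1)) / (math.log(r2) - math.log(r1))

# Points spread along a line give a dimension estimate near 1.
line = [i * 0.01 for i in range(300)]
print(round(correlation_dimension(line, 0.1, 0.5), 2))  # ≈ 0.97
```

For sensor monitoring, one would compute this estimate over a sliding window of the time series and flag windows where the dimension jumps abruptly.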
Keywords: |
Correlation dimension, Attractor, Anomalous behavior, Data of information
sensors. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
WHALE OPTIMIZATION ALGORITHM FOR SOLVING THE MAXIMUM FLOW PROBLEM |
Author: |
RAJA MASADEH, ABDULLAH ALZAQEBAH, AHMAD SHARIEH |
Abstract: |
The Maximum Flow Problem (MFP) is one of several well-known basic problems
on weighted directed graphs [9]. It has many applications in computer
engineering and computer science and has been solved by many techniques.
This study presents a possible solution to the MFP using the Whale
Optimization Algorithm (WOA), one of the most recent optimization
algorithms, proposed in 2016. The experimental results of the "MaxFlow-WO"
algorithm, tested on various datasets, are good evidence that the technique
can solve the MFP and reinforce its performance. It solves the MFP by
clustering the search space, finding the maximum flow for each cluster
(local MF), and combining these into the overall solution (global MF) for
the desired graph. The WOA treats the graph as the search space in which
whales move to reach the prey: the prey is the sink of the network
(represented by the graph), and the other whales are represented by the
other nodes of the graph. |
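The WOA position updates referred to above can be illustrated on a toy continuous problem. This sketch shows only the generic encircling and spiral moves of the 2016 algorithm, not the paper's graph encoding or the MaxFlow-WO clustering; the population size, iteration count, and sphere cost are illustrative assumptions:

```python
import math
import random

def woa_minimize(f, dim, iters=200, whales=20, seed=0):
    """Minimal Whale Optimization Algorithm: whales encircle the best
    solution (the 'prey') with x' = x* - A|C x* - x|, switching to a
    logarithmic-spiral move half the time; A's amplitude shrinks over
    the iterations, moving from exploration to exploitation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(whales)]
    best = min(pop, key=f)[:]
    for t in range(iters):
        a = 2.0 * (1 - t / iters)          # amplitude decays 2 -> 0
        for w in pop:
            if rng.random() < 0.5:          # encircling move
                A = a * (2 * rng.random() - 1)
                C = 2 * rng.random()
                for d in range(dim):
                    w[d] = best[d] - A * abs(C * best[d] - w[d])
            else:                           # spiral bubble-net move
                l = rng.uniform(-1, 1)
                for d in range(dim):
                    w[d] = (abs(best[d] - w[d]) * math.exp(l)
                            * math.cos(2 * math.pi * l) + best[d])
            if f(w) < f(best):
                best = w[:]
    return best, f(best)

sphere = lambda x: sum(v * v for v in x)
pos, val = woa_minimize(sphere, dim=2)
print(val < 0.01)
```

To attack a combinatorial problem like the MFP, a real implementation must additionally map these continuous positions onto augmenting paths or flow assignments in the graph.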
Keywords: |
Whale Optimization, Maximum Flow Problem, Maximum Flow, Meta-Heuristic,
Optimization |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
EVALUATING THE INFLUENCE OF FEATURE SELECTION TECHNIQUES ON MULTI-LABEL TEXT
CLASSIFICATION METHODS USING MEKA |
Author: |
SUSAN KOSHY, DR. R.PADMAJAVALLI |
Abstract: |
Multi-label classification has generated a lot of interest, with useful
applications in real-world situations, as against traditional single-label
classification. Feature selection has a positive effect on the performance
of multi-label classification: it elevates the performance of the learning
algorithms while reducing the storage requirement and the complexity of the
multidimensional space. Many multi-label algorithms handle the problem of
multi-label classification, and the issue of dimensionality is overcome by
feature selection. There are several feature selection techniques, and the
right combination of multi-label classification and multi-label feature
selection helps build an efficient classification model for a given dataset.
This paper uses the available algorithms to evaluate the influence of filter
feature-selection methods on multi-label classification for two standard
text datasets drawn from real domains. Problem-transformation methods
transform the multi-label dataset into single-label ones: Binary Relevance,
Classifier Chains, Pruned Sets, and an ensemble method called RAkEL are used
as multi-label classifiers with two single-label base classifiers, J48 and
Naïve Bayes. Feature selection is followed by multi-label classification.
Five standard techniques, namely correlation feature subset selection,
correlation feature selection, gain ratio, information gain, and ReliefF,
are used to evaluate the relationship between feature selection, multi-label
classification, and the single-label base classifier in order to obtain
enhanced multi-label evaluation metrics. |
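Information gain, one of the five filter criteria listed, scores a feature by how much knowing its value reduces the entropy of the labels; features are then ranked and the top-scoring ones kept before any classifier runs. A self-contained sketch (the feature and label values are invented for illustration):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) of a label list, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG(Y; X) = H(Y) - H(Y|X) for a discrete feature: the drop in
    label entropy once the feature value is known."""
    n = len(labels)
    cond = 0.0
    for v in set(feature):
        subset = [l for f, l in zip(feature, labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

# A feature that determines the label perfectly gains the full
# entropy; a feature independent of the label gains nothing.
labels  = ['spam', 'spam', 'ham', 'ham']
perfect = [1, 1, 0, 0]
useless = [1, 0, 1, 0]
print(information_gain(perfect, labels),
      information_gain(useless, labels))  # → 1.0 0.0
```

For multi-label data, the same score is typically applied after a problem transformation (e.g. per binary-relevance label) and the per-label scores aggregated.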
Keywords: |
Correlation-Based Feature Subset Selection, Correlation Feature Selection,
Multi-Label Text Classification, Gain Ratio, Information Gain, ReliefF. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
SEMANTIC ROLE LABELING OF MALAYALAM WEB DOCUMENTS IN CRICKET DOMAIN |
Author: |
SUNITHA C, DR. A JAYA, AMAL GANESH |
Abstract: |
Document summarization is an ongoing research area in Natural Language
Processing that aims to produce, with NLP tools and techniques, a summary
almost like one generated by a human being. Since the information used
across the digital world is increasing exponentially, automatic
summarization techniques have gained attention, especially abstractive
methods. To produce an effective abstractive summary, the text documents
must first be represented semantically; from this representation, important
sentences are selected using some strategy, and finally the abstractive
summary is generated. Representing natural-language sentences semantically
faces many challenges, and various works have been carried out on extracting
the semantics of sentences. Semantic role labeling is an NLP technique that
detects the semantically related arguments of a predicate or verb in a
sentence and groups them into related roles. This technique can be used to
represent sentences meaningfully and can further be used in applications
such as question answering, information extraction, summarization, and text
categorization. Currently, limited work has been done in Malayalam on
semantic role extraction, and domain-based work gives better results. In
this paper the semantic roles of important words in Malayalam Web documents
pertaining to the cricket domain are identified. |
Keywords: |
Semantic Role Labeling, Karaka relations, Memory Based Learning, Vibhakthi,
Chunking |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
SFT: A MODEL FOR SENTIMENT CLASSIFICATION USING SUPERVISED METHODS ON TWITTER |
Author: |
RAZIEH ASGARNEZHAD, S. AMIRHASSAN MONADJEMI, MOHAMMADREZA SOLTANAGHAEI, AYOUB
BAGHERI |
Abstract: |
Twitter sentiment classification is one of the most popular fields in
information retrieval and text mining. Hundreds of millions of people around
the world intensively use web sites such as Twitter. Twitter, as a
micro-blogging system, allows users to publish tweets telling others what
they are thinking. Many web sites on the Internet already provide a Twitter
sentiment search service: a user can input a sentiment target and search for
tweets containing positive or negative sentiments. As the number of tweets
has grown over the past few years, tweets have attracted more and more
attention, and it is appealing for consumers to research the sentiment of
products automatically before purchase. This paper proposes a novel model
for Twitter sentiment classification. The purpose of this model is to
investigate the role of feature-weighting techniques in sentiment
classification using supervised methods on a Twitter data set. It also
explores binary classification, in which the data set is classified into
positive and negative classes. It is shown that the proposed model can
improve the accuracy of Twitter sentiment classification by 7%. The results
confirm the superiority of the proposed model over state-of-the-art
systems. |
Keywords: |
Sentiment Classification, Support Vector Machine, Supervised Method, Twitter |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
MULTI OBJECTIVE INTEGRATED CROSSOVER BASED PARTICLE SWARM OPTIMIZATION FOR LOAD
BALANCING IN CLOUD DATA CENTERS |
Author: |
NEHA SETHI, SURJIT SINGH, GURVINDER SINGH |
Abstract: |
Cloud computing is becoming more popular day by day due to its wide range of
scalable and dynamic characteristics. The increase in the number of cloud
users leads to an imbalance in resource and data-center utilization and
drastically increases energy consumption, which in turn increases
data-center cost and the waiting time of cloud users. Improving waiting time
and the optimal usage of cloud resources has therefore become a challenging
issue. Many approaches have been designed to balance user load between cloud
data centers; however, the majority of existing load-balancing approaches
suffer from premature convergence, getting stuck in local optima, and poor
convergence. To overcome these issues, this paper proposes a multi-objective
integrated crossover based particle swarm optimization for balancing the
load between cloud data centers. The proposed technique has been designed
and implemented in MATLAB 2013a with the help of the parallel processing
toolbox. Extensive analysis reveals that the proposed technique outperforms
existing techniques in terms of average waiting time, makespan, and degree
of imbalance. Analysis of Variance (ANOVA) based statistical testing has
also been used to evaluate the significance of the improvement achieved by
the proposed technique. |
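The underlying PSO velocity and position updates can be sketched as follows. This is the textbook single-objective, global-best variant; the inertia and acceleration constants and the sphere cost are illustrative stand-ins, not the paper's multi-objective, crossover-integrated formulation:

```python
import random

def pso_minimize(f, dim, particles=20, iters=100, seed=42):
    """Global-best PSO with inertia weight: each particle's velocity
    blends its momentum, a pull toward its personal best position,
    and a pull toward the swarm's best position."""
    rng = random.Random(seed)
    w, c1, c2 = 0.72, 1.49, 1.49  # common constriction-like constants
    x = [[rng.uniform(-5, 5) for _ in range(dim)]
         for _ in range(particles)]
    v = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in x]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]
            if f(x[i]) < f(pbest[i]):
                pbest[i] = x[i][:]
                if f(x[i]) < f(gbest):
                    gbest = x[i][:]
    return gbest, f(gbest)

# Toy stand-in for a load-balance cost over candidate assignments.
sphere = lambda p: sum(c * c for c in p)
best, cost = pso_minimize(sphere, dim=3)
print(cost < 1e-3)
```

A load-balancing variant would encode a task-to-data-center assignment in each particle and evaluate waiting time, makespan, and imbalance as the cost, with the crossover operator injected to fight the premature convergence the abstract mentions.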
Keywords: |
Particle Swarm Optimization, Meta heuristics, Cloud data centers, Load
balancing, Analysis of Variance. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
TRANSMISSION QUALITY AWARE OPTIMUM CHANNEL SCHEDULING (QAOCS) FOR 802.11A WLAN |
Author: |
MACHA SARADA, DR. AVULA DAMODARAM |
Abstract: |
WLAN networks have emerged as the most widely deployed last-stage component
of Internet connectivity for mobile users. The growing number of WLAN
subscribers with Wi-Fi devices is driving traffic in WLANs; accordingly,
WLAN services are prone to severe performance declines amid channel
interference and contention. To address the rising traffic and performance
degradation, a novel channel scheduling strategy is proposed. Unlike most
contemporary models, the proposed model schedules channels based on
multi-objective QoS factors; moreover, it balances load by transmitting
buffered data packets as a transmission window. The contention state of
channel availability is also addressed in this proposal. The research shows
that the recommended algorithm improves throughput, reduces the drop ratio,
and distributes user traffic through optimized channel scheduling. |
Keywords: |
WLAN, 802.11a, 802.11ac, Channel scheduling, Access point, OCA-ITU. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
AN APPROACH FOR DETECTING SYNTAX AND SYNTACTIC AMBIGUITY IN SOFTWARE
REQUIREMENT SPECIFICATION |
Author: |
ALI OLOW JIM’ALE SABRIYE, WAN MOHD NAZMEE WAN ZAINON |
Abstract: |
Software requirements are considered ambiguous if a requirements statement
can have more than one interpretation. Ambiguous requirements can cause
software developers to build software different from what the customer
needs. The focus of this paper is to propose an approach for detecting
syntax and syntactic ambiguity in software requirements specifications. A
part-of-speech (POS) tagging technique is used to detect these ambiguities,
and a prototype tool has been developed to evaluate the proposed approach.
The evaluation compares the detection capabilities of the proposed tool
against human capabilities. The overall results show that humans do have
some difficulty detecting ambiguity in software requirements, especially
syntactic ambiguity and requirements that contain both syntax and syntactic
ambiguity in one sentence. The proposed tool can definitely help the analyst
detect ambiguity in software requirements. |
Keywords: |
Part of speech tagging, Syntax ambiguity, Syntactic ambiguity, Software
requirements specification. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
AN EFFICIENT ARTIFICIAL FISH SWARM ALGORITHM WITH HARMONY SEARCH FOR SCHEDULING
IN FLEXIBLE JOB-SHOP PROBLEM |
Author: |
ISHRAQ F. FAEQ, MEHDI G. DUAIMI, AHMED T. SADIQ AL-OBAIDI |
Abstract: |
The flexible job-shop scheduling problem (FJSP) is one of the problem
instances arising in flexible manufacturing systems. It is very complex to
control, so generating a control system for this problem domain is
difficult. FJSP inherits the characteristics of the job-shop scheduling
problem and adds a decision level to the sequencing one, allowing each
operation to be processed on any machine among a set of available machines
at a facility. In this article, we present an Artificial Fish Swarm
Algorithm with Harmony Search for solving the flexible job-shop scheduling
problem. It is based on improvising a new harmony from the results obtained
by the artificial fish swarm algorithm. The improvised solution is compared
to the overall best solution; when it is the better one, it replaces the
artificial fish swarm solution from which it was improvised, and the best
improvised solutions are carried over to the Harmony Memory. The objective
is to minimize the total completion time (makespan) and to make the proposed
approach part of an expert, intelligent scheduling system for
remanufacturing decision support. Harmony search has proved to be an
efficient, simple, and strong optimization algorithm, and exploration
ability is one of the key points of any optimization algorithm. The obtained
optimization results show that the proposed algorithm provides better
exploitation ability and fast convergence to the optimum solution;
comparisons with the original artificial fish swarm algorithm also
demonstrate improved efficiency. |
Keywords: |
Artificial Fish Swarm Algorithm; Harmony Search; Makespan; Flexible Job-Shop
Scheduling Problem. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
IMAGE PREPROCESSING OF ABDOMINAL CT SCAN TO IMPROVE VISIBILITY OF ANY LESIONS IN
KIDNEYS |
Author: |
HIMA BINDU G, PRASAD REDDY PVGD, M. RAMAKRISHNA MURTY |
Abstract: |
Abdominal CT scan images are widely used in the detection of kidney lesions.
This study pre-processes abdominal CT scan images so as to segment the kidney
for further lesion-detection analysis. Various noise filters and segmentation
techniques were evaluated to select the best filter and segmentation technique
for pre-processing the CT image. The experimental study finds the combination of
a Median filter followed by a Wiener filter most effective at removing the
different noises present in CT images. Different segmentation techniques were
run on the test data set of CT images, and it was observed that edge-based
active contours produced better results than Graph Cut and region-based active
contours. |
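The median-then-Wiener cascade reported here can be sketched with standard SciPy calls (assuming NumPy and SciPy are available); the synthetic gradient image, noise levels and kernel sizes are stand-ins, not the study's DICOM data or settings:

```python
import numpy as np
from scipy.signal import medfilt2d, wiener

rng = np.random.default_rng(0)

# Stand-in for a CT slice: a smooth gradient corrupted with salt-and-pepper
# impulses plus Gaussian noise (the real input would be a DICOM slice).
clean = np.tile(np.linspace(0.2, 0.8, 64), (64, 1))
noisy = clean + rng.normal(0, 0.05, clean.shape)
mask = rng.random(clean.shape) < 0.05
noisy[mask] = rng.choice([0.0, 1.0], mask.sum())   # impulse noise

# Median filter knocks out the impulses, then Wiener suppresses the
# remaining Gaussian component -- the combination the abstract reports.
step1 = medfilt2d(noisy, kernel_size=3)
denoised = wiener(step1, mysize=5)

mse_before = float(np.mean((noisy - clean) ** 2))
mse_after = float(np.mean((denoised - clean) ** 2))
```

The order matters: applying Wiener first would smear the impulses into their neighborhoods before the median filter could reject them.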
Keywords: |
Medical Imaging, Noise Filters, Segmentation, Active Contour, Region-based,
Edge-based, Graph Cut. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
IMAGE TAMPER DETECTION AND RECOVERY USING LIFTING SCHEME-BASED FRAGILE
WATERMARKING |
Author: |
TAHA BASHEER TAHA, RUZELITA NGADIRAN, PHAKLEN EHKAN, MOHAMAD T. SULTAN |
Abstract: |
The high prevalence of digital images has generated great interest in
authenticity and integrity investigation. In many cases, highly accurate image
tamper detection techniques involve high computational overhead, in addition to
the complexity of recovering the original image. In this work, blind image
tamper detection and self-recovery are presented using the Lifting Scheme, which
is characterized by simplicity and integer-based calculations, together with LSB
modification. Experimental results reveal that the proposed model performs well
in detecting and recovering from different types of tampering, such as removal
and cloning. Furthermore, the produced watermarked images have very acceptable
perceptual quality in terms of both subjective and objective evaluations. |
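The two ingredients named in the abstract, an integer lifting transform and LSB modification, can be illustrated with a Haar lifting step and one check bit per coefficient; the block structure, watermark payload and recovery data of the actual paper are not reproduced here:

```python
def haar_lift(pixels):
    """Integer Haar lifting: split into (approx, detail) with an exact inverse."""
    approx, detail = [], []
    for a, b in zip(pixels[::2], pixels[1::2]):
        d = b - a                 # predict step
        s = a + (d >> 1)          # update step (integer-only arithmetic)
        approx.append(s)
        detail.append(d)
    return approx, detail

def haar_unlift(approx, detail):
    out = []
    for s, d in zip(approx, detail):
        a = s - (d >> 1)
        out += [a, a + d]
    return out

def embed_lsb(values, bits):
    """Fragile watermark: force each value's LSB to the watermark bit."""
    return [(v & ~1) | b for v, b in zip(values, bits)]

row = [52, 55, 61, 66, 70, 61, 64, 73]
approx, detail = haar_lift(row)
assert haar_unlift(approx, detail) == row      # lifting is perfectly invertible

wm = [1, 0, 1, 1]
marked = embed_lsb(approx, wm)
extracted = [v & 1 for v in marked]            # verification re-reads the LSBs
tampered = [v + 3 for v in marked]             # simulate tampering (flips LSBs)
```

Integer lifting is what makes the scheme cheap: both directions use only additions and bit shifts, so embedding and recovery introduce no rounding loss.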
Keywords: |
Lifting Scheme, LWT, Image Tamper Detection, Image Recovery, Fragile
Watermarking. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
A CHAOS WITH DISCRETE MULTI-OBJECTIVE PARTICLE SWARM OPTIMIZATION FOR PAVEMENT
MAINTENANCE |
Author: |
KAWTHER AHMED, BELAL AL-KHATEEB, MAHER MAHMOOD |
Abstract: |
Particle Swarm Optimization (PSO) is a very popular technique in swarm
intelligence. PSO has been applied to solve many problems that have single or
multiple objectives. In fact, multi-objective optimization problems in real life
are combinatorial in nature. Therefore, PSO has been developed to handle a large
number of decision variables and reduce computational complexity. In this paper,
a chaos multi-objective PSO algorithm is developed for solving discrete (binary)
optimization problems. The developed algorithm is applied to the pavement
management problem to find an optimal maintenance and rehabilitation plan for
flexible pavement with maximum pavement condition and minimum maintenance cost.
The results show a significant improvement in the solutions satisfying the
pavement condition and maintenance cost objectives. The developed algorithm
requires only a very short execution time to reach a very good solution. In
addition, it is found to converge to the solution faster than another PSO
algorithm. |
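A minimal binary PSO with a chaotic (logistic-map) sequence standing in for the usual uniform random draws might look as follows; the toy benefit/cost objective, the scalarization, and every coefficient are illustrative assumptions rather than the paper's pavement model:

```python
import math, random

random.seed(1)

def logistic(x, r=4.0):
    """Chaotic logistic map used in place of uniform random draws."""
    return r * x * (1.0 - x)

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# Toy binary problem: choose maintenance actions (bits) maximizing condition
# benefit minus cost -- stand-ins for the two pavement objectives, scalarized.
BENEFIT = [4, 1, 6, 2, 5]
COST    = [2, 2, 3, 1, 4]

def fitness(bits):
    return sum(b * (g - c) for b, g, c in zip(bits, BENEFIT, COST))

N, DIM = 8, 5
chaos = 0.37                           # chaotic sequence state
pos = [[random.randint(0, 1) for _ in range(DIM)] for _ in range(N)]
vel = [[0.0] * DIM for _ in range(N)]
pbest = [p[:] for p in pos]
gbest = max(pos, key=fitness)[:]

for _ in range(60):
    for i in range(N):
        for d in range(DIM):
            chaos = logistic(chaos)    # chaotic coefficient instead of rand()
            vel[i][d] += 2.0 * chaos * (pbest[i][d] - pos[i][d]) \
                       + 2.0 * (1 - chaos) * (gbest[d] - pos[i][d])
            # binary PSO: sigmoid of velocity gives the bit-set probability
            chaos = logistic(chaos)
            pos[i][d] = 1 if chaos < sigmoid(vel[i][d]) else 0
        if fitness(pos[i]) > fitness(pbest[i]):
            pbest[i] = pos[i][:]
        if fitness(pos[i]) > fitness(gbest):
            gbest = pos[i][:]
```

The chaotic map's dense, non-repeating coverage of (0, 1) is what proponents credit for the faster convergence reported in the abstract.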
Keywords: |
Multi-Objective Optimization, Pavement Maintenance, Particle Swarm Optimization,
Chaotic Mapping, Binary PSO. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
ARABIC NEWS CREDIBILITY ON TWITTER: AN ENHANCED MODEL USING HYBRID FEATURES |
Author: |
SAHAR F. SABBEH, SUMAIA Y. BAATWAH |
Abstract: |
Recently, social media, and Twitter in particular, has become a main source of
news consumption and sharing among millions of users. These platforms enable
users to author, publish and share content. Such environments can be used to
publish and spread rumors and fake news, whether unintentionally or maliciously.
That is why the credibility of information on such platforms has been
increasingly investigated in many domains (i.e. information sciences,
psychology, sociology, etc.). This paper proposes a machine learning-based model
for assessing the credibility of Arabic news on Twitter. It uses a hybrid set of
topic- and user-related features to evaluate news credibility. In addition to
the traditional content-related features, content verifiability and polarity
analysis of users' replies are used for a more accurate assessment. The proposed
model consists of four main modules: a) a content parsing and feature extraction
module, b) a content verification module, c) a users' comments polarity
evaluation module and d) a credibility classification module. A data set of 800
manually labeled Arabic news items was collected from Twitter. Three different
classification techniques were applied (decision tree, support vector machine
(SVM) and Naive Bayes (NB)). For model training and testing, 5-fold
cross-validation was performed and performance diagnostics were calculated.
Results indicate that the decision tree achieves a TPR around 2% higher than SVM
and 7% higher than NB, and an FPR almost 9% lower than SVM and 10% lower than
NB. For precision, recall, f-measure and accuracy, the decision tree achieves
almost 2% higher than SVM and 7% higher than NB on the tested data set.
Experiments also revealed that the proposed system achieves accuracy that
outperforms the system proposed by Hend et al. [29] and TweetCred [2]. |
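The performance diagnostics quoted in this abstract (TPR, FPR, precision, recall, F-measure, accuracy) follow the standard confusion-matrix definitions; a self-contained sketch with made-up labels, not the paper's 800-item data set:

```python
def diagnostics(y_true, y_pred):
    """Confusion-matrix metrics for a binary credible/not-credible labeling."""
    tp = sum(t == p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tpr = tp / (tp + fn)               # true positive rate (= recall)
    fpr = fp / (fp + tn)               # false positive rate
    precision = tp / (tp + fp)
    f1 = 2 * precision * tpr / (precision + tpr)
    accuracy = (tp + tn) / len(y_true)
    return {"TPR": tpr, "FPR": fpr, "precision": precision,
            "recall": tpr, "F1": f1, "accuracy": accuracy}

# Toy labels: 1 = credible, 0 = not credible (illustrative only)
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
y_pred = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]
m = diagnostics(y_true, y_pred)
```

Reporting TPR and FPR together, as the abstract does, guards against a classifier that inflates recall simply by labeling everything credible.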
Keywords: |
News Credibility, Arabic News, Machine Learning, Twitter, Verifiability, Text
Polarity |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
SYSTEM COUPLING AND COHESION REQUIREMENTS MODEL (SC2RM): MEASUREMENT APPROACH
FOR REAL TIME SYSTEM |
Author: |
KHALED ALMAKADMEH |
Abstract: |
One of the main challenges for software development organizations is to build
software systems with measured complexity. Monitoring a software system's
complexity helps software engineers throughout the phases of the system
development life cycle, for instance with respect to software system reusability
and maintainability. A key measure of software complexity is the degree of
cohesion and coupling within and between its components. The literature
emphasizes that a key system element for measuring the degree of cohesion and
coupling is the number of interactions between software components. This paper
proposes a new model to measure the degree of cohesion and coupling within and
between real-time system components based on the ISO 19761 international
standard. A case study is conducted to verify the applicability of the proposed
measurement model using structural specifications and First Class Relation. The
resulting measures are valuable indicators of a software system's complexity,
which directly affects its reusability and maintainability. |
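Counting interactions within and between components, the measurement basis this abstract highlights, can be sketched as follows; the component map, function names, and the simple cohesion ratio are hypothetical stand-ins, not the SC2RM model itself:

```python
# Interactions as (caller, callee) pairs; components group the functions.
components = {
    "sensor":  {"read_adc", "filter_signal"},
    "control": {"pid_step", "actuate"},
}
interactions = [
    ("read_adc", "filter_signal"),   # intra-component (counts toward cohesion)
    ("filter_signal", "pid_step"),   # inter-component (counts toward coupling)
    ("pid_step", "actuate"),         # intra-component
    ("pid_step", "read_adc"),        # inter-component
]

def owner(fn):
    """Component that owns a function."""
    return next(c for c, fns in components.items() if fn in fns)

# coupling: interactions crossing a component boundary
coupling = sum(owner(a) != owner(b) for a, b in interactions)

# cohesion: internal interactions per component, normalized by its size
intra = {c: sum(owner(a) == owner(b) == c for a, b in interactions)
         for c in components}
cohesion = {c: intra[c] / len(fns) for c, fns in components.items()}
```

High cohesion with low coupling is the target: the two counts move in opposite directions as functionality is regrouped across component boundaries.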
Keywords: |
Coupling; Cohesion; Software Measurement; Real-time System, ISO19761. |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
COMPUTER-BASED PORTFOLIO ASSESSMENT TO ENHANCE STUDENTS’ SELF-REGULATED LEARNING |
Author: |
GUSTI AYU MAHAYUKTI, NYOMAN DANTES, I MADE CANDIASA, ANAK AGUNG ISTRI NGURAH
MARHAENI, I NYOMAN GITA, DEWA GEDE HENDRA DIVAYANA |
Abstract: |
The aim of this study is to elaborate the impact of computer-based portfolio
assessment on students' self-regulated learning. The population of this research
was the students of the Mathematics Education Department, Universitas Pendidikan
Ganesha, who took the Integral Calculus course in the 2015/2016 academic year.
From those, 88 students were chosen randomly as the sample. A Pretest-Posttest
Control Group Design was employed as the method of the present study. The data
were gathered from a self-regulated learning questionnaire with 19 positive and
18 negative statements. The collected data were analyzed using descriptive and
inferential techniques, the latter using a t-test. The results showed an
improvement in the students' self-regulated learning after following the course
with computer-based portfolio assessment. The finding implies that
computer-based portfolio assessment in the classroom works better in a
team-teaching format. |
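The inferential step named here, a t-test on pretest-posttest questionnaire scores, reduces to the paired t-statistic below; the scores are invented for illustration and are not the study's data:

```python
import math

def paired_t(pre, post):
    """Paired t-statistic for pretest-posttest scores (df = n - 1)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

# Illustrative questionnaire totals for 8 students (not the study's 88)
pre  = [62, 70, 65, 58, 74, 61, 69, 66]
post = [68, 75, 66, 64, 79, 65, 72, 71]
t = paired_t(pre, post)
# here t is well above the two-tailed 5% critical value for df = 7 (about 2.365),
# so these toy data would indicate a significant gain
```

Pairing each student's two scores removes between-student variability, which is why the design calls for the paired rather than the independent-samples test.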
Keywords: |
Computer-Based Portfolio Assessment, Self-Regulated Learning |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
NON-CONVEX ECONOMIC LOAD DISPATCH PROBLEMS USING NOVEL BAT ALGORITHM |
Author: |
HARDIANSYAH |
Abstract: |
In this paper, a novel bat algorithm (NBA) is proposed for solving non-convex
economic load dispatch (ELD) problems so as to minimize the total generation
cost while considering linear and non-linear constraints. The proposed algorithm
incorporates the bats' habitat selection and their self-adaptive compensation
for Doppler effects in echoes into the basic bat algorithm (BA). The selection
of the bats' habitat is modeled as a choice between their quantum behaviors and
mechanical behaviors. Many non-linear characteristics of the power generators
and practical constraints, such as power loss, ramp rate limits, prohibited
operating zones and valve-point effects, are considered. The effectiveness and
feasibility of the proposed method are demonstrated on two real power systems
and compared with other optimization algorithms reported in the literature. |
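The non-convexity comes from the valve-point term, which is commonly modeled as a rectified sine added to the quadratic fuel cost; a sketch of that objective with invented coefficients (not the paper's test-system data):

```python
import math

# Per-unit cost coefficients (a, b, c), valve-point terms (e, f) and limits.
# These numbers are illustrative, not the paper's two real power systems.
UNITS = [
    #  a     b      c       e     f     Pmin  Pmax
    (150, 2.00, 0.0016, 50, 0.063, 100, 600),
    (120, 1.75, 0.0100, 40, 0.084,  50, 200),
]

def fuel_cost(unit, p):
    a, b, c, e, f, pmin, _ = unit
    # quadratic cost plus the rectified-sine valve-point ripple, the term
    # that makes the cost surface non-convex and multimodal
    return a + b * p + c * p * p + abs(e * math.sin(f * (pmin - p)))

def total_cost(dispatch):
    return sum(fuel_cost(u, p) for u, p in zip(UNITS, dispatch))

DEMAND = 500.0
dispatch = [380.0, 120.0]            # candidate solution (meets power balance)
assert abs(sum(dispatch) - DEMAND) < 1e-9
cost = total_cost(dispatch)
```

Because the ripple has many local minima, gradient-based dispatch methods stall on it; that is the motivation for metaheuristics such as the NBA.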
Keywords: |
Novel Bat Algorithm, Non-Convex Economic Load Dispatch, Ramp Rate Limits,
Prohibited Operating Zones, Valve-Point Effects |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|
Title: |
AN EFFICIENT DIGITAL VIDEO WATERMARKING SYSTEM ROBUST AGAINST VARIOUS SPATIAL,
TEMPORAL, and SPATIO-TEMPORAL ATTACKS |
Author: |
MANISH K THAKUR |
Abstract: |
For many years, information hiding techniques have played major roles in
achieving robustness against various malicious attacks on digital multimedia
data across different application areas. The introduction of digital
watermarking has given these practices a new direction in the field of
information hiding, protecting copyrighted multimedia data against various
attacks. In this paper, we propose a generalised watermarking system which
efficiently identifies various attacks in the spatial, temporal, and
spatio-temporal domains, viz. frame deletion, frame swapping, frame copying,
etc. Based on the timing information of each frame of a video, the proposed
scheme first generates unique watermarks (representing the timing information)
in real time and then embeds the auto-generated watermark into 8 x 8 discrete
cosine transform blocks of the corresponding frames. When extracting the
watermarks from each frame of the watermarked video, any alteration in the
sequence of the retrieved watermarks indicates a temporal attack (frame
deletion, etc.) on the copyrighted video. Experiments have been conducted on
self-captured videos to analyze the performance of the proposed watermarking
model. Performance has been measured in terms of the recall rate, i.e. the
ability of the proposed watermarking model to correctly detect attacks in
videos, and the quality of the watermarked videos and extracted watermarks using
the Peak Signal to Noise Ratio (PSNR) quality metric. Experimental results show
a 100% recall rate in the detection of temporal attacks if videos are
manipulated in the temporal domain only. The achieved recall rate is in the
range of 86% to 91% if videos are manipulated in the spatio-temporal domain, and
93% if videos are spatially manipulated. |
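The detection half of the scheme, checking the sequence of extracted timing watermarks, can be sketched as follows; the DCT embedding step is omitted, and the classification rules (gap means deletion, repeat means copy, decrease means swap) are a simplified reading of the abstract, not the authors' exact logic:

```python
def detect_temporal_attacks(extracted):
    """Given per-frame timing watermarks, flag deletion, copying and swapping."""
    attacks = []
    for i in range(1, len(extracted)):
        prev, cur = extracted[i - 1], extracted[i]
        if cur == prev:
            attacks.append(("copy", i))       # same timestamp appears twice
        elif cur < prev:
            attacks.append(("swap", i))       # order reversed
        elif cur - prev > 1:
            attacks.append(("delete", i))     # gap in the timestamp sequence
    return attacks

original = [0, 1, 2, 3, 4, 5, 6, 7]           # intact watermark sequence
tampered = [0, 1, 3, 4, 4, 5, 7, 6]           # frame 2 deleted, 4 copied, 6/7 swapped
report = detect_temporal_attacks(tampered)
```

Note that an adjacent swap shows up as a gap followed by a decrease, so the detector flags both positions; a real system would merge such pairs into a single swap event.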
Keywords: |
Watermarking, Spatio-temporal attacks, Frame deletion, Frame swapping, Frame
Copying |
Source: |
Journal of Theoretical and Applied Information Technology
30th April 2018 -- Vol. 96. No. 8 -- 2018 |
Full
Text |
|