|
Submit Paper / Call for Papers
The journal receives papers in a continuous flow, and we will consider articles
from a wide range of Information Technology disciplines, encompassing the most
basic research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please remember to include all your personally identifiable information in the
manuscript before submitting it for review; we will redact the necessary
information on our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
|
|
|
Journal of Theoretical and Applied Information Technology
October 2016 | Vol. 92 No.2 |
Title: |
STUDIES ON IMPROVING TEXTURE SEGMENTATION PERFORMANCE USING GENERALIZED GAUSSIAN
MIXTURE MODEL INTEGRATING DCT AND LBP |
Author: |
K. NAVEEN KUMAR, K.SRINIVASA RAO, Y.SRINIVAS, CH. SATYANARAYANA |
Abstract: |
This paper addresses the performance evaluation of texture segmentation
integrating DCT with LBP. In this method, the whole image is first converted
into the local binary pattern domain. The LBP image is then divided into
non-overlapping blocks, and from each block the DCT coefficients are selected in
a zig-zag pattern. Assuming the feature vectors follow a multivariate
generalized Gaussian mixture model, the model parameters are estimated using the
EM algorithm. The initialisation of the model parameters is carried out using
the moment method of estimation together with a hierarchical clustering
algorithm. The texture segmentation algorithm is developed under a Bayesian
framework with component maximum likelihood. The performance of the proposed
algorithm is evaluated using measures such as GCE, PRI and VOI on randomly
selected images from the Brodatz database. It is observed that this algorithm
outperforms existing texture segmentation algorithms with respect to these
measures. |
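As a rough illustration of the feature-extraction pipeline the abstract describes (LBP transform, non-overlapping blocks, zig-zag DCT selection), the sketch below assumes 8x8 blocks, 10 retained coefficients per block and a basic 3x3 LBP operator; none of these settings are taken from the paper:

```python
import numpy as np

def lbp_image(img):
    """Basic 3x3 local binary pattern: each interior pixel is encoded by
    comparing its 8 neighbours against the centre value."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    out = np.zeros((h - 2, w - 2), dtype=np.uint8)
    centre = img[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offsets):
        nbr = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        out |= ((nbr >= centre).astype(np.uint8) << bit)
    return out

def dct2(block):
    """2-D DCT-II built from the 1-D transform matrix."""
    n = block.shape[0]
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    C = np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    return C @ block @ C.T

def zigzag_coeffs(block, n_keep):
    """Return the first n_keep DCT coefficients in a diagonal zig-zag order
    (one common variant; the paper's exact ordering is not specified here)."""
    n = block.shape[0]
    idx = sorted(((i, j) for i in range(n) for j in range(n)),
                 key=lambda p: (p[0] + p[1], p[1] if (p[0] + p[1]) % 2 else p[0]))
    return np.array([block[i, j] for i, j in idx[:n_keep]])

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (34, 34)).astype(np.int32)
lbp = lbp_image(img)                      # 32x32 LBP image
features = [zigzag_coeffs(dct2(lbp[r:r + 8, c:c + 8].astype(float)), 10)
            for r in range(0, 32, 8) for c in range(0, 32, 8)]
# 16 non-overlapping 8x8 blocks -> 16 feature vectors of length 10
```

The resulting feature vectors would then feed the generalized Gaussian mixture / EM stage, which is not reproduced here.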
Keywords: |
Texture Segmentation, Multivariate Generalized Gaussian Mixture Model,
Performance Measures, Local Binary Patterns, DCT Coefficients. |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
SEAT BELT DESIGN BY GRAPHIC APPROACH |
Author: |
FRIH ABDERRAHIM, CHALH ZAKARIA, MRABTI MOSTAFA, OUAHI MOHAMED |
Abstract: |
An observer can reconstruct or estimate in real time the current state of a real
system using the available measurements, without prior knowledge of the initial
conditions, in order to have complete and accurate information on the system
status. In addition, it allows estimating the unmeasurable states of a system
and replacing expensive or difficult-to-maintain sensors, such as a slip sensor.
To this end, we design a full-order Luenberger observer for a linear system,
using a graphical approach to model the system so that it is easy to handle
thereafter; the causal properties and structural analysis of the bond graph tool
are used for the synthesis of the observer. As a result, the calculation of the
Luenberger observer gains is based on graphical procedures. A comparison between
the measured and estimated values demonstrates the importance and efficiency of
the proposed approach. Simulation tests on a seat belt design show the
performance of the bond-graph-based Luenberger observer in measuring and
estimating the unmeasurable states. |
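The full-order Luenberger observer at the heart of the abstract can be illustrated on a toy discrete-time linear system; the matrices and the gain L below are illustrative (L is chosen so the eigenvalues of A - LC lie inside the unit circle) and are not the seat-belt model of the paper:

```python
import numpy as np

# Discrete-time plant: x[k+1] = A x[k] + B u[k], y[k] = C x[k].
A = np.array([[1.0, 0.1],
              [0.0, 0.9]])
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])     # only the first state is measured
L = np.array([[0.5], [0.4]])   # assumed gain: eig(A - LC) = 0.7 (double), stable

x = np.array([[1.0], [-1.0]])  # true initial state, unknown to the observer
xh = np.zeros((2, 1))          # observer starts from zero

for k in range(200):
    u = np.array([[1.0]])
    y = C @ x
    # Luenberger update: a copy of the plant corrected by the output error
    xh = A @ xh + B @ u + L @ (y - C @ xh)
    x = A @ x + B @ u

err = np.linalg.norm(x - xh)   # estimation error decays to zero
```

The second (unmeasured) state is recovered even though only the first state appears in y, which is exactly the "estimating the unmeasurable states" property described above.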
Keywords: |
Bond Graph, Luenberger Observer, Belt Field, Performance, Estimate. |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
AN EFFICIENT WEB USAGE MINING ALGORITHM BASED ON LOG FILE DATA |
Author: |
TAWFIQ A. AL-ASDI, AHMED J. OBAID |
Abstract: |
Information on the Internet, and especially in the website environment, is
increasing rapidly day by day and has become very large; this information plays
an important role in discovering various kinds of knowledge on the Web. Web
Usage Mining is the category of Web Mining concerned with discovering and
analysing useful information regarding link prediction, users' navigation,
customers' behaviour, site reorganization, web personalization and frequent
access patterns from the large amounts of web data logged on the Web server side
and stored in a standard text format called a log file, or Web usage data; this
data can also be collected from an organization's database, such as NASA's. Web
Usage Mining is the process of applying data mining techniques and applications
to analyse and discover interesting knowledge from the Web. There are several
existing research works on log file mining; some concern web site structure,
traversal pattern mining, association rule mining, Web page classification, and
general statistics such as the amount of time spent on a page. In this paper we
focus on mining the contents of the different segments of Web log data entries
in order to discover hidden information and interesting browsing content, and
then apply a clustering algorithm to find groups of Web sites that share common
browsing content. |
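The "segments" of a log entry referred to above can be illustrated with a Common Log Format parser (the format used, for example, by the NASA server logs mentioned in the abstract); the sample lines and field names below are illustrative, not taken from the paper:

```python
import re
from collections import Counter

# Hypothetical log entries in the Common Log Format.
LOG = """\
host1 - - [01/Jul/1995:00:00:01 -0400] "GET /shuttle/countdown/ HTTP/1.0" 200 3985
host2 - - [01/Jul/1995:00:00:06 -0400] "GET /images/logo.gif HTTP/1.0" 200 786
host1 - - [01/Jul/1995:00:00:09 -0400] "GET /shuttle/missions/ HTTP/1.0" 200 4085
"""

# One named group per log-entry segment: host, timestamp, method, URL, status, size.
PATTERN = re.compile(r'(?P<host>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
                     r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
                     r'(?P<status>\d+) (?P<size>\S+)')

def parse(log_text):
    """Split each raw entry into its segments, skipping malformed lines."""
    return [m.groupdict() for m in map(PATTERN.match, log_text.splitlines()) if m]

entries = parse(LOG)
pages_per_host = Counter(e["host"] for e in entries)  # toy per-visitor feature
```

Per-host (or per-site) counts like these are the kind of feature vector a clustering step such as k-means would then group.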
Keywords: |
Web Mining, Web Usage Mining, Log File Analysis, Clustering, K-means, System
Monitoring. |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
PROVIDING A SOLUTION TO IMPROVE PRE-COPY METHOD FOR MIGRATING VIRTUAL MACHINES
IN CLOUD INFRASTRUCTURE |
Author: |
PARVIN AHMADI DOVAL AMIRI, SHAYAN ZAMANI RAD, FARAMARZ SAFI ISFAHANI |
Abstract: |
Cloud computing can be defined as a new computing model which offers solutions
for providing information technology services analogous to utility services such
as electric power and telephony. Thanks to virtualization technology, most cloud
datacenters use it for performance improvement. This technology transforms a
physical server into several virtual machines. Owing to this property, virtual
machines can be transferred from one place to another, which is called
migration. Pre-copy is one of the main migration techniques. An important
problem in this method is that when memory pages are modified (dirty pages)
faster than pages are sent to the destination, the migration takes a long time,
which is often the case when transferring virtual machines. In this paper, a
solution is proposed that adjusts the speed of virtual machines via their CPU
frequency and balances senders and receivers in the pre-copy algorithm. The
results show that the proposed method performs better than the pre-copy method,
decreasing the total migration time by around 25% and the total amount of
transferred data by around 47%. |
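The dirty-page problem described above can be sketched with a toy pre-copy model; the page counts, rates and stop condition are illustrative, and slowing the dirtying rate (the effect of lowering the VM's CPU frequency) is the lever the proposed method exploits:

```python
def precopy_rounds(mem_pages, dirty_rate, send_rate, stop_threshold=50, max_rounds=30):
    """Toy pre-copy model: each round resends the pages dirtied while the
    previous round was being transferred; stop when the remaining set is small."""
    to_send, total_sent, rounds = mem_pages, 0, 0
    while to_send > stop_threshold and rounds < max_rounds:
        total_sent += to_send
        seconds = to_send / send_rate                        # time to push this round
        to_send = min(mem_pages, int(seconds * dirty_rate))  # pages dirtied meanwhile
        rounds += 1
    return rounds, total_sent, to_send  # leftovers go to the stop-and-copy phase

# Slowing the VM (lower dirty rate) shrinks both rounds and transferred volume:
fast = precopy_rounds(100_000, dirty_rate=4_000, send_rate=10_000)
slow = precopy_rounds(100_000, dirty_rate=1_000, send_rate=10_000)
```

When the dirty rate approaches the send rate, the loop barely converges at all, which is the pathological case the abstract describes.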
Keywords: |
Cloud Computing, Migration, Virtualization, Virtual Machine |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
DISCRETE RISK MODELS OF THE PROCESS OF VIRAL EPIDEMICS DEVELOPMENT IN HOMOGENOUS
INFORMATION AND TELECOMMUNICATION NETWORKS |
Author: |
ELENA NIKOLAEVNA PONOMARENKO, VERA NIKOLAYEVNA KOSTROVA, RUSLAN KALANDAROVICH
BABADZHANOV, YURIY NIKOLAEVICH GUZEV, VLADIMIR SERGEYEVICH ZARUBIN |
Abstract: |
The object of this research is homogeneous networks, i.e., networks with small
fluctuations in vertex degree. In other words, in such homogeneous structures it
is assumed that k ≅ 〈k〉, where the angle brackets denote averaging over the
degree distribution. These structures also have a very specific practical
application: typically, such networks are required for stringent communication
within a corporation. This network organization is particularly relevant to the
information and telecommunication networks (ITN) of critically important
objects. In this paper, models such as SI, SIS, SEIS, SIR, SEIR and MSEIR are
distinguished for the case when a malware attack is performed by a network virus
which spreads by exploiting a vulnerability in the network services of the
operating system. The model synthesis is performed under the conditions that ITN
contacts can be represented by a complete graph and that the epidemic occurs in
a closed ITN. The features of the process of malware transmission from one
computer to another, as well as the internal features of malware execution on a
computer, are ignored. For each model, analytical expressions for epidemic
resistance were obtained. The prospects of using the proposed models in network
warfare were outlined. |
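The simplest of the listed compartment models, SIR on a fully mixed (complete-graph) population, can be sketched with a forward-Euler iteration; the rates beta and gamma and the initial fractions are illustrative:

```python
def sir_step(s, i, r, beta, gamma):
    """One Euler step (dt = 1) of the classic SIR model in fractions, s + i + r = 1:
    ds/dt = -beta*s*i, di/dt = beta*s*i - gamma*i, dr/dt = gamma*i."""
    new_inf = beta * s * i   # susceptible hosts infected this step
    rec = gamma * i          # infected hosts removed (cleaned) this step
    return s - new_inf, i + new_inf - rec, r + rec

s, i, r = 0.99, 0.01, 0.0    # one infected host per hundred at t = 0
peak = i
for _ in range(500):
    s, i, r = sir_step(s, i, r, beta=0.4, gamma=0.1)
    peak = max(peak, i)      # height of the epidemic wave
```

The SIS, SEIR and related variants mentioned in the abstract differ only in which compartments exist and how hosts move between them.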
Keywords: |
Epidemic, Risk, Epidemic Resistance, Mathematical Expectation, Mode, Standard
Deviation (SD), Information And Telecommunication Network (ITN). |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
ON THE IDENTIFICATION OF THE STRUCTURAL PATTERN OF TERMS OCCURRENCE IN A
DOCUMENT USING BAYESIAN NETWORK |
Author: |
SOEHARDJOEPRI, NUR IRIAWAN, BRODJOL SUTIJO SU, IRHAMAH |
Abstract: |
The pattern of a text document is strongly influenced by the appearance of the
first term composing the term structure of each sentence. When two documents
have the same pattern, the second and following terms tend to be the same. This
paper creates a special tool for detecting the similarity of the structural
pattern of two text documents. Latent Semantic Analysis (LSA) coupled with a
Bayesian Network (BN) is employed as the main engine of the algorithms. These
approaches are demonstrated by detecting similarities in the appearance of terms
in the sentences of arbitrary text documents. |
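The LSA half of the engine can be sketched as a truncated SVD of a term-frequency matrix; the toy documents and the choice k = 2 are illustrative, and the Bayesian Network stage is not reproduced here:

```python
import numpy as np

docs = ["the cat sat on the mat",
        "the cat sat on the rug",
        "stocks fell on monday trading"]

# Term-frequency matrix: one row per document, one column per vocabulary term.
vocab = sorted({w for d in docs for w in d.split()})
tf = np.array([[d.split().count(w) for w in vocab] for d in docs], dtype=float)

# LSA: truncated SVD, keeping k latent dimensions.
U, S, Vt = np.linalg.svd(tf, full_matrices=False)
k = 2
latent = U[:, :k] * S[:k]            # documents in the latent space

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_01 = cos(latent[0], latent[1])   # near-identical term structure
sim_02 = cos(latent[0], latent[2])   # unrelated document
```

Documents with the same term structure land close together in the latent space, which is the similarity signal the abstract's detector builds on.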
Keywords: |
Text Pattern Document, Term, Latent Semantic Analysis, Bayesian Network. |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
A NOVEL TOP-K INFREQUENT MINING TECHNIQUE ON COMPLEX DISTRIBUTED MARKET
DATABASES |
Author: |
SUJATHA KAMEPALLI, RAJA SEKHARA RAO KURRA, SUNDARA KRISHNA .Y.K |
Abstract: |
Infrequent association rule mining is one of the essential tasks in data mining
research for finding rare items in complex data sets. Most of the traditional
models focus on finding negative association rules based on different
association measures. However, finding relational infrequent patterns among the
large number of candidate sets is still an open problem in distributed market
analysis. Traditional infrequent mining models mainly depend on quantitative
attributes, limited data sizes and Boolean datasets. In any distributed
environment, as the size and complexity of the market data increase, it is
difficult to detect the sparsity issue from the positive association rules. In
the proposed approach, a novel infrequent association mining algorithm is
implemented to find the topmost relational infrequent patterns in a complex
market dataset. Experimental outcomes show that the proposed model extracts
higher-quality infrequent patterns than conventional infrequent rule mining
techniques. |
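A minimal notion of "infrequent itemset" — support below a maximum threshold rather than above a minimum one — can be sketched as follows; the transactions and the 0.2 threshold are illustrative:

```python
from itertools import combinations
from collections import Counter

transactions = [{"milk", "bread"}, {"milk", "eggs"}, {"bread", "eggs"},
                {"milk", "bread", "eggs"}, {"milk", "bread"}, {"truffle", "milk"}]

def infrequent_itemsets(txns, max_support, size=2):
    """Enumerate itemsets of the given size whose support falls BELOW
    max_support, i.e. the rare patterns most mining models would discard."""
    counts = Counter()
    for t in txns:
        for combo in combinations(sorted(t), size):
            counts[combo] += 1
    n = len(txns)
    return {c: cnt / n for c, cnt in counts.items() if cnt / n < max_support}

rare = infrequent_itemsets(transactions, max_support=0.2)
```

Real top-k infrequent mining additionally ranks these rare patterns and prunes the candidate space, which this exhaustive sketch does not attempt.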
Keywords: |
Complex Data, Infrequent Association Rules, Data Sparsity, Quantitative
Association Rules, Rank Correlation. |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
COURSE SCHEDULING USING MULTI-AGENT EXPLORATION METHOD |
Author: |
JAMALUDIN HAKIM, RETANTYO WARDOYO, SRI HARTATI, AHMAD ASHARI |
Abstract: |
Course scheduling is a frequent research topic, involving new techniques with
promising results. This research used an exploration method on the Schedule
Media, with agents providing a flexible schedule based on the resulting
combinations and a stable process. The constraints include lecturers' teaching
preferences and students' different choices of courses in the Semester Credit
System, which is the problem addressed in this study. Using multiple agents, the
exploration method is able to process lecturers' constraints and the varying
courses chosen by students from different semesters. Agents' responses to
requests during the scheduling process determine the success of the
scheduling. |
Keywords: |
Scheduling, Exploration Method, Constraints, Multi-Agent, Schedule Media |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
NEURAL NETWORK OPTIMIZATION USING SHUFFLED FROG ALGORITHM FOR SOFTWARE DEFECT
PREDICTION |
Author: |
REDDI. KIRAN KUMAR, S.V.ACHUTA RAO |
Abstract: |
Software Defect Prediction (SDP) focuses on the detection of system modules such
as files, methods, classes and components which could potentially contain a
large number of errors. SDP models are those that attempt to anticipate possible
defects from test data. A relation exists between software metrics and the error
disposition of the software. Neural Networks (NN) have been used for many years
to resolve classification issues. The efficacy of such networks relies on the
pattern of hidden layers as well as on the computation of the weights which link
the various nodes. Structural optimization is performed in order to increase the
quality of the network frameworks in two separate respects: the first is the
typically used approximation error on the present data, and the second is the
capacity of the network to learn various problems of a general class rapidly and
with excellent precision. The notion of Back Propagation (BP) is quite
elementary: the output of the neural network is tested against the desired
outcome. Genetic Algorithms (GA) are a type of search algorithm built on the
idea of natural evolution. A neural network using the Shuffled Frog Leaping
Algorithm for improving SDP is proposed. |
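A stripped-down version of the Shuffled Frog Leaping Algorithm, here minimising a generic objective rather than training network weights, might look as follows; population sizes, iteration counts and the test function are illustrative:

```python
import random

def sfla_minimize(f, dim, n_frogs=20, n_memeplexes=4, iters=50, bound=5.0):
    """Stripped-down SFLA sketch: rank the population, deal it into memeplexes,
    and let the worst frog of each memeplex leap toward its best; only
    improving leaps are accepted."""
    random.seed(0)  # reproducible demo run
    pop = [[random.uniform(-bound, bound) for _ in range(dim)] for _ in range(n_frogs)]
    best = min(pop, key=f)
    for _ in range(iters):
        pop.sort(key=f)                        # shuffling step: global ranking
        for m in range(n_memeplexes):
            plex_idx = list(range(m, n_frogs, n_memeplexes))
            b, w = plex_idx[0], plex_idx[-1]   # best/worst frog in this memeplex
            cand = [pop[w][j] + random.random() * (pop[b][j] - pop[w][j])
                    for j in range(dim)]
            if f(cand) < f(pop[w]):            # accept only improving leaps
                pop[w] = cand
        cur = min(pop, key=f)
        if f(cur) < f(best):
            best = cur
    return best

sol = sfla_minimize(lambda x: sum(v * v for v in x), dim=3)
```

In the SDP setting of the paper, each "frog" would be a full vector of network weights and f a training-error measure; that coupling is not reproduced here.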
Keywords: |
Software Defect Prediction (SDP), Neural Network (NN), Back Propagation (BP),
Genetic
Algorithm (GA) and Shuffled Frog Leaping Algorithm (SFLA) |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
SYNTHETIC RANGE IMAGE SIMULATION OF TERRESTRIAL LIDAR SCANNER |
Author: |
THINAL RAJ, FAZIDA HANIM HASHIM |
Abstract: |
In recent years, the usage of Light Detection and Ranging (LiDAR) scanners has
become predominant in the fields of robotics, navigation and remote sensing for
numerous applications. Due to the high cost of LiDAR scanners, the research
community has developed LiDAR simulators, mostly for Airborne Laser Scanning
(ALS) applications, and only minimal work can be found for Terrestrial Laser
Scanning (TLS) applications. While LiDAR simulators are becoming more
sophisticated, there are still some application areas where the use of these
simulators is not convenient. This paper describes an alternative and simple
approach to developing a simulator for terrestrial LiDAR scanner applications.
The aim of this research is to model a synthetic 2D LiDAR scanner that provides
range data based on object and sensor parameters supplied by the user. The
object models are derived explicitly from geometrical relationships between the
object and the ray from the scanner, thus avoiding the use of vectors. This
approach rasters the range image directly, in contrast to ray tracing. The
GUI-based simulator is developed in the Processing language and is thus an
open-source and hardware-independent environment, making it ideal for research
applications. The data obtained from this simulator can be used as ground truth
for analyzing the characteristics of LiDAR scanners, or even for localization
simulations. |
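The analytic ray/object intersection idea (producing ranges directly from geometry instead of ray tracing) can be sketched in 2-D for a circular object; the sensor pose, beam count and object are illustrative, and the paper's own object models are not reproduced:

```python
import math

def scan_ranges(sensor, circle, n_beams=8, max_range=10.0):
    """Synthetic 2-D LiDAR sketch: for each beam angle, solve the ray/circle
    intersection analytically and return the hit distance (or max_range)."""
    sx, sy = sensor
    cx, cy, radius = circle
    ranges = []
    for k in range(n_beams):
        theta = 2 * math.pi * k / n_beams
        dx, dy = math.cos(theta), math.sin(theta)
        # |s + t*d - c|^2 = r^2  ->  t^2 + 2*b*t + c0 = 0 (d is a unit vector)
        fx, fy = sx - cx, sy - cy
        b = fx * dx + fy * dy
        c0 = fx * fx + fy * fy - radius * radius
        disc = b * b - c0
        t = -b - math.sqrt(disc) if disc >= 0 else float("inf")
        ranges.append(t if 0 <= t <= max_range else max_range)
    return ranges

r = scan_ranges(sensor=(0.0, 0.0), circle=(5.0, 0.0, 1.0))
# beam 0 points at the circle and hits its near edge; the opposite beam sees nothing
```

A full simulator would sweep such scans over user-supplied sensor and object parameters to raster the complete range image.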
Keywords: |
Terrestrial LiDAR, 2D Laser Scanner, Simulator, Range Image, Synthetic, Range
Data |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
HAM: A HYBRID ALGORITHM PREDICTION BASED MODEL FOR ELECTRICITY DEMAND |
Author: |
WAHAB MUSA, SALMAWATY TANSA |
Abstract: |
With the rapid development of digital signal processing technology and
computers, computational intelligence (CI) has recently become an object of
fundamental and applied research. Research that exploits further processing
techniques is the subject of CI in information technology, including artificial
neural networks, genetic algorithms, fuzzy logic, evolutionary computation, or a
combination of these techniques known as hybrid technology. The purpose of this
study is to develop a CI-based prediction model to improve the accuracy of
electricity demand prediction. A combination of several algorithms in the search
for the optimum value is developed to overcome premature convergence in the
prediction model. The data used to measure the performance of the hybrid model
are the electrical energy needs of Indonesia. Average prediction errors serve as
a reference in selecting the right model for planning Indonesia's electrical
energy needs over the next few years. The results showed that: 1) the
performance of computational-intelligence-based prediction models that exploit a
hybrid algorithm is superior to single-algorithm-based predictive models; 2) the
accuracy of the hybrid-algorithm-based prediction model (HAM) can reach 97%,
exceeding the accuracy of the previous model. |
Keywords: |
Computational Intelligence, Hybrid Algorithm, Prediction Model. |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
UNDERSTANDING NETWORK CONGESTION EFFECTS ON PERFORMANCE - ARTICLES REVIEW |
Author: |
MOHAMED NJ., SHAHRIN SAHIB, NANNA SURYANA, BURAIRAH HUSSIN |
Abstract: |
Network communications have become popular worldwide in daily human services.
Network Congestion (NC) happens whenever nodes and links are overloaded. Such
situations affect the expected network performance and its quality of service.
NC occurs as a result of overloaded subnet links, which gradually (over time)
degrades network performance through increased transmission delay and reduced
throughput, as generally perceived by network users. NC is considered the basic
problem in network performance quality acceptance, and most of its existing
solutions are expected to keep playing a great role in future network models,
which will run many multimedia applications. Various studies over the past years
have investigated the causes leading to congestion, and different lessons can be
learnt from analysing NC situations to understand their relationship with future
network performance. This paper presents an analytical review of the causes of
NC and the fundamentals of the existing control solutions and frameworks, as
available and studied in former and recent networking publications. Particular
attention has been paid throughout this study to finding out how NC may still
affect future network performance (i.e., QoS in the world of multimedia
networks). The content of this paper is expected to serve as quick access to the
essential knowledge for researchers on this subject. |
Keywords: |
Communication, congestion, performance, service quality, QoS, network,
networking, WLAN/LAN, choke point, edge router, flow, traffic, data rate,
wireless. |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
IMPLEMENTATION OF SENSORLESS CONTROL OF AN INDUCTION MOTOR ON FPGA USING XILINX
SYSTEM GENERATOR |
Author: |
SGHAIER NARJESS, TRABELSI RAMZI, MIMOUNI MED FAOUZI |
Abstract: |
In this paper, we present a deterministic nonlinear observation approach applied
to the induction motor: a sliding mode observer. Indeed, this paper serves to
emphasize the importance of sensorless control in increasing the profitability
of the machine. Our sliding mode observer is applied first to field-oriented
vector control and then to sliding mode control. The contribution of this paper
is the design of the sensorless control using XSG in order to be implemented on
an FPGA. |
Keywords: |
Induction motor, FPGA, Sensorless Control, Sliding mode observer, XSG |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
AN ARTIFICIAL BEE COLONY BASED OPTIMIZATION FOR IMAGE SEGMENTATION IN MARKOVIAN
FRAMEWORK: APPLICATION TO NON DESTRUCTIVE TESTING IMAGES |
Author: |
MOHAMED BOU-IMAJJANE, MOHAMED SBIHI |
Abstract: |
Image segmentation is a fundamental task in the image analysis process. In this
paper, we propose a segmentation model using MRF (Markov Random Fields) and a
global optimization method based on the ABC (Artificial Bee Colony) algorithm.
As a Markovian algorithm, ICM (Iterated Conditional Modes) is a segmentation
method which takes into account the neighbouring labels of each pixel when
calculating the energy function that needs to be minimized to obtain the best
segmentation. However, in some cases this method may converge to the first
minimum encountered during the minimization process. To address this situation,
ABC is used to improve the energy function optimization, since it gives robust
results, especially in discrete multivariable optimization problems. The
contribution of this work is the MRF-ABC algorithm, which introduces ABC to
optimize the Markovian energy function at the convergence point obtained by ICM,
adjusting the belongingness of pixels in order to improve image segmentation
quality for X-ray Non-Destructive Testing images. |
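The ABC optimiser used to escape ICM's local minimum can be illustrated on a generic continuous objective; the Potts/Markovian energy itself is not reproduced, the onlooker phase is omitted for brevity, and all parameters are illustrative:

```python
import random

def abc_minimize(f, dim, n_food=10, iters=100, limit=15, bound=5.0):
    """Stripped-down Artificial Bee Colony sketch (employed-bee phase plus
    scouts): each food source is a candidate solution perturbed toward a random
    peer; sources that fail to improve for `limit` trials are re-scouted."""
    random.seed(0)  # reproducible demo run
    rand_source = lambda: [random.uniform(-bound, bound) for _ in range(dim)]
    foods = [rand_source() for _ in range(n_food)]
    trials = [0] * n_food
    best = min(foods, key=f)
    for _ in range(iters):
        for i in range(n_food):
            j = random.randrange(dim)
            k = random.choice([x for x in range(n_food) if x != i])
            cand = foods[i][:]
            cand[j] += random.uniform(-1, 1) * (foods[i][j] - foods[k][j])
            if f(cand) < f(foods[i]):
                foods[i], trials[i] = cand, 0
            else:
                trials[i] += 1
            if trials[i] > limit:        # scout bee replaces an exhausted source
                foods[i], trials[i] = rand_source(), 0
        cur = min(foods, key=f)
        if f(cur) < f(best):
            best = cur
    return best

best = abc_minimize(lambda x: sum(v * v for v in x), dim=2)
```

In the paper's setting, f would be the Markovian (Potts) energy evaluated on a labelling, seeded from ICM's convergence point rather than at random.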
Keywords: |
Markov Random Fields, Artificial Bee Colony, Image Segmentation, Potts Energy
Function, Non Destructive Testing |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
ENHANCING THE AD-HOC MESH NETWORKS INFRASTRUCTURE IN RURAL AREAS: ADAPTIVE
APPROACH |
Author: |
IBRAHIM OBEIDAT, ALI ABU ABID, HUTAF NATOUREAH |
Abstract: |
Rural areas all over the world continue to be poorly covered and are not
considered an achievable business case by telecommunication operators, due to
high implementation costs compared to the profit. The growth in
telecommunications, as well as new mobile technology, has expanded the gap
between rural and urban areas in network infrastructure. To address this issue,
this research uses ad-hoc mesh networks in which each node has a transmission
range of one kilometer, meaning that a third party must play the role of relay
if the distance is greater than one kilometer. The findings of this research
suggest that if the area coverage is less than or equal to 5%, the number of
relay nodes is reduced by 50%, whereas when the area coverage is more than 5%
the reduction in relay nodes is 8.5%; in the latter case, the second assumption
is more precise and effective. |
Keywords: |
Ad-Hoc Mesh Networks, Connectedness, Terranet, Adjacency Matrix, Adaptive
Threshold |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
FEATURE IDENTIFICATION OF AN INTERACTIVE MULTI-USER MEETING TABLETOP BASED ON
INTUITIVE GESTURE RECOGNITION |
Author: |
HALEEMA SADIA and ABDULLAH MOHD ZIN |
Abstract: |
Multi-touch technology has shown a rapid rise in popularity over the last few
years, being implemented in many devices, from interactive walls to interactive
tables and from mobile phones to desktop monitors. It has provided users with an
extremely intuitive means of interaction with electronic devices through
gesture-based self-sensing control. The advances in touch technology have led to
an increased presence of multi-touch interfaces in consumer products in recent
years. However, very little research has been done on developing interactive
multi-touch, multi-user project management systems. An interactive multi-touch,
multi-user project management system based on a tabletop surface helps users to
conduct their important meetings in a more productive and less intrusive manner.
It also facilitates new ways to foster collaborative creation, permitting
several users to work simultaneously on a single screen. This paper describes
some of the features that are required for such an application. The feature
identification process is carried out through comparative analysis and the
evaluation of scenario analysis. This paper also discusses some of the issues
and challenges of multi-touch technology. |
Keywords: |
Multi-user, Multi-touch Tabletop, Project management, Project planning |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
XML INTEGRITY CONSTRAINTS, WHAT’S NEXT? |
Author: |
MOHAMMED HAKAWATI, PUTEH SAAD, NASEER SABRI, YASMIN YACOB, R. B. AHMAD, M. S.
SALIM |
Abstract: |
Without any doubt, the XML data model is considered the most dominant document
type on the web, accounting for more than 60% of the total; nevertheless, its
quality is not as expected. Data cleaning is equipped to overcome databases'
quality issues. Integrity constraints are a very important criterion for keeping
data in a consistent state; almost all previous XML dependencies were introduced
to improve the schema and normalization, with little effort toward improving the
data instance. This paper summarizes the most important XML integrity constraint
notations and data cleaning approaches. In addition, it highlights the
shortcomings of these constraints and shows their limitations for increasing
data quality. Finally, it introduces the next generation of conditional
integrity constraints, which are aimed mainly at data cleaning issues. |
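The flavour of integrity checking discussed above — a key constraint plus a conditional, cleaning-oriented dependency — can be sketched over a toy XML instance; the document and the country→capital dependency are illustrative, not taken from the paper:

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

DOC = """<customers>
  <customer id="c1"><country>UK</country><capital>London</capital></customer>
  <customer id="c2"><country>UK</country><capital>Leeds</capital></customer>
  <customer id="c1"><country>FR</country><capital>Paris</capital></customer>
</customers>"""

root = ET.fromstring(DOC)

# Key constraint: customer @id values must be unique across the document.
ids = [c.get("id") for c in root.findall("customer")]
key_violations = len(ids) - len(set(ids))

# Cleaning-oriented dependency: country -> capital must be single-valued.
capitals = defaultdict(set)
for c in root.findall("customer"):
    capitals[c.findtext("country")].add(c.findtext("capital"))
fd_violations = [country for country, caps in capitals.items() if len(caps) > 1]
```

Schema-level constraints alone would accept this document; it is instance-level checks like these that data cleaning, and the conditional constraints the paper argues for, target.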
Keywords: |
XML, Data Quality, Data Cleaning, Integrity Constraints. |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
KLANG VALLEY RAINFALL FORECASTING MODEL USING TIME SERIES DATA MINING TECHNIQUE |
Author: |
ZULAIHA ALI OTHMAN, NORAINI ISMAIL, ABDUL RAZAK HAMDAN, MAHMOUD AHMED SAMMOUR |
Abstract: |
Rainfall influences the social and economic activities of a given area, such as
agriculture, industry and domestic needs; therefore, accurate rainfall
forecasting is in demand. Various statistical and data mining techniques are
used to obtain accurate rainfall predictions, and time series data mining is
well known for forecasting time series data. The objective of this study is
therefore to develop a rainfall pattern distribution forecasting model based on
symbolic data representation using Piecewise Aggregate Approximation (PAA) and
Symbolic Aggregate approXimation (SAX). The rainfall dataset was collected from
three rain gauge stations in the Langat area over 31 years. The development of
the model consists of three phases: data collection, data pre-processing, and
model development. During the pre-processing phase, the data were transformed
into an appropriate representation using the dimensionality reduction technique
known as Piecewise Aggregate Approximation (PAA), and the transformed data were
then discretized using Symbolic Aggregate approXimation (SAX). Furthermore,
clustering was used to determine the class labels of the patterns when preparing
the unsupervised training data. Three types of pattern, namely dry, normal and
wet, were identified using three clustering techniques: Agglomerative
Hierarchical Clustering, K-Means Partitional Clustering and Self-Organising
Maps. As a result, the best model was able to forecast the next 3 and 5 years
using rule induction classification techniques. |
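The PAA + SAX pre-processing step can be sketched as follows; the 3-letter alphabet with its standard Gaussian breakpoints (±0.43) and the toy series are illustrative:

```python
import math

def paa(series, n_segments):
    """Piecewise Aggregate Approximation: the mean of each equal-length segment."""
    seg = len(series) / n_segments
    return [sum(series[int(i * seg):int((i + 1) * seg)])
            / (int((i + 1) * seg) - int(i * seg))
            for i in range(n_segments)]

def sax(series, n_segments, breakpoints=(-0.43, 0.43), alphabet="abc"):
    """SAX: z-normalise, reduce with PAA, then map each segment mean to a
    symbol using Gaussian breakpoints (here for a 3-letter alphabet)."""
    mu = sum(series) / len(series)
    sd = math.sqrt(sum((x - mu) ** 2 for x in series) / len(series)) or 1.0
    z = [(x - mu) / sd for x in series]
    word = ""
    for m in paa(z, n_segments):
        word += alphabet[sum(m > b for b in breakpoints)]
    return word

word = sax([2, 3, 2, 3, 8, 9, 8, 9, 2, 3, 2, 1], n_segments=3)
# low / high / low season -> a three-symbol word
```

Words like these, built from rainfall series, are what the clustering and rule-induction stages of the model would then consume.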
Keywords: |
Time Series Data Mining, Clustering, Classification, Time Series Symbolic
Representation, Rainfall Forecasting |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
IMPROVED FAULT DETECTION IN WATER DESALINATION SYSTEMS USING MACHINE LEARNING
TECHNIQUES |
Author: |
MORCHED DERBALI, ANAS FATTOUH, HOUSSEM JERBI, MOHAMED NACEUR ABDELKRIM |
Abstract: |
In this paper, the authors study fault detection using machine learning
techniques for water Membrane Distillation Systems (MDS). Initially, a model of
an actual MDS applying nanotechnology was developed, based on actual
measurements. Then, the errors between the outputs of the model (these outputs
additionally serve as MDS inputs) and the system outputs were classified in
order to identify system faults. This classification was carried out using
different approaches and the results were compared. It was noted that the
classification accuracy obtained using decision trees was the best compared to
the other learning techniques, such as K-Nearest Neighbours, Neural Networks,
and Support Vector Machines (SVM). |
Keywords: |
Learning Techniques, Water Membrane Distillation System, Desalination Systems,
Fault Detection, Detection Accuracy. |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
WAVELET DECOMPOSITION LEVELS ANALYSIS FOR INDONESIA TRADITIONAL BATIK
CLASSIFICATION |
Author: |
FIKRI BUDIMAN, ADANG SUHENDRA, DEWI AGUSHINTA, AVINANTA TARIGAN |
Abstract: |
Indonesian traditional batik is a non-material cultural heritage whose patterns
are basically divided into 'keraton/pedalaman' and 'pesisir' batik. An
appropriate content-based image recognition method is needed to recognize the
pattern of Indonesian traditional batik in a large image database. The results
of this research can be used to recognize 'keraton' and 'pesisir' batik based on
wavelet energy and standard deviation features, with 122 training images and 120
test images. Classification uses a binary non-linear support vector method, with
feature extraction by the Discrete Wavelet Transform (DWT), tested for wavelet
types Daubechies 1-5 at the first decomposition level and for Daubechies 2 at
decomposition levels 1 to 5. The best result is obtained by Daubechies 2 at the
third decomposition level, with an accuracy of 96.7%. This result is better than
previous research with the same datasets and classification method: earlier work
using fractal feature extraction obtained an accuracy of 91.6%, and work using
GLCM with 20 parameters obtained an accuracy of 80%. |
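The wavelet energy and standard-deviation features can be illustrated with a hand-rolled 1-D Haar decomposition (Daubechies 1); the paper applies Daubechies 2 to 2-D images, so this is only a simplified sketch on an illustrative signal:

```python
def haar_step(signal):
    """One level of the 1-D Haar DWT: pairwise average (approximation)
    and pairwise difference (detail)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def wavelet_features(signal, levels):
    """Energy and (population) std-dev of the detail coefficients per level."""
    feats = []
    for _ in range(levels):
        signal, detail = haar_step(signal)   # recurse on the approximation
        energy = sum(d * d for d in detail)
        mean = sum(detail) / len(detail)
        sd = (sum((d - mean) ** 2 for d in detail) / len(detail)) ** 0.5
        feats.append((energy, sd))
    return feats

feats = wavelet_features([4.0, 2.0, 6.0, 8.0, 4.0, 4.0, 2.0, 0.0], levels=2)
```

Per-level (energy, std-dev) pairs like these, computed over each wavelet sub-band of an image, form the feature vector the SVM classifier would consume.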
Keywords: |
Image Retrieval, Traditional Batik, Decomposition Wavelet, Feature Extraction,
Classification. |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
K-NEAREST NEIGHBOR BASED DBSCAN CLUSTERING ALGORITHM FOR IMAGE SEGMENTATION |
Author: |
SURESH KURUMALLA, P SRINIVASA RAO |
Abstract: |
Clustering is a primary and vital part of data mining, and density-based
clustering is one of its important techniques. The groups designed on the basis
of density are easy to understand and do not limit themselves to particular
cluster shapes. The DBSCAN algorithm is one of the density-based clustering
approaches and is employed in this paper. The authors address two drawbacks of
the DBSCAN algorithm, namely the determination of the Epsilon value and of the
minimum number of points, and propose a novel, efficient DBSCAN algorithm to
overcome these drawbacks. The proposed approach is introduced mainly for
applications on images, to segment images efficiently using the clustering
algorithm. The experimental results show that noise is greatly reduced in the
image and that the segmentation of images is improved compared to existing image
segmentation approaches. |
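A common heuristic for the Epsilon drawback the authors address is the sorted k-th nearest-neighbour distance curve, whose "knee" suggests a value; whether the paper uses exactly this recipe is not stated in the abstract, and the points below are illustrative:

```python
import math

def kth_nn_distance(points, k):
    """Sorted k-th nearest-neighbour distance for every point; the knee of this
    curve is a standard heuristic for choosing DBSCAN's Epsilon (with MinPts = k)."""
    dists = []
    for i, p in enumerate(points):
        d = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        dists.append(d[k - 1])
    return sorted(dists)

pts = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]  # a tight square plus an outlier
curve = kth_nn_distance(pts, k=2)
# dense points share a small k-distance; the outlier's jumps far above them
```

Reading Epsilon off just below the jump keeps the square points as core points and leaves the outlier as noise.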
Keywords: |
Data Mining, Clustering, Density Based Clustering, DBSCAN, K-Nearest Neighbor,
Image Segmentation. |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
Title: |
CLASSIFICATION OF VOIP AND NON-VOIP TRAFFIC USING MACHINE LEARNING APPROACHES |
Author: |
GHAZI AL-NAYMAT, MOUHAMMD AL-KASASSBEH, NOSAIBA ABU-SAMHADANH, SHERIF SAKR |
Abstract: |
Enhancing network services and security can be achieved by performing network
traffic classification to identify applications, which is one of the primary
components of network operations and management. The traditional transport-layer
and port-based classification approaches have limitations in achieving accurate
identification. In this paper, a real test bed is used to collect a first-hand
traffic dataset from five different VoIP and non-VoIP applications that are used
by the majority of the Internet community, namely Skype, YouTube, Yahoo
Messenger, GTalk and PayPal. The collected data encompass new features that have
never been used before. In addition, a classification step is performed using
off-the-shelf machine learning techniques, specifically Random Forest, J48,
meta.AdaBoost (J48) and MultiLayer Perceptron, to classify the traffic. Our
experimental results show that using the new features can dramatically improve
the true positive ratio, by up to 98%, which is a significant outcome towards
providing accurate traffic classification. |
Keywords: |
Traffic classification, Application identification, Machine Learning, VOIP and
Non-VOIP Application, CAPTCHA |
Source: |
Journal of Theoretical and Applied Information Technology
31st October 2016 -- Vol. 92. No. 2 -- 2016 |
Full Text |
|
|
|