|
Submit Paper / Call for Papers
The journal receives papers in a continuous flow and will consider articles
from a wide range of Information Technology disciplines, encompassing everything
from the most basic research to the most innovative technologies. Please submit
your papers electronically to our submission system at
http://jatit.org/submit_paper.php in MS Word, PDF or a compatible format so that
they may be evaluated for publication in the upcoming issue. This journal uses a
blinded review process; please remember to include all your personally
identifiable information in the manuscript before submitting it for review, and
we will edit out the necessary information at our side. Submissions to JATIT
should be full research / review papers (properly indicated below the main title).
|
|
|
Journal of
Theoretical and Applied Information Technology
January 2019 | Vol. 97 No. 01 |
Title: |
A ROBUST FRAMEWORK TO DETECT MOVING VEHICLES IN DIFFERENT ROAD CONDITIONS IN
INDIA |
Author: |
Dr.S.APPAVU ALIAS BALAMURUGAN, BALAJI GANESH RAJAGOPAL, Dr.KUMAR PARASURAM |
Abstract: |
The traffic situation in India is quite complex compared to the traffic models
of other nations, so it is essential to model traffic behaviour on Indian
roadways, both rural and urban. Indian road conditions span different classes of
roads, viz. single-lane, double-lane and multi-lane roads, cross junctions, etc.
This research article addresses the varied nature of Indian roads and also
models traffic situations under different weather conditions. The proposed
system addresses the problem of counting and classifying vehicles under Indian
road conditions. The system uses color-image-based foreground moving-object
detection that preserves the color and model of the moving vehicles, and the
color-image-based background subtraction technique is supported by cascaded
linear regression. The system also uses HoG for contour creation and extraction,
followed by morphological dilation to connect missing pixels in the vehicle
object. The framework uses adaptive Support Vector Machines to train and model
the different classes of vehicles. The proposed framework has been found to
achieve an accuracy of 92% under varying levels of traffic density and
illumination conditions. |
Keywords: |
Vehicle detection, Vehicle counting, Low quality video, Color image based
background model, MoG, HoG, SVM Classifier |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
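A minimal sketch of the kind of pipeline the abstract describes (not the authors' code): MoG background subtraction, morphological dilation, HoG features and an SVM classifier, using OpenCV and scikit-learn. The window sizes, thresholds and the assumption of an already-trained classifier are all placeholders.

```python
# Illustrative sketch, not the published framework. Requires opencv-python and scikit-learn.
import cv2
import numpy as np
from sklearn.svm import SVC

bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)
hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
classifier = SVC(kernel="rbf")   # assumed to have been trained on labelled vehicle crops

def detect_vehicles(frame):
    """Return (bounding_box, predicted_class) pairs for one video frame."""
    mask = bg_subtractor.apply(frame)                     # foreground mask from MoG
    mask = cv2.dilate(mask, kernel)                       # close gaps in vehicle blobs
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    results = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < 400:                                   # ignore tiny blobs (noise)
            continue
        crop = cv2.resize(frame[y:y + h, x:x + w], (64, 64))
        feat = hog.compute(cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)).reshape(1, -1)
        results.append(((x, y, w, h), classifier.predict(feat)[0]))
    return results
```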
|
Title: |
THE ANALYSIS OF TEXTURAL IMAGES ON THE BASIS OF ORTHOGONAL TRANSFORMATIONS |
Author: |
GULZIRA B. ABDIKERIMOVA, FEODOR A. MURZIN, ALEKSEY L. BYCHKOV, XINYU WEI, ELENA
I. RYABCHIKOVA, TALGATBEK AYAZBAYEV |
Abstract: |
The aim of this research is the development and evaluation of algorithms for the
analysis of textural images. Software products that can analyze textures in
detail can be used in many fields of science and industry, first of all in
chemistry and materials science: it is possible to analyze materials of organic
origin, sections of metals and minerals, ceramics, etc. Another field where these
methods can be applied effectively is the diagnosis of internal human
pathologies, including malignant ones, from images obtained with a thermal
imager. This study concerns the application of spectral decomposition over
various orthonormal bases to images obtained by transmission electron
microscopy. The program is implemented in the Matlab environment and supports
spectral transforms of six types: 1) cosine, 2) Hadamard, 3) Hadamard of prime
order, i.e. based on the Legendre symbol, 4) Haar, 5) slant, 6) Daubechies-4.
Various experiments were carried out. The algorithms studied in this research
allowed us to effectively identify regions of the analyzed images that are
characterized by different degrees of structural orderliness. More precisely,
chemists are interested in the "disordered" areas of a material's structure, for
example when studying the ultrastructure of plant cell walls. This research was
carried out for the Institute for Chemistry of Solids and Mechanochemistry of
the Siberian Branch of the Russian Academy of Sciences. The main attention was
paid to the development of software tools for the analysis of the above
microphotographs. It is expected that the characteristics obtained for different
images (textural features as well as various spectral coefficients) can further
be correlated with values that characterize the physical and chemical properties
of the analyzed material: reactivity, porosity, diffusion coefficient, and so on.
For this correlation it will be possible to use machine learning algorithms, for
example ones based on the neurocomputer approach. |
Keywords: |
Image Processing, Textural Images, Orthogonal Transformations, Microphotography
Analysis, Electronic Microscopy, Herbal Raw Material. |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
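A small sketch (an assumption, not the paper's Matlab program) of how two of the listed orthogonal transforms can be used to measure how strongly the spectral energy of an image patch concentrates in low-frequency coefficients, one simple proxy for structural orderliness.

```python
import numpy as np
from scipy.fft import dctn
from scipy.linalg import hadamard

def haar_matrix(n):
    """Orthonormal Haar matrix of size n x n (n must be a power of two)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])
    bottom = np.kron(np.eye(n // 2), [1.0, -1.0])
    return np.vstack([top, bottom]) / np.sqrt(2.0)

def low_frequency_energy(patch, transform):
    """Fraction of spectral energy in the top-left quarter of the spectrum."""
    if transform == "cosine":
        spec = dctn(patch, norm="ortho")
    elif transform == "haar":
        H = haar_matrix(patch.shape[0])
        spec = H @ patch @ H.T
    else:  # Walsh-Hadamard, another of the listed bases
        H = hadamard(patch.shape[0]) / np.sqrt(patch.shape[0])
        spec = H @ patch @ H.T
    q = patch.shape[0] // 2
    return np.sum(spec[:q, :q] ** 2) / np.sum(spec ** 2)

patch = np.random.rand(32, 32)            # stand-in for a 32x32 microscopy patch
for name in ("cosine", "haar", "hadamard"):
    print(name, round(low_frequency_energy(patch, name), 3))
```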
|
Title: |
HYBRID ARTIFICIAL BEE COLONY ALGORITHM WITH MULTI-USING OF SIMULATED ANNEALING
ALGORITHM AND ITS APPLICATION IN ATTACKING OF STREAM CIPHER SYSTEMS |
Author: |
MAYTHAM ALABBAS, ABDULKAREEM H. ABDULKAREEM |
Abstract: |
A new hybrid evolutionary algorithm (ABC-SA), i.e. an artificial bee colony
algorithm (ABC) with multiple uses of simulated annealing (SA), is presented. In
the ABC-SA procedure, ABC provides the global search and the SA algorithm
provides the local search. SA is used to improve the original ABC algorithm in
two different ways: (i) repairing the initial food sources of ABC, which are
usually generated randomly, in order to seek out promising areas; and (ii)
selecting a candidate food source, because SA can escape from a local optimum by
accepting worse solutions with a certain probability during the neighbourhood
search. The ABC-SA algorithm has been applied to break a number of linear and
nonlinear stream cipher systems, which are among the hardest electronic cipher
systems to attack because of their high security. The current findings are
encouraging: a comparison of the results indicates that in most cases the ABC-SA
algorithm outperforms the original ABC algorithm. |
Keywords: |
Artificial Bee Colony Algorithm, Simulated Annealing, Hybrid algorithm, Stream
Cipher System |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
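A minimal sketch of the hybridisation idea (an assumption, not the authors' exact procedure): an ABC-style food-source update in which a worse neighbour can still be accepted with the simulated-annealing probability exp(-delta / T). The toy fitness function stands in for the cipher-attack cost.

```python
import math
import random

def sphere(x):                       # toy fitness to minimise (stand-in for the attack cost)
    return sum(v * v for v in x)

def neighbour(x):
    j = random.randrange(len(x))
    y = list(x)
    y[j] += random.uniform(-1.0, 1.0)
    return y

def abc_sa(dim=5, food_sources=10, cycles=200, t0=1.0, cooling=0.98):
    sources = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(food_sources)]
    temperature = t0
    for _ in range(cycles):
        for i, x in enumerate(sources):
            cand = neighbour(x)                      # employed-bee neighbourhood move
            delta = sphere(cand) - sphere(x)
            # greedy acceptance, relaxed by the SA criterion for worse candidates
            if delta < 0 or random.random() < math.exp(-delta / temperature):
                sources[i] = cand
        temperature *= cooling                       # SA cooling schedule
    return min(sources, key=sphere)

best = abc_sa()
print("best fitness:", round(sphere(best), 4))
```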
|
Title: |
HIDDEN ENCRYPTED TEXT BASED ON SECRETE MAP EQUATION AND BIOINFORMATICS
TECHNIQUES |
Author: |
ALAA KADHIM F., RASHA SUBHI ALI |
Abstract: |
The speedy development of information technology demands the secure transmission
of confidential information, which receives a great deal of attention. It is
therefore necessary to use effective methods to reinforce information security.
Steganography is one of the leading technologies that has been used around the
world for a long time, and biotechnological (DNA-based) methods can be used in
cryptography to improve data security. Steganography is the act of hiding
messages inside an image; combining these two methods is a topic of high
relevance, since secure communication is indispensable. This research presents a
steganographic scheme that uses the Least Significant Bit (LSB) method, DNA
computing and a secret map for hiding data. DNA computing is used to encrypt the
secret data, LSB is used to embed the encrypted data into the least significant
bits of the cover, and the secret map specifies the locations where the data is
hidden. The sender and the receiver must use the same equation to create the
secret map, and the creation of this map depends on a shared key. |
Keywords: |
Least Significant Bit (LSB), DNA Computing, Secret Map, Steganography. |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
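An illustrative sketch of the hiding step (assumptions, not the paper's exact scheme): a shared key seeds a pseudo-random "secret map" of pixel positions, and the already-encrypted message bits are written into the least significant bit of the grey level at those positions.

```python
import random
import numpy as np

def secret_map(shared_key, image_shape, n_bits):
    """Deterministic, key-dependent list of (row, col) hiding positions."""
    rng = random.Random(shared_key)                 # same key -> same map on both sides
    positions = [(r, c) for r in range(image_shape[0]) for c in range(image_shape[1])]
    rng.shuffle(positions)
    return positions[:n_bits]

def embed(cover, message_bits, shared_key):
    stego = cover.copy()
    for bit, (r, c) in zip(message_bits, secret_map(shared_key, cover.shape, len(message_bits))):
        stego[r, c] = (stego[r, c] & 0xFE) | bit    # overwrite the LSB
    return stego

def extract(stego, n_bits, shared_key):
    return [int(stego[r, c] & 1) for (r, c) in secret_map(shared_key, stego.shape, n_bits)]

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in cover image
bits = [1, 0, 1, 1, 0, 0, 1, 0]                                     # already-encrypted payload bits
assert extract(embed(cover, bits, shared_key=1234), len(bits), shared_key=1234) == bits
```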
|
Title: |
BTS ALGORITHM: AN ENERGY EFFICIENT MOBILITY MANAGEMENT IN MOBILE CLOUD COMPUTING
SYSTEM FOR 5G HETEROGENEOUS NETWORKS |
Author: |
L. PALLAVI, A. JAGAN, B. THIRUMALA RAO |
Abstract: |
In recent years, mobile phones have become the primary platform for users who
roam about and access cloud computing applications. Mobile Cloud Computing (MCC)
combines mobile and cloud computing to provide optimal services to mobile users.
In next-generation mobile environments, mainly because of the huge number of
mobile users together with small cell sizes and their portable data, the impact
of mobility on network performance is amplified. In this paper, we propose an
energy efficient mobility management in mobile cloud computing (E2M2MC2) system
for 5G heterogeneous networks. The proposed E2M2MC2 system uses a back-track
searching (BTS) algorithm for congestion prediction and for selecting optimal
routes to manage user mobility. The simulation results show that the proposed
E2M2MC2 system helps minimize delay, packet loss rate and energy consumption in
a heterogeneous network. |
Keywords: |
Mobile Cloud Computing, Mobility Management, Heterogeneous Network, Best Route,
Energy Efficient, Back Track Search, distributed follow me cloud controller |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
|
Title: |
LOW LIGHTNESS ENHANCEMENT USING NONLINEAR FILTER BASED ON POWER FUNCTION |
Author: |
NABEEL M. MIRZA, HANA H. KAREEM, HAZIM G. DAWAY |
Abstract: |
Optical imaging systems often produce images that need enhancement because of
low contrast, poor illumination, and other factors. It is therefore essential
that such images pass through an improvement stage before being examined by
specialists in the many fields where they are applied. This research aims to
improve low lightness with a Nonlinear Filter based on a Power Function (NFPF)
algorithm. NFPF is applied to enhance the illumination of color images; it
consists of three sequential steps: intensity enhancement, contrast enhancement,
and color restoration of the RGB channels. The merit of the proposed enhancement
method is evaluated using three criteria, namely entropy, Normalized Mean
Squared Error for Hue (NMSEH) and Normalized Mean Squared Error for Saturation
(NMSES). The suggested algorithm (NFPF) was compared with four previous
algorithms: Parallel Nonlinear Adaptive Enhancement (PNAE), New Nonlinear
Adaptive Enhancement (NNAE), Multi-Scale Retinex with Color Restoration (MSRCR),
and Histogram Equalization (HE). Qualitative results show that the proposed
algorithm (NFPF) outperforms the other algorithms according to both subjective
and objective assessment. |
Keywords: |
Image Enhancement, Adaptation Power Function, Histogram Equalization, Intensity
Enhancement. |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
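A minimal sketch (an assumption, not the published NFPF filter): a power-function lightness boost applied to an intensity estimate, with colour restoration by rescaling the RGB channels, and entropy as one of the quality criteria mentioned in the abstract.

```python
import numpy as np

def power_function_enhance(rgb, gamma=0.5, eps=1e-6):
    """rgb: float array in [0, 1], shape (H, W, 3). gamma < 1 brightens dark images."""
    intensity = rgb.mean(axis=2, keepdims=True)            # simple intensity estimate
    enhanced = intensity ** gamma                           # nonlinear power-function mapping
    ratio = enhanced / (intensity + eps)                    # color restoration: keep channel ratios
    return np.clip(rgb * ratio, 0.0, 1.0)

def entropy(gray_uint8):
    hist = np.bincount(gray_uint8.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

dark = np.random.rand(100, 100, 3) * 0.3                   # stand-in for a low-lightness image
bright = power_function_enhance(dark)
print("entropy before:", round(entropy((dark.mean(axis=2) * 255).astype(np.uint8)), 2))
print("entropy after :", round(entropy((bright.mean(axis=2) * 255).astype(np.uint8)), 2))
```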
|
Title: |
INTELLECTUAL TECHNOLOGIES AND DECISION SUPPORT SYSTEMS FOR THE CONTROL OF THE
ECONOMIC AND FINANCIAL PROCESSES |
Author: |
BIDIUK P.I., PROSIANKINA-ZHAROVA T.I., TERENTIEEV O.M., LAKHNO V.A., O.V. ZHMUD |
Abstract: |
A computer-based decision support system is proposed whose basic tasks are
adaptive model construction and forecasting of the various types of processes
that develop in socio-economic systems under the influence of fundamental
structural changes. The complexity and urgency of the problem being solved lie
in the need to provide forecasts of acceptable quality for financial and
economic indicators from short data samples, when the use of retrospective data
is impossible or significantly limited. The DSS development is based on system
analysis principles, i.e. the possibility of taking into consideration
stochastic and information uncertainties, forming alternatives for models and
forecasts, and tracking the correctness of the computing procedures during all
stages of data processing. A modular architecture is implemented that allows the
system's functionality to be further enhanced and modified with new forecasting
and parameter estimation techniques. In addition, thanks to the modular
architecture, the proposed system can be improved using software from different
vendors without any additional structural changes. A high quality of the final
result is achieved by appropriate tracking of the computing procedures at all
stages of data processing during computational experiments: preliminary data
processing, model construction, and forecast estimation. The tracking is
performed with appropriate sets of statistical quality parameters. Examples are
given for the estimation of financial risk in the insurance sphere and of
electricity consumption in the context of energy saving. The examples solved
show that the developed system has good prospects for practical use. It is
expected that the system will be universal and will find application as an
additional tool to support decision making when developing strategies for
companies and enterprises of various types. |
Keywords: |
Mathematical Model, System Analysis Principles, Adaptive Forecasting, Decision
Support System, Risk Estimation |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
|
Title: |
APPLICATION OF SUPPORT VECTOR REGRESSION FOR JAKARTA STOCK COMPOSITE INDEX
PREDICTION WITH FEATURE SELECTION USING LAPLACIAN SCORE |
Author: |
ZUHERMAN RUSTAM, KHADIJAH TAKBIRADZANI |
Abstract: |
Researchers and investors have long searched for an accurate model to predict
stock values, since an accurate prediction model can yield profits for
investors. According to the Indonesia Stock Exchange, stocks are becoming one of
the most popular financial instruments in Indonesia. Because it would be too
complicated to record every single security traded in the country, investors
follow a smaller sample, called an index, that represents the whole market.
There are many stock indices in the world; one of them is the Jakarta Composite
Index (JKSE). One of the benefits of following stock index values is to reduce
investment losses. This paper therefore focuses on a supervised learning method
for solving the regression problem, Support Vector Machines for Regression
(SVR). Fourteen technical indicators are calculated in this paper, and a
Laplacian score is computed for each of the fourteen indicators; the Laplacian
score reflects their locality preserving power. Support Vector Machines for
Regression (SVR) with feature selection using the Laplacian score is the
proposed methodology, with the Jakarta Composite Index (JKSE) as input data. The
best model is the prediction model with thirteen features and 30% training data,
which has a Normalized Mean Squared Error (NMSE) of 1.30691E-07. |
Keywords: |
Laplacian Score; Support Vector Machines for Regression (SVR); Jakarta Composite
Index (JKSE); Stock Price Trend Prediction. |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
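An illustrative sketch of the two stages (assumptions about the exact settings, and synthetic data in place of the JKSE indicators): Laplacian scores for feature ranking, followed by an SVR fit on the lowest-scoring, most locality-preserving features.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

def laplacian_scores(X, n_neighbors=5):
    """Smaller score = feature preserves the neighbourhood structure better."""
    S = kneighbors_graph(X, n_neighbors, mode="connectivity", include_self=False)
    S = 0.5 * (S + S.T).toarray()                    # symmetric adjacency
    d = S.sum(axis=1)
    L = np.diag(d) - S                               # graph Laplacian
    scores = []
    for f in X.T:
        f_tilde = f - (f @ d) / d.sum()              # remove the weighted mean
        scores.append((f_tilde @ L @ f_tilde) / (f_tilde @ np.diag(d) @ f_tilde))
    return np.array(scores)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 14))                       # stand-in for 14 technical indicators
y = X[:, 0] * 0.7 + X[:, 3] * 0.3 + rng.normal(scale=0.05, size=200)

keep = np.argsort(laplacian_scores(X))[:13]          # keep the 13 best-ranked features
split = int(0.3 * len(X))                            # 30% training data, as in the abstract
model = SVR().fit(X[:split][:, keep], y[:split])
pred = model.predict(X[split:][:, keep])
print("NMSE:", mean_squared_error(y[split:], pred) / np.var(y[split:]))
```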
|
Title: |
LANDSAT 8 SATELLITE IMAGERY ANALYSIS FOR RICE PRODUCTION ESTIMATES (CASE STUDY :
BOJONEGORO REGENCYS) |
Author: |
BANGUN MULJO SUKOJO , SALWA NABILAH , CEMPAKA ANANGGADIPA SWASTYASTU |
Abstract: |
Bojonegoro, as a mainstay rice producer in the province of East Java, has a
mission of becoming a national food basket. In 2012, Bojonegoro's Bulog became
the highest-ranked regional subdivision in all of Indonesia. Given this
potential, it is necessary to monitor the stability of agricultural production
on a regular basis. By integrating remote sensing technology, using Landsat 8
satellite imagery to identify the growth phase, with an Autoregressive
Integrated Moving Average (ARIMA) forecasting model to predict rice
productivity, this work aims to provide repeated and continuous monitoring over
a wide area. The growth phase is identified in 9 phases. From the linear
regression between the growth stage of the rice plants and the vegetation index
values used, the coefficient of determination (R2) is 0.7229 for the NDVI
algorithm and 0.879 for the MSAVI algorithm. Reflectance values of the SWIR2
band (1.57 μm-1.65 μm) are used to help distinguish each growth phase identified
with the MSAVI algorithm: phases 3, 4 and 5 have SWIR2 reflectance above 0.15,
while phases 7, 8 and 9 have SWIR2 reflectance below 0.15. The rice productivity
forecasting process yields a seasonal ARIMA (1,0,0)3 model, from which the
Forecast Figure (ARAM) of rice productivity for subround III of 2013 is 66.21
quintals per hectare. The highest production estimate is 169,595.385 tons for
the tillering phase (15 weeks before harvest) and 72,246.878 tons for the
seedling phase (13-14 weeks before harvest). It can thus be seen that, at the
time the study was conducted, Bojonegoro was in the growing season. |
Keywords: |
ARIMA, Phase Grown Rice, Landsat 8, Rice Productivity |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
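A small sketch of the two building blocks (assumptions, not the study's exact workflow): vegetation indices computed from Landsat 8 red/NIR reflectance, and a seasonal-ARIMA forecast of a hypothetical productivity series with three subrounds per year.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def msavi(nir, red):
    # Modified Soil-Adjusted Vegetation Index (standard closed-form version)
    return (2 * nir + 1 - np.sqrt((2 * nir + 1) ** 2 - 8 * (nir - red))) / 2

nir = np.array([0.45, 0.50, 0.38])        # stand-in Landsat 8 band 5 (NIR) reflectance
red = np.array([0.10, 0.08, 0.12])        # stand-in Landsat 8 band 4 (red) reflectance
print("NDVI :", np.round(ndvi(nir, red), 3))
print("MSAVI:", np.round(msavi(nir, red), 3))

# Hypothetical productivity series (quintals/ha), three subrounds per year.
productivity = [58.1, 61.4, 60.2, 59.0, 62.3, 61.1, 60.5, 63.9, 62.8]
model = SARIMAX(productivity, order=(1, 0, 0), seasonal_order=(1, 0, 0, 3), trend="c")
forecast = model.fit(disp=False).forecast(steps=1)
print("next subround forecast:", np.round(np.asarray(forecast)[0], 2))
```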
|
Title: |
A NEW PROPOSAL OF COMPRESSION METHOD FOR ENHANCING SHAMIR’S SECRET SHARING USING
GAUSSIAN ELIMINATION BASED ON HYBRID TRANSFORM CODING (IWT-DCT) |
Author: |
SALAH S. AL-RAWI, AHMED S. FASIH, AHMED T. SADIQ |
Abstract: |
Based on a dictionary coding technique, a new lossless compression method (CBDM)
is presented that compresses a color image efficiently, without losing any
information, and gives a good compression ratio (CR). We then use this proposed
method in the sharing scheme presented and explained in reference [3] to enhance
the performance of the system, by combining it with other methods in the
compression phase of the sharing system, which shares a secret image into
multiple shadow images using Gaussian elimination for solving a system of linear
equations together with Shamir's threshold scheme. This sharing system gives
each user a shadow image whose size is as small as possible, [1/4.6*(v/k)] of
the secret image (where v = 2, 3, ..., according to the value of k, the minimum
number of qualified shares needed to reconstruct the secret), and any number of
shares smaller than k reveals no information about the secret image. The
technique is secure for image sharing, has excellent execution time, and gives
an excellent PSNR value (larger than 34 dB), as shown in the result table, using
DPCM to keep the image quality as good as possible. |
Keywords: |
Secret Image Sharing (SIS), Visual Secret (VS), RLC, DPCM, Gaussian Elimination
(GE). |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
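A minimal sketch of the sharing idea (an assumption, not the paper's full scheme): Shamir (k, n) sharing of one secret byte over GF(257), with reconstruction done by Gaussian elimination on the Vandermonde system rather than Lagrange interpolation.

```python
import random

P = 257  # small prime field, enough for one byte per share in this sketch

def make_shares(secret, k, n):
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Solve the k x k Vandermonde system A.coeffs = y (mod P); secret = coeffs[0]."""
    k = len(shares)
    A = [[pow(x, i, P) for i in range(k)] + [y] for x, y in shares]   # augmented matrix
    for col in range(k):                                              # Gauss-Jordan elimination
        pivot = next(r for r in range(col, k) if A[r][col])
        A[col], A[pivot] = A[pivot], A[col]
        inv = pow(A[col][col], P - 2, P)                              # modular inverse
        A[col] = [v * inv % P for v in A[col]]
        for r in range(k):
            if r != col and A[r][col]:
                A[r] = [(a - A[r][col] * b) % P for a, b in zip(A[r], A[col])]
    return A[0][k]                                                    # constant coefficient

shares = make_shares(secret=123, k=3, n=5)
print(reconstruct(random.sample(shares, 3)))   # any 3 of the 5 shares recover 123
```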
|
Title: |
A REVIEW ON TOOLS AND TECHNIQUES FOR FAMILY TREE DATA VISUALIZATION |
Author: |
SITI FATIMAH BOKHARE, WAN MOHD NAZMEE WAN ZAINON |
Abstract: |
Visualization is an important and helpful way to support the exploration of
large data sets. Its leading benefit is that it does not only provide a
graphical representation of data but also allows changing its form, omitting
what is not required and browsing deeper to obtain further details. This paper
reviews some of the previous research related to family tree (sometimes known as
genealogy) data visualization. It focuses on existing techniques and
applications that are currently available to address family tree visualization
issues. The content of this paper is divided into several sections, covering
visualization usability, family tree visualization, graph theory of kinship
networks and graph visualization. Visualization provides an opportunity to
approach huge network-type data and makes it easily comprehensible. In order to
gain the full benefit of family tree data, a proper understanding of the current
visualization tools and techniques used to represent this type of data in a
fully interactive environment will be highly beneficial. |
Keywords: |
Data Visualization, Family Tree Visualization, Theory of Kinship Network, Social
Network Visualization |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
|
Title: |
IS / IT STRATEGY PLANNING IN PT GIFTCARD INDONESIA WITH WARD & PEPPARD FRAMEWORK |
Author: |
KEVIN BERSON S TURNIP, AHMAD NURUL FAJAR |
Abstract: |
This study aims to analyze a strategic IT plan for the business of GIFTCARD
INDONESIA, which can provide guidance in managing IT resources to meet the needs
of all business processes, support the development of the business and improve
efficiency. It uses the Ward and Peppard IT strategic planning framework, taking
the external and internal business and IT environments as input. The study also
uses Critical Success Factors to assess the current application portfolio and to
create the future application portfolio. The IT strategic plan resulting from
this research defines the structure of the IT organization and the IT
infrastructure needed to support the development of new IT services, which can
be implemented to support the advancement of the business. |
Keywords: |
Ward and Peppard, IS Strategy, IT strategy, IS/IT Management Strategy, PT
Giftcard Indonesia |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
|
Title: |
LIGHTWEIGHT SECURE SCHEME FOR IOT-CLOUD CONVERGENCE BASED ON ELLIPTIC CURVE |
Author: |
AMRANI AYOUB, RAFALIA NAJAT, ABOUCHABAKA JAAFAR |
Abstract: |
The Internet of Things appears as a solution for connecting people around the
world. Its utility lies in the ability to connect objects and exchange
information anywhere and at any time. Many objects and services are being
created in different fields, such as smart homes, e-health, transport and
logistics. The evolution of IoT increases the number of connected objects, which
generate a huge amount of data. However, given the low storage and processing
capacity of these objects, there is a need to connect them to a large pool of
resources such as cloud computing. The convergence between IoT and the cloud
will bring many services of great benefit to humanity. However, this convergence
will not happen unless the communication between the devices and the cloud is
secure. Most of the secure schemes proposed so far, which we review in the
following sections, either have weaknesses or are based on the Hypertext
Transfer Protocol (HTTP), which consumes bandwidth and exhausts the resources of
the devices. Publish/subscribe is a messaging pattern in which publishers send
messages to subscribers; the use of pub/sub protocols such as Message Queuing
Telemetry Transport (MQTT) is essential when response time, battery life,
bandwidth and throughput come first for future solutions. In this paper, a
secure Elliptic Curve Cryptography (ECC) protocol using a lightweight
publish/subscribe protocol is proposed for creating a secure tunnel between IoT
devices and cloud computing; it allows very fast communication and, as a
lightweight protocol, does not exhaust the resources of the IoT object. We use
the AVISPA tool for the formal verification of the proposed protocol. |
Keywords: |
Security; Cloud Computing; Elliptic Curve Cryptography; Internet of Things;
MQTT. |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
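An illustrative sketch of the ingredients the abstract combines (assumptions, not the authors' protocol): an ECDH key agreement with the `cryptography` package, the shared secret stretched into an AES-GCM session key, and the ciphertext published over MQTT with paho-mqtt. The curve choice, broker address and topic are hypothetical placeholders.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import paho.mqtt.client as mqtt

# Each side generates an EC key pair (curve choice is an assumption).
device_key = ec.generate_private_key(ec.SECP256R1())
cloud_key = ec.generate_private_key(ec.SECP256R1())

# ECDH: both sides derive the same shared secret from their private key and the
# peer's public key, then stretch it into a 128-bit AES key with HKDF.
shared = device_key.exchange(ec.ECDH(), cloud_key.public_key())
session_key = HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
                   info=b"iot-cloud-session").derive(shared)

nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"sensor reading: 23.5 C", None)

client = mqtt.Client()   # paho-mqtt >= 2.0 needs mqtt.CallbackAPIVersion.VERSION2 as first argument
client.connect("broker.example.com")                 # hypothetical broker address
client.publish("devices/node42/telemetry", nonce + ciphertext)
```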
|
Title: |
MEASURING INFORMATION SECURITY AND CYBERSECURITY ON PRIVATE CLOUD COMPUTING |
Author: |
WENDY, WANG GUNAWAN |
Abstract: |
Information security is an essential topic that contributes to the success of
business operations nowadays. The urgency of applying effective information
security can be seen in all business and non-profit entities. The article takes
the case of university XYZ, which uses private cloud computing as an essential
tool to support its business processes. The article examines an effective way of
measuring the level of information security and cybersecurity performance,
focused on private cloud use, together with its recommendations. The article
applies the ISO 27001:2013 framework, involving all clauses in Annex A of ISO
27001:2013, and COBIT 5 for cybersecurity (section Applying to CyberSecurity).
Annex A of ISO 27001:2013 and COBIT 5 for cybersecurity are used to measure
information security and cybersecurity performance, respectively. The article
uses a survey of the employees in the IT division at university XYZ, examines
the maturity level gap between the current and expected results, and provides
the recommendations necessary to improve the current situation. The outcome of
the article is expected to serve as a reference for information security
implementation in higher education institutions. |
Keywords: |
Information Security, CyberSecurity, Private Cloud Computing, ISO 27001, COBIT
5. |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
|
Title: |
AN EMPIRICAL EXAMINATION OF CHARACTERISTICS OF MOBILE PAYMENT USERS IN INDONESIA |
Author: |
GUNAWAN WANG , NADIA MIRANDA PUTRI , ARIO CHRISTIANTO , DANNY HUTAMA W |
Abstract: |
The use of mobile devices is now a necessity for almost everyone, especially in
Indonesia. More than 100 million mobile phones are in use by the Indonesian
people and approximately 150 million cellular cards have been registered in
Indonesia. This is an opportunity for entrepreneurs in the technology field to
create applications that use mobile devices, such as banking applications. In
Indonesia, banking applications have existed since 2007 and continue to grow,
with almost the same features. After testing the questionnaire, it was found
that innovativeness, reachability, compatibility and convenience affect
perceived usefulness and perceived ease of use towards the intention of use. |
Keywords: |
Mobile Payment, System Characteristics, Individual Differences, Mobile Payment
Users |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
|
Title: |
MODIFIED K-MEANS CLUSTERING MODEL IN MULTI STORE DELIVERY SERVICE |
Author: |
PURBA DARU KUSUMA |
Abstract: |
Several delivery service problems arise in companies that operate multiple
stores in one city, especially for companies whose products must be delivered to
the customers' locations by their own delivery service. Many such companies
distribute their stock between a single main warehouse and their stores, while
their delivery fleet is likewise distributed across the main warehouse and every
store. This condition leads to inefficiency in both stock and the delivery
fleet. In this work, we propose a centralized shared delivery service model. As
a centralized model, the delivery service is handled by central management, so
coordination of the delivery process among vehicles can be more efficient. As a
shared system, a vehicle is not dedicated to a single store, so it can deliver
products coming from more than one store in a single trip. In warehouse
management, we use a single-warehouse concept, so all purchased products from
all stores are delivered from the main warehouse. We propose a modified k-means
clustering model for managing the delivery process: with the clustering
mechanism, each vehicle delivers products whose destination locations are near
to each other. We propose two variants of the k-means clustering model. In the
first variant, we combine the k-means clustering method with the round-robin
method; in the second variant, we combine it with a sequential vehicle creation
method. The tests yield the following findings. Increasing the city size
increases all observed variables, in all models. Increasing the maximum delivery
distance does not affect the total delivery distance, but it decreases the
number of vehicles and increases the delivery distance per vehicle. Increasing
the number of stores does not affect the total delivery distance; in the first
model it increases the number of vehicles and decreases the delivery distance
per vehicle, while in the other models it affects neither the number of vehicles
nor the delivery distance per vehicle. Increasing the number of destinations
increases all observed variables. |
Keywords: |
Delivery Service, K-Means Clustering, Round Robin, Single Warehouse Multi Store. |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
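A minimal sketch of the first variant's idea (assumptions, not the paper's exact model): destinations are clustered with k-means and the resulting clusters are handed to vehicles in round-robin order, so each trip serves destinations that lie close together.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
destinations = rng.uniform(0, 10, size=(60, 2))     # customer coordinates in a 10x10 km city
n_vehicles = 4
n_clusters = 8                                       # more clusters than vehicles -> multiple trips

labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=1).fit_predict(destinations)

# Round-robin: cluster 0 -> vehicle 0, cluster 1 -> vehicle 1, ..., wrapping around.
trips = {v: [] for v in range(n_vehicles)}
for cluster in range(n_clusters):
    trips[cluster % n_vehicles].append(np.where(labels == cluster)[0])

for vehicle, clusters in trips.items():
    stops = sum(len(c) for c in clusters)
    print(f"vehicle {vehicle}: {len(clusters)} trips, {stops} delivery stops")
```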
|
Title: |
TRACKING VEHICLES IN URBAN SMART CITY BASED XILINX PLATFORM |
Author: |
INAAM RIKAN HASSAN, MOHAMMED ABDULRAHEEM FADHEL |
Abstract: |
Computer vision is becoming one of the significant smart city applications
because of the remarkable growth in the electronics, informatics, and
communication fields. Since a smart city is driven by smart autonomous systems,
many algorithms have been released for achieving smart city requirements,
including methods for detecting text, faces, vehicles and moving objects; their
performance can be measured by comparing their output with the ground truth.
This paper focuses on following (detecting and tracking) moving vehicles. Two
different object detection algorithms have been tested, namely a temporal
difference algorithm and a fixed background algorithm, on a video with a frame
size of 120 x 160 pixels. The designed system was implemented on an FPGA board
(Xilinx ISE 14.6, XC3S700A), while the simulation was built in MATLAB. To avoid
the limitation of the FPGA board size, the Verilog code was invoked from the
MATLAB platform. |
Keywords: |
Urban Smart City, Temporal Difference Algorithm, Fixed Background Algorithm,
Xilinx |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
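An illustrative sketch of the two detection schemes named in the abstract (an assumption, not the FPGA/Verilog implementation), applied to 120 x 160 grayscale frames; the change threshold is an assumed value.

```python
import numpy as np

THRESHOLD = 25  # grey-level change treated as "moving" (assumed value)

def temporal_difference(prev_frame, frame):
    """Moving-object mask from the difference between consecutive frames."""
    return np.abs(frame.astype(int) - prev_frame.astype(int)) > THRESHOLD

def fixed_background(background, frame):
    """Moving-object mask from the difference against a fixed background model."""
    return np.abs(frame.astype(int) - background.astype(int)) > THRESHOLD

rng = np.random.default_rng(0)
background = rng.integers(0, 50, size=(120, 160), dtype=np.uint8)   # static scene
frame1 = background.copy()
frame2 = background.copy()
frame2[40:60, 70:100] += 120                                        # a "vehicle" appears

print("temporal difference pixels:", temporal_difference(frame1, frame2).sum())
print("fixed background pixels  :", fixed_background(background, frame2).sum())
```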
|
Title: |
ARRHYTHMIA DETECTION BASED ON COMBINATION OF FREEMAN CHAIN CODE AND FIRST ORDER
TEXTURE FEATURES |
Author: |
ZAMEN F. JABR, RANA H. HUSSAIN, SHAYMAA R. SALEH |
Abstract: |
This paper presents a novel method for detecting and classifying arrhythmia from
an ECG chart, using image processing techniques and a neural network as the
classifier. The method consists of three major stages: first, preprocessing to
prepare the ECG chart image; second, a feature extraction stage based on the
Freeman chain code and first-order features, arranged in a vector of 14 inputs,
each holding one feature value; and finally, this feature vector is fed to a
BPNN classifier to classify the arrhythmia type. The system was applied to a
dataset of 90 ECG chart images. Two different training/testing ratios (30% to
70% and 50% to 50%) were applied to the classifiers. The highest accuracy for
the first ratio was 100% on the training group and 90.5% on the testing group,
while the highest accuracy for the second ratio was 100% on the training group
and 97.8% on the testing group, with a time of 31.6 seconds. The system was
implemented in Matlab. |
Keywords: |
ECG chart, Arrhythmia, Freeman chain code, First order features, Artificial
neural network. |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
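A minimal sketch of the feature side (assumptions, not the paper's Matlab pipeline): an 8-direction Freeman chain code along an ordered contour plus a few first-order statistics, combined into one vector that a small neural network could classify. The feature vector here has 12 entries rather than the 14 used in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Freeman code 0..7 for the 8 neighbour directions (dx, dy).
DIRECTIONS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
              (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def freeman_chain_code(contour):
    """contour: ordered list of (x, y) points with unit steps between neighbours."""
    return [DIRECTIONS[(x1 - x0, y1 - y0)]
            for (x0, y0), (x1, y1) in zip(contour, contour[1:])]

def first_order_features(gray):
    """Simple first-order texture statistics of the grey-level values."""
    g = gray.astype(float).ravel()
    return [g.mean(), g.std(), ((g - g.mean()) ** 3).mean(), ((g - g.mean()) ** 4).mean()]

def feature_vector(contour, gray):
    codes = freeman_chain_code(contour)
    hist = np.bincount(codes, minlength=8) / max(len(codes), 1)   # direction histogram
    return np.concatenate([hist, first_order_features(gray)])

# Toy example: a small square contour on a synthetic chart patch.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1), (0, 0)]
patch = np.random.randint(0, 256, size=(20, 20))
x = feature_vector(square, patch)

# The BPNN stage would be trained on many such vectors; this is a 2-sample stub only.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500)
clf.fit(np.vstack([x, x * 0.5]), [0, 1])
print(clf.predict([x]))
```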
|
Title: |
ESTIMOTE-BASED LOCATION AWARENESS ON MOBILE DEVICES FOR VISUALLY IMPAIRED |
Author: |
GIVA ANDRIANA MUTIARA, GITA INDAH HAPSARI, PERIYADI, AGUS PRATONDO |
Abstract: |
The limited vision possessed by the visually impaired in interacting with their
environment causes them to have difficulties when travelling. However, along
with the development of smartphone technology, visually impaired people have
begun using smartphones to help them engage in their activities. Estimote
beacons are small devices that broadcast a Bluetooth signal that can be captured
by a smartphone. The contribution of this research is to provide information to
visually impaired people in an easily usable, smartphone-compatible way, since
the Estimote beacons are used as location-awareness devices that give
information about the surrounding environment. The systems were configured and
programmed using Android Studio for indoor and outdoor locations. Based on the
indoor test results, it can be stated that when implementing the Estimote beacon
for location awareness in an indoor area, attention must be paid to the
installation of the beacons: the installation must be arranged carefully and the
broadcast signals should not overlap. The outdoor test results indicate that the
Estimote beacon signal is received reliably on the smartphone at distances of
0-31.64 meters, begins to be unstable at a distance of 38.61 meters and becomes
undetectable at a distance of 79.79 meters. |
Keywords: |
Estimote Beacons, Visually Impaired, Mobile Devices, Location Awareness,
Bluetooth |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
|
Title: |
A ZONE TIME BASED VEHICULAR AD-HOC CLOUD NETWORK SERVICES MANAGEMENT SYSTEM |
Author: |
FIRAS M. KHALAF, FOAD SALEM MUBAREK |
Abstract: |
Vehicular Ad-hoc Networks (VANET) are a network infrastructure that provides
communication among vehicles and has many benefits, including reducing
congestion, minimizing accidents and reducing fuel consumption. The concept of
vehicular cloud computing basically relies on using the idle resources of
vehicles and provisioning them for other vehicles, either for free or with the
user paying only for the services used (e.g. memory, processing time and
bandwidth). Exploiting the benefits of cloud computing and merging it with VANET
is an advanced step that requires special designs and solutions to accommodate
the characteristics of VANETs within the requirements of the cloud concept. The
main problem for vehicular cloud networks is high mobility and poor
predictability in urban areas, which makes connection and data processing
difficult to implement, because continuous interruptions in communication lead
to data loss. Therefore, in this paper we propose a system to manage some
vehicular cloud network services so that all of these services are completed
without being dropped or disconnected during execution time. We adopt the
concept of dynamic vehicles, which satisfies the needs of the users. A dynamic
vehicle is evaluated with respect to the time it spends within the zone; this
time is calculated in several ways, and in most cases the best vehicles to
participate in a service are selected depending on the zone time. The proposed
system is compared with the normal system. For an accurate comparison, two
important metrics, throughput and packet delivery ratio, are selected to
evaluate the two systems. The results show that our system is more reliable and
efficient than the normal system in different scenarios. |
Keywords: |
NS2, VANET, VCC, VCC services, Zone time. |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
|
Title: |
A SYSTEMATIC REVIEW ON DISTRIBUTED DATABASES SYSTEMS AND THEIR TECHNIQUES |
Author: |
KATEMBO KITUTA EZÉCHIEL, SHRI KANT, RUCHI AGARWAL |
Abstract: |
Distributed Database Systems (DDBS) are a set of logically networked computer
databases, managed by different sites and appearing to the user as a single
database. This paper proposes a systematic review of distributed database
systems based on three distribution strategies: data fragmentation, data
allocation and data replication. Some problems encountered when designing and
using these strategies are pointed out. Data fragmentation involves a join
optimization problem whenever a query has to combine more than one fragment
stored on different sites, which produces high response times; heuristic
approaches to this problem, which is known to be NP-hard, have been examined.
Data allocation is another particular problem, which involves finding the
optimal distribution of fragments to sites and has already been proved to be
NP-complete; a review of heuristic methods proposed as solutions has been
conducted. Finally, data replication, with its well-known synchronization
algorithms, which is the main strategy for managing the exchange of data between
databases in a DDBS, has been studied. The following problems retained our
attention: serialization of update transactions, reconciliation of updates,
update of unavailable replicas in eager or synchronous replication, site
autonomy, and the independence of the synchronization algorithm. This has
motivated us to propose an effective approach for the synchronization of
distributed databases over a decentralized Peer-to-Peer (P2P) architecture. |
Keywords: |
Distributed Database, Data Fragmentation, Data Allocation, Data Replication,
Data Synchronization, Peer-to-Peer (P2P) architecture. |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
|
Title: |
A REUSABLE BALINESE CALENDAR ENGINE |
Author: |
I MADE DWI MARTADI PUTRA, I MADE SUKARSA, DWI PUTRA GITHA, I WAYAN KANDI WIJAYA |
Abstract: |
Most Balinese digital calendar development begins with creating an engine, which
makes the development process inefficient. In this study, a reusable engine for
the Balinese calendar was designed. The study used the DSRM methodology to
identify problems and produce an engine as the solution. The engine is a
combination of Python and PL/SQL, which makes it flexible to customize and
embed. The engine has several algorithms to calculate Balinese calendar
attributes (wuku, dewa, wewaran from ekawara to dasawara, ingkel, jejepan,
lintang, watek, urip or neptu, ekajala rsi, zodiak, pengalantaka, sasih and year
of the Saka calendar, full moon or new moon), adjusted to the Saka and Pawukon
calendar systems. The engine consists of a web service that serves as a data
parser and a database that stores the attributes. The results of the experiment
showed that the engine was able to generate the appropriate Balinese calendar
attributes for one day up to a one-month or one-year Gregorian calendar range,
when compared with other existing Balinese digital calendars. |
Keywords: |
Balinese Calendar, Engine, Python, Pawukon, Saka |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
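A minimal sketch of the arithmetic behind such an engine (not the published engine): Pawukon attributes are concurrent cycles within a 210-day period, so each one reduces to modular arithmetic on a day count. The reference date below is a hypothetical placeholder and would need calibrating against an authoritative Balinese calendar before real use.

```python
from datetime import date

PAWUKON_EPOCH = date(2000, 1, 1)      # hypothetical day 0 of a Pawukon cycle (assumption)
PANCAWARA = ["Umanis", "Paing", "Pon", "Wage", "Kliwon"]          # 5-day cycle
SAPTAWARA = ["Redite", "Soma", "Anggara", "Buda",
             "Wraspati", "Sukra", "Saniscara"]                    # 7-day cycle

def pawukon_attributes(d):
    days = (d - PAWUKON_EPOCH).days % 210        # position inside the 210-day Pawukon cycle
    return {
        "wuku_index": days // 7,                 # which of the 30 wuku weeks (0..29)
        "pancawara": PANCAWARA[days % 5],
        "saptawara": SAPTAWARA[days % 7],
    }

print(pawukon_attributes(date(2019, 1, 15)))
```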
|
Title: |
EDM PREPROCESSING AND HYBRID FEATURE SELECTION FOR IMPROVING CLASSIFICATION
ACCURACY |
Author: |
SAJA TAHA AHMED, PROF. DR. RAFAH SHIHAB AL-HAMDANI, DR. MUAYAD SADIK CROOCK |
Abstract: |
Educational Data Mining (EDM) is concerned with discovering useful information
from educational datasets. In recent years, the volume of data has been growing
rapidly, because the easy access to e-learning websites has attracted
extraordinary interest from colleges and educational institutions.
High-dimensional, irrelevant, redundant and noisy datasets can hamper knowledge
discovery during the training phase and degrade the accuracy of machine
learning. All these factors raise the demand for dataset preparation, analysis
and feature selection. The fundamental aim of this research is to enhance
classification precision by preprocessing the data and removing irrelevant
information, without discarding any vital data, by means of feature selection.
This paper proposes EDM dataset preprocessing and a hybrid feature selection
method that combines filter and wrapper techniques. In the filter-based feature
selection, the statistical analysis is based on the Pearson correlation and
information gain. In the wrapper method, the accuracy of the feature subset is
tested using a neural network as the baseline algorithm. The obtained results
show an improvement in accuracy when selecting a minimal feature subset with
high predictive power, compared to using all features. |
Keywords: |
Educational Data Mining, Hybrid Feature Selection, Neural Network, Data
Preprocessing, Accuracy. |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
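An illustrative sketch of the hybrid idea (assumptions about details the abstract leaves open, and synthetic data in place of the educational dataset): a filter stage that ranks features by information gain and Pearson correlation, followed by a wrapper stage that keeps a feature only if it improves a neural network's cross-validated accuracy.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=300, n_features=20, n_informative=5, random_state=0)

# Filter stage: combine the two rankings by averaging the normalized scores.
info_gain = mutual_info_classif(X, y, random_state=0)
pearson = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
combined = info_gain / info_gain.max() + pearson / pearson.max()
candidates = np.argsort(combined)[::-1][:10]          # 10 best-ranked features

# Wrapper stage: greedy forward selection using the neural network as evaluator.
def cv_accuracy(cols):
    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0)
    return cross_val_score(clf, X[:, cols], y, cv=3).mean()

selected, best = [], 0.0
for j in candidates:
    score = cv_accuracy(selected + [j])
    if score > best:
        selected, best = selected + [j], score

print("selected features:", selected, "accuracy:", round(best, 3))
```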
|
Title: |
DEVELOPED CRIME LOCATION PREDICTION USING LATENT MARKOV MODEL |
Author: |
REEM RAZZAQ ABDUL HUSSEIN, Dr.MUAYAD SADIK CROOCK, Dr SALIH MAHDI AL- QARAAWI |
Abstract: |
Latent models called hidden Markov models (HMMs) are a class of algorithms that
have been designed to detect crime activities by relating a sequence of
observations to hidden states. The main contribution of this type of model is
the fusion of coupled parameters with two HMM algorithms. The first is the
Viterbi algorithm, which is commonly used to find the most probable path; the
accuracy of this algorithm is 80%. The second is the Baum-Welch algorithm, which
is used to produce robust and accurate models. The modeling results normally
focus on evaluating relative mean square errors in log likelihoods, transition
matrices, and emission matrices to compare modeling performance under different
tolerance values. Previous reports have shown that the modified Baum-Welch
algorithm can achieve good results for decreasing tolerance values. The goal of
this work is to generate a compact model that deals with ternary rather than
binary parameters, by determining the sequential relation between past crime
types and locations. Geographic locations can improve the HMM visualization in
MATLAB. Moreover, crime levels and their most probable locations are predicted.
The obtained results support the goal of this work. |
Keywords: |
Vine Copula, Hidden Markov Models, Viterbi Algorithm, Baum Welch Algorithm,
Measurement Errors |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
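A minimal sketch of the prediction step (an assumption, not the authors' MATLAB model): the Viterbi algorithm recovering the most probable sequence of hidden crime locations from a sequence of observed crime types, for a small hand-made HMM with hypothetical parameters.

```python
import numpy as np

states = ["zone_A", "zone_B"]                  # hidden locations (hypothetical)
observations = [0, 1, 1, 0]                    # observed crime-type indices (hypothetical)

start = np.array([0.6, 0.4])                   # initial state probabilities
trans = np.array([[0.7, 0.3],                  # P(next location | current location)
                  [0.4, 0.6]])
emit = np.array([[0.8, 0.2],                   # P(crime type | location)
                 [0.3, 0.7]])

def viterbi(obs, start, trans, emit):
    n_states, T = len(start), len(obs)
    log_delta = np.log(start) + np.log(emit[:, obs[0]])
    backptr = np.zeros((T, n_states), dtype=int)
    for t in range(1, T):
        scores = log_delta[:, None] + np.log(trans)          # scores[i, j]: state i -> state j
        backptr[t] = scores.argmax(axis=0)
        log_delta = scores.max(axis=0) + np.log(emit[:, obs[t]])
    path = [int(log_delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t][path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi(observations, start, trans, emit))
```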
|
Title: |
WIRELESS SENSOR NETWORK FOR ILLEGAL LOGGING APPLICATION: A SYSTEMATIC LITERATURE
REVIEW |
Author: |
GIVA ANDRIANA MUTIARA, NANNA SURYANA, OTHMAN BIN MOHD |
Abstract: |
Wireless Sensor Networks (WSN) is a technology available for outdoor area
application. The characteristic of illegal logging application is suitable to
apply WSN-based application. Because the illegal logging application is
implemented in a wide range area, supervised the environment forest, and
consists of hundreds of sensor nodes. This paper aims to review and summarize as
systematically the contribution of WSN Technology in illegal logging area
research as a long-range network application especially in detection and
identification method, timber tracking methods, data exchange, and transmission
method. A Systematic Literature Review (SLR) were outlined in this paper as a
standard methodology of predefined research strategy to solve the problems by
tracing the previous research. By defining the Research Question (RQ) to
guidelines the SLR process and inserting the search string in the database
reputation journal, the previous research can be configured. There are 42
previous studies applied WSN to used it in the illegal logging application. The
result stated that WSN has biggest contributions since 33% researcher using WSN
to tracking application, 41% use the WSN as a data exchange in their system, and
48% used WSN as data transmission between sensor nodes. This paper is expected
to give a contribution to the researcher who wants to build the system to tackle
illegal logging since the illegal logging has been hot issues in the world. |
Keywords: |
Wireless Sensor Network, Illegal Logging, Long-Range WSN, Data Exchange |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
|
Title: |
AN EXPERT SYSTEM FOR THE DESIGN AND CLASSIFICATION OF ISLAMIC GEOMETRIC PATTERNS
USING COMPUTER GRAPHICS |
Author: |
AHMAD M. ALJAMALI, MOHAMED FAKIR |
Abstract: |
Islamic geometric patterns (IGP) have long commanded an enduring historic
reverence among those who have strived to present a sensible classification of
these structures. They are complex, beautiful structures that combine elements
of art with elements of mathematics, especially geometric patterns. This article
proposes an innovative approach to classify and design IGP using computer-aided
technologies. The researcher surveyed many existing methodologies for the
classification of IGP, such as the 7 frieze patterns and 17 wallpaper patterns
theories, and design approaches based on principles of classical gridding
systems. The proposed methodology suggests a system that can handle both the
classification of a pattern (a collection of unit patterns) and the
classification of a design (a collection of grids and geometric attributes). The
classification of a pattern consists of repeating the base unit pattern by
isometric transformations (translation, mirroring, rotation and glide
reflections) to generate a pattern that can be classified among the 7 frieze
patterns or the 17 wallpaper patterns. The classification of a design involves
the normalization of the grids and geometries. In this paper, the researcher
also argues that those pattern theories are purely base models. The researcher
has developed a new method of classification, validated by geometric and
scientific analysis, and has succeeded in developing software that draws the
grids of any IGP star/rosette design and displays its classification instantly.
The approach can therefore be considered a measurable method of classification
for any given Islamic geometric design. The software can detect and classify the
IGP star/rosette sub-motif grid from its gridding system of classification,
which allows the user to explore and design the sub-motif pattern, motif
pattern, unit pattern and finally the full pattern in the x-y direction. |
Keywords: |
IGP, Pattern Theories, Star/Rosette Classification and Design |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
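An illustrative sketch of the generation step (an assumption, not the expert system itself): producing a simple rosette by repeating a unit motif with two of the isometries the abstract lists, rotation and mirroring about the pattern centre. The motif shape is hypothetical.

```python
import numpy as np

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

MIRROR_X = np.array([[1.0, 0.0], [0.0, -1.0]])       # reflection across the x-axis

# A small unit motif: an open polyline given as (x, y) points (hypothetical shape).
unit_motif = np.array([[1.0, 0.0], [1.5, 0.2], [2.0, 0.0]])

def rosette(motif, fold=8, mirrored=True):
    """Repeat the motif `fold` times around the origin, optionally with mirror copies."""
    copies = []
    for k in range(fold):
        R = rotation(2 * np.pi * k / fold)
        copies.append(motif @ R.T)                    # rotated copy
        if mirrored:
            copies.append(motif @ MIRROR_X.T @ R.T)   # mirrored, then rotated copy
    return copies

pattern = rosette(unit_motif)
print(f"{len(pattern)} motif copies, first copy:\n{np.round(pattern[0], 3)}")
```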
|
Title: |
A NEW HUMAN FACE AUTHENTICATION TECHNIQUE BASED ON MEDIAN-ORIENTED PARTICLE
SWARM OPTIMIZATION AND SUPPORT VECTOR MACHINE |
Author: |
HAIDAR ABDUL WAHAB HABEEB, HASANAIN ALI HUSSEIN , MOHAMMED HASAN ABDULAMEER |
Abstract: |
One of the main complications in face recognition applications is
non-linearity. The support vector machine is one of the most significant
classification techniques of recent years; it can find globally optimal
solutions to many complicated problems with a small number of training samples.
However, selecting the ideal parameters for the SVM is a major challenge,
especially when the SVM is used in face recognition applications. Numerous
methodologies have been proposed to deal with this issue, for example PSO, OPSO,
AAPSO and AOPSO; nevertheless, there is still room for improvement in this kind
of optimization process. Recently, an enhanced version of PSO called
Median-oriented PSO (MPSO) has been introduced, with a few favorable properties:
it is simple to implement, insensitive to the problem dimension, and requires no
algorithm-specific parameters. In this study, a new face recognition technique
based on a combination of median-oriented particle swarm optimization and a
support vector machine is proposed. The proposed scheme is called MPSO-SVM and
is introduced as a face recognition technique. In MPSO-SVM, MPSO is used to
discover the optimal parameters of the SVM. Two human face datasets, the SCface
dataset and the CASIA V5 face dataset, are used in the experiments to assess the
proposed MPSO-SVM in recognizing human faces. The proposed technique is compared
with PSO-SVM, OPSO-SVM and AAPSO-SVM, and the results show that the proposed
MPSO-SVM achieves higher face recognition accuracy than the other approaches. |
Keywords: |
Face recognition, SVM, PSO, Optimization, MPSO |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
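A minimal sketch of the overall PSO-SVM idea (a plain PSO stand-in, not the median-oriented MPSO variant from the paper, and the digits dataset standing in for a face dataset): particles search over (log C, log gamma) and the fitness is the cross-validated accuracy of an RBF SVM with those hyper-parameters.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X, y = X[:600], y[:600]                              # small subset to keep the sketch fast

def fitness(pos):
    C, gamma = 10.0 ** pos[0], 10.0 ** pos[1]
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

rng = np.random.default_rng(0)
n_particles, iters = 6, 5
pos = rng.uniform([-1, -5], [3, -1], size=(n_particles, 2))   # log10 C in [-1,3], log10 gamma in [-5,-1]
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()]

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()]

print("best (C, gamma):", 10.0 ** gbest, "accuracy:", round(pbest_fit.max(), 3))
```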
|
Title: |
FRACTAL METHOD FOR NON-METAMORPHIC ANIMATION USING ITERATED FUNCTION SYSTEM
ALGORITHM |
Author: |
DEWI ROSMALA, TEDJO DARMANTO, DELFIAN PUTRA CALIBRA |
Abstract: |
In this research, a fractal method is implemented for non-metamorphic animation
using the Iterated Function System (IFS) algorithm. The study aims to show how
an animation created by a fractal method with the IFS algorithm is implemented.
The design consists of a drawing stage and an animating stage. The fractal
method is used in the drawing stage, where the input values of the IFS codes are
read and processed. The coordinate points generated from the IFS code consist of
the dimensional coefficients relative to the frame and the point values, so that
the affine coefficients are obtained by computing the IFS algorithm matrix and
forming a fractal object. In the animating stage, the object obtained from the
drawing stage is processed by the non-metamorphic animation process, computing
the number of locations and the duration between the object's location points,
so that the object is seen moving from the initial location point to the final
location point. Based on the test results of the fractal method with IFS, it can
be applied to animation using a fractal object; the best results require about
10,000 iterations to form a full fractal object while keeping the iteration
process short, and the offset value search tests performed over various
iteration counts give an average offset value of 0.11735%. |
Keywords: |
Affine Coefficient, Drawing, Fractal, IFS Code, Non-Metamorphic Animation |
Source: |
Journal of Theoretical and Applied Information Technology
15th January 2019 -- Vol. 97. No. 01 -- 2019 |
Full
Text |
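An illustrative sketch of the two stages (assumptions, not the authors' application): the IFS "chaos game" with the classic Barnsley fern codes, iterated 10,000 times as in the abstract, then a non-metamorphic animation step that only translates the finished point set between frames, so the shape itself never changes.

```python
import random

# Each IFS code row is (a, b, c, d, e, f, probability) for x' = a x + b y + e, y' = c x + d y + f.
IFS_CODES = [(0.00,  0.00,  0.00, 0.16, 0.0, 0.00, 0.01),
             (0.85,  0.04, -0.04, 0.85, 0.0, 1.60, 0.85),
             (0.20, -0.26,  0.23, 0.22, 0.0, 1.60, 0.07),
             (-0.15, 0.28,  0.26, 0.24, 0.0, 0.44, 0.07)]

def draw_fractal(iterations=10000):
    x, y, points = 0.0, 0.0, []
    for _ in range(iterations):
        a, b, c, d, e, f, _p = random.choices(IFS_CODES, weights=[r[-1] for r in IFS_CODES])[0]
        x, y = a * x + b * y + e, c * x + d * y + f     # apply one affine map
        points.append((x, y))
    return points

def animate(points, start=(0.0, 0.0), end=(5.0, 0.0), frames=25):
    """Non-metamorphic animation: translate the same object from start to end."""
    for k in range(frames):
        t = k / (frames - 1)
        dx = start[0] + t * (end[0] - start[0])
        dy = start[1] + t * (end[1] - start[1])
        yield [(px + dx, py + dy) for px, py in points]

fern = draw_fractal()
print("frames generated:", sum(1 for _ in animate(fern)))
```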
|
|
|