|
Submit Paper / Call for Papers
The journal receives papers in a continuous flow and will consider articles
from a wide range of Information Technology disciplines, encompassing the most
basic research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF, or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please remember to include all your personally identifiable information in the
manuscript before submitting it for review, and we will edit out the necessary
information at our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
|
Journal of Theoretical and Applied Information Technology
November 2017 | Vol. 95 No. 22 |
Title: |
AN EMPIRICAL EVALUATION FOR THE INTRUSION DETECTION FEATURES BASED ON MACHINE
LEARNING AND FEATURE SELECTION METHODS |
Author: |
MOUHAMMD ALKASASSBEH |
Abstract: |
Despite the great developments in information technology, particularly the
Internet, computer networks, and global information exchange, and their positive
impact in all areas of daily life, these developments have also contributed to
the growth of penetration and intrusion, which poses a high risk to the security
of information in organizations and government agencies and causes large
economic losses. There are many techniques designed for protection, such as
firewalls and intrusion detection systems (IDS). An IDS is a set of software
and/or hardware techniques used to
detect hackers' activities in computer systems. Two detection approaches are
used in IDS to detect intrusive activities: anomaly detection, which flags
behavior that deviates from normal user behavior, and misuse detection, which
relies on a knowledge base containing all known attack techniques and discovers
intrusions by searching this knowledge base. Artificial intelligence techniques
have been introduced to improve the
performance of these systems. The importance of IDS is to identify unauthorized
access attempting to compromise confidentiality, integrity or availability of
the computer network. This paper investigates the Intrusion Detection (ID)
problem using three machine learning algorithms namely, BayesNet algorithm,
Multi-Layer Perceptron (MLP), and Support Vector Machine (SVM). The algorithms
are applied to a real Management Information Base (MIB) dataset collected from a
real-life environment. To enhance the accuracy of the detection process, a set
of feature selection approaches is used: InfoGain (IG), ReliefF (RF), and
Genetic Search (GS). Our experiments show that all three feature selection
methods enhance the classification performance. GS with BayesNet, MLP and SVM
gives high accuracy rates; more specifically, BayesNet with GS reaches an
accuracy rate of 99.9%. |
Keywords: |
Anomaly Detection, Attacks, DoS, SNMP, MIB, Classification, Feature Selection |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
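To illustrate the filter-style feature selection step described in the abstract above, the following is a minimal sketch of InfoGain-style feature scoring in Python. The toy traffic records, the `info_gain` helper, and the feature values are illustrative assumptions, not the authors' MIB data or code.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a label sequence."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def info_gain(feature, labels):
    """Information gain of a discrete feature with respect to the labels:
    H(labels) - sum over values v of p(v) * H(labels | feature == v)."""
    n = len(labels)
    cond = 0.0
    for v in set(feature):
        subset = [l for f, l in zip(feature, labels) if f == v]
        cond += (len(subset) / n) * entropy(subset)
    return entropy(labels) - cond

# Toy traffic records: feature 0 separates the classes, feature 1 does not.
X = [(1, 0), (1, 1), (0, 0), (0, 1)]
y = ["attack", "attack", "normal", "normal"]

scores = [info_gain([row[j] for row in X], y) for j in range(2)]
# Rank features by IG; a classifier (BayesNet, MLP, SVM) would then be
# trained only on the top-ranked features.
ranking = sorted(range(2), key=lambda j: scores[j], reverse=True)
```

ReliefF and Genetic Search follow the same pattern of scoring or searching feature subsets before classification, but with different scoring criteria.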
|
Title: |
IMAGE STEGANOGRAPHY BASED ON ODD/EVEN PIXELS DISTRIBUTION SCHEME AND TWO
PARAMETERS RANDOM FUNCTION |
Author: |
MOHAMMED MAHDI HASHIM, MOHD SHAFRY MOHD RAHIM |
Abstract: |
Presently, the massive evolution of the internet has attracted more attention
and plays an important role in the field of communication and message transfer.
Nowadays, hiding sensitive or secret information inside a trusted medium such as
an image, without being noticed by an intruder, is increasingly needed because
of privacy concerns; this method is called steganography. In this paper, a
method of hiding secret data in an image based on an odd/even pixel distribution
scheme and a two-parameter random function is introduced. The objective of this
study is to increase the imperceptibility of the proposed method with a high
payload capacity for the secret message. Two main processes are used in the
proposed method: the embedding process and the extracting process. The Huffman
coding technique is utilized to compress the secret message before the embedding
process; the security and capacity of the proposed method increase after this
preparation of the secret message. The main objective of the proposed scheme is
to increase image quality (PSNR) in the stego image. Two main things make the
method effective: first, checking the matching of secret bits with the LSB and
mapping to determine even and odd words during embedding; and second, segmenting
the secret message to track and map every bit in the stego image. Experimental
results emphasize that the proposed method can achieve high imperceptibility and
robustness. |
Keywords: |
Image Steganography, Least Significant Bit (LSB), Odd/Even pixel distribution,
Peak Signal to Noise Ratio (PSNR), Information Security |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
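The classical LSB substitution that the scheme above builds on can be sketched as follows. This is a baseline illustration only; the odd/even distribution scheme, the two-parameter random function, and the Huffman compression step from the paper are not reproduced here, and the pixel values and bit string are made up for the example.

```python
def embed_lsb(pixels, bits):
    """Classical LSB substitution: write each secret bit into the least
    significant bit of one 8-bit pixel value."""
    stego = list(pixels)
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & ~1) | b
    return stego

def extract_lsb(pixels, n_bits):
    """Read the secret back out of the first n_bits pixels."""
    return [p & 1 for p in pixels[:n_bits]]

cover = [100, 101, 102, 103, 200, 201, 202, 203]
secret = [1, 0, 1, 1]
stego = embed_lsb(cover, secret)
recovered = extract_lsb(stego, len(secret))
```

Because each pixel changes by at most 1, the stego image stays visually close to the cover, which is what keeps the PSNR high.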
|
Title: |
IT PRE-ENTREPRENEUR'S FOUNDING DECISION MAKING AND PSYCHOLOGICAL MECHANISM |
Author: |
KYUNGYOUNG OHK, JAEWON HONG |
Abstract: |
This study explored IT pre-entrepreneur’s decision making process between
psychological perception of self and behavioral intentions toward founding. For
this purpose, we investigated the influence of career development, job attitude,
and job self-esteem on founding intention. The results of the analysis are
summarized as follows. First, job attitude and job self-esteem are positively
influenced by career development; this suggests that career development can
increase both. Second, job attitude, which is influenced by career development,
has a positive effect on founding intention, but job self-esteem does not
influence founding intention. Third, career development positively influences
job self-esteem, yet job self-esteem has no effect on founding intention.
Therefore, efforts should be made to improve job attitude rather than job
self-esteem in order to activate the start-up businesses of IT pre-founders.
This study has important implications for analyzing the cognitive mechanism in
the decision-making process of IT pre-entrepreneurs, and it is expected to help
the development of entrepreneurial decision-making information systems for IT
start-ups. |
Keywords: |
IT pre-entrepreneur, Decision making, Psychological mechanism, Founding
information system, IT Start-up |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
THE RELATIONSHIP BETWEEN CELLULAR PHONE ADDICTION AND SELF ESTEEM OF ELEMENTARY
SCHOOL STUDENTS IN HIGHLY MOBILE ENVIRONMENT |
Author: |
KWANG-OK LEE, HYUN-JU CHAE |
Abstract: |
The purpose of this study was to investigate the cellular phone addiction status
and the relationship between cellular phone addiction and self-esteem of
elementary school students in today’s highly mobile environment. The study
participants were 1,173 upper grade elementary school students in Chungnam
province. Data were collected by a self-administered questionnaire from November
5 to 26, 2012. Collected data were analyzed using the IBM SPSS 20.0 program with
descriptive statistics, Pearson's correlation coefficient, the independent
t-test, one-way ANOVA and the Scheffe test. The level of cellular phone addiction in
upper grade elementary school students was generally low but high risk students
also existed. In addition, a negative correlation was observed between
self-esteem and cellular phone addiction in upper grade elementary school
students. These results suggest that various interventions to prevent and manage
the cellular phone addiction should be provided for elementary school students
in today’s highly mobile environment and interventions for high risk students of
cellular phone addiction should be provided urgently. In addition, interventions
to increase the self-esteem of upper grade elementary school students should
also be included among these interventions. |
Keywords: |
Cellular Phone, Addiction, Self-Esteem, Elementary School, Mobile Environment |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
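The correlation analysis reported above can be illustrated with a small sketch of Pearson's correlation coefficient. The addiction and self-esteem scores below are invented toy numbers, not the study's data.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson's product-moment correlation coefficient:
    cov(x, y) / (std(x) * std(y)), computed from deviations about the means."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy data: higher addiction scores paired with lower self-esteem scores,
# so r comes out strongly negative, matching the direction reported above.
addiction = [10, 20, 30, 40, 50]
self_esteem = [45, 40, 32, 28, 20]
r = pearson_r(addiction, self_esteem)
```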
|
Title: |
PROBABILISTIC MODEL OF ALLOCATION LAWS OF EXPERIMENTAL DATA IN INFORMATION
SYSTEMS |
Author: |
YURI YURIEVICH GROMOV, YURI VIKTOROVICH MININ, OLGA GENNADEVNA IVANOVA,
ALEKSANDR GEORGIEVICH DIVIN, HUDA LAFTA MAJEED |
Abstract: |
On the basis of beta distributions of the 1st and 2nd kind, probabilistic models
of distribution laws were obtained which allow a wider class of distribution
laws of experimental data to be approximated than the existing Pearson system of
distributions. A method was developed for identifying the parameters of the
generalized beta distribution using power, exponential, and logarithmic
moments. |
Keywords: |
Information System, Experimental Data, Probability, Distribution Laws,
Approximation. |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
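For the standard beta distribution on [0, 1], the classical parameter identification from power moments can be sketched as follows. This is a simplified illustration that matches only the mean and variance; the paper's generalized beta distribution and its exponential and logarithmic moments are not covered here.

```python
def beta_from_moments(mean, var):
    """Method of moments for Beta(a, b):
        mean = a / (a + b)
        var  = a*b / ((a + b)**2 * (a + b + 1))
    Solving gives a + b = mean*(1 - mean)/var - 1."""
    common = mean * (1 - mean) / var - 1  # equals a + b
    return mean * common, (1 - mean) * common

def beta_moments(a, b):
    """Mean and variance of Beta(a, b), for checking the round trip."""
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

# Round trip: Beta(2, 5) -> (mean, var) -> recovered (a, b)
m, v = beta_moments(2.0, 5.0)
a_hat, b_hat = beta_from_moments(m, v)
```

With empirical data, `mean` and `var` would be replaced by the sample mean and sample variance of the experimental observations.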
|
Title: |
DR-QPO: DISCRETE RANK BASED QUERY PATTERN OPTIMIZATION TOWARDS PARALLEL QUERY
PLANNING AND EXECUTION FOR DISTRIBUTED TRIPLE STORES |
Author: |
K.SHAILAJA, Dr. P.V. KUMAR, Dr. S.DURGA BHAVANI |
Abstract: |
This manuscript proposes and explores a novel strategy for query pattern
optimization towards parallel query planning and execution in distributed RDF
environments. The critical objective of the proposal is to optimize the query
patterns from query chains initiated to execute in parallel in a distributed RDF
environment, which distinguishes it from earlier contributions on parallel query
planning and execution strategies found in the contemporary literature. These
existing models aim to identify query patterns from a single given query chain,
which is less significant for optimizing the parallel processing of query
patterns discovered from multiple query chains submitted in parallel in a
distributed environment (such as cloud computing) to query distributed triple
stores. To this end, the Discrete Rank based Query Pattern Optimization (DR-QPO)
strategy is proposed. DR-QPO optimizes the query patterns from multiple query
chains initiated in parallel. A novel scale called the Discrete Rank Consistence
Score (DRDCS) is defined, which takes as input the order of three other metrics:
query pattern occurrence count, search space utilization, and access cost.
Experiments were conducted on the proposed model and on benchmark models found
in the contemporary literature. The results of the experimental study show that
the proposed model is significant and robust in optimizing query patterns so
that distributed query chains execute in parallel. The comparative analysis of
the results obtained from DR-QPO and other contemporary models was performed
using statistical tests such as the t-test and the Wilcoxon signed-rank test. |
Keywords: |
RDF, Query Optimising, Parallel Planning, SPARQL, DRDCS, Access Cost, Search
Space Utilization, Distributed Query Science |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
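The abstract does not give the DRDCS formula, so the following sketch only illustrates the general idea of a discrete, rank-based score over the three named metrics: occurrence count (higher is better), and search space utilization and access cost (lower is better). All pattern data and the rank-sum aggregation rule are hypothetical, not the authors' definition.

```python
def rank_patterns(patterns):
    """Illustrative discrete ranking: rank each triple pattern on each metric,
    then sum the ranks into a single score (lower total rank = better
    candidate for early parallel execution)."""
    def ranks(key, reverse):
        order = sorted(patterns, key=key, reverse=reverse)
        return {p["name"]: r for r, p in enumerate(order)}
    # occurrence count: higher is better; utilization and cost: lower is better
    r_occ = ranks(lambda p: p["occurrences"], reverse=True)
    r_ss = ranks(lambda p: p["search_space"], reverse=False)
    r_ac = ranks(lambda p: p["access_cost"], reverse=False)
    score = {p["name"]: r_occ[p["name"]] + r_ss[p["name"]] + r_ac[p["name"]]
             for p in patterns}
    return sorted(score, key=score.get)

# Hypothetical patterns shared by several parallel query chains.
patterns = [
    {"name": "?s type ?o", "occurrences": 9, "search_space": 0.2, "access_cost": 3},
    {"name": "?s knows ?o", "occurrences": 4, "search_space": 0.7, "access_cost": 8},
    {"name": "?s name ?l", "occurrences": 6, "search_space": 0.4, "access_cost": 5},
]
best_first = rank_patterns(patterns)
```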
|
Title: |
A CERTAIN INVESTIGATION ON SECURE LOCALIZATION ROUTING PROTOCOL FOR WSN |
Author: |
N. A. NATRAJ, Dr. S. BHAVANI |
Abstract: |
Wireless Sensor Network (WSN) is an advancement of wireless networking in which
nodes are deployed statically or dynamically. Secure localization is a major
challenge in WSN, where the locations of unknown nodes may not be identified. In
previous work, authors focused on secure localization but not on balancing
energy consumption. In this proposed work, a Secure Localization Routing
Protocol (SLRP) is developed to attain a balance between secure location
integrity and energy efficiency. This protocol contains three phases. In the
first phase, cluster member selection and route formation are implemented to
forward packets to the next-hop node efficiently. In the second phase, a
localization procedure is adopted based on the hop-distance value, the residual
energy of the node for location discovery, and a minimum cost function. In the
last phase, a secure localization scheme is implemented to protect the location
information of cluster members from attackers. The localization procedure is
implemented with confidentiality, using an effective cryptographic technique to
protect messages from attackers and worms. Extensive simulations are performed
over SLRP, RMSR, ECHERP and ENSOR in terms of location integrity rate, location
accuracy, location update rate, control overhead, packet delivery ratio and
packet delay. The proposed protocol SLRP produces better results than the
existing schemes. |
Keywords: |
WSN, Cluster Member Selection, Route Establishment, Localization Procedure,
Secure Localization Scheme, Minimum Cost |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
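The abstract names residual energy, hop distance, and a minimum cost function without specifying them, so the next-hop selection of the first phase can only be sketched under assumed weights. The weights `w_energy` and `w_dist`, the neighbour table, and the cost formula are illustrative assumptions, not SLRP's actual formulation.

```python
def next_hop(candidates, w_energy=0.6, w_dist=0.4):
    """Pick the neighbour that minimises an illustrative cost combining a
    low hop distance with a high residual energy, both normalised to [0, 1]."""
    max_e = max(c["residual_energy"] for c in candidates)
    max_d = max(c["hop_distance"] for c in candidates)
    def cost(c):
        # Low residual energy and high hop distance both raise the cost.
        return (w_energy * (1 - c["residual_energy"] / max_e)
                + w_dist * c["hop_distance"] / max_d)
    return min(candidates, key=cost)["node_id"]

# Hypothetical neighbour table for one forwarding decision.
neighbours = [
    {"node_id": "n1", "residual_energy": 0.9, "hop_distance": 3},
    {"node_id": "n2", "residual_energy": 0.4, "hop_distance": 1},
    {"node_id": "n3", "residual_energy": 0.8, "hop_distance": 2},
]
chosen = next_hop(neighbours)
```

With these weights, `n3` wins: it balances good residual energy against a moderate hop distance, which is the trade-off between integrity and energy efficiency that the abstract describes.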
|
Title: |
MICRO CONTROLLED AND DIRECTED ALARM SYSTEM FOR BLIND PERSON |
Author: |
Dr. RASHID A. FAYADH |
Abstract: |
Blindness is a lack of visual perception due to neurological or physiological
factors. Partial blindness represents a lack of integration in the growth of the
optic nerve or the visual center of the eye, while total blindness is defined as
the full absence of visual light perception. Deaf and blind persons frequently
struggle with the most basic activities of daily life. This suffering puts their
lives at risk when travelling among obstacles, owing to the lack, in the
author's country, of the equipment needed to assist them in avoiding or crossing
hazards. For this reason, the idea of this research is the design and
manufacture of an ultrasonic-sensor-based aid. The proposed work is a
user-friendly, cheap, and simple alarm system, implemented and designed to
support the mobility of both visually impaired and blind people in a specific
area. The work includes wearable equipment, consisting of a cummerbund, that
helps the blind person navigate safely alone by avoiding encountered obstacles,
whether mobile or fixed, to prevent any possible serious accident. The main
system component is the ultrasonic wireless sensor, which scans a predetermined
area around the blind person by emitting waves and receiving their reflections.
The echo signals reflected from barrier objects are received by an Arduino
microcontroller, which issues commands to communicate the status of the device
to a vibration unit. In this project, the sensor detects surrounding obstacles
within a designed range of 150 cm. This distance can be changed simply by
adjusting a variable resistor, so that the blind person is warned through
vibration when there is a risk. |
Keywords: |
Arduino Microcontroller, Blind Mobility, Blindness And Darkness, Emitting
Reflecting Waves, Ultrasonic Sensor |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
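The ranging logic described above is standard for ultrasonic sensors: the pulse travels to the obstacle and back, so the one-way distance is half the echo time multiplied by the speed of sound. The sketch below is a Python simulation of that logic, not the paper's Arduino firmware.

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s in air at room temperature
ALARM_RANGE_CM = 150               # designed detection range from the paper

def echo_to_distance_cm(echo_us):
    """Ultrasonic ranging: the pulse travels out and back, so
    distance = (echo time * speed of sound) / 2."""
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2

def should_vibrate(echo_us, alarm_range_cm=ALARM_RANGE_CM):
    """Trigger the vibration unit when an obstacle is inside the alarm range."""
    return echo_to_distance_cm(echo_us) <= alarm_range_cm

# An echo of ~5831 microseconds corresponds to an obstacle about 100 cm away,
# which is inside the 150 cm range and would trigger the vibration.
```

On the real device, the variable resistor mentioned in the abstract would simply change `alarm_range_cm`.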
|
Title: |
OTSU'S SEGMENTATION: REVIEW, VISUALIZATION AND ANALYSIS IN CONTEXT OF AXIAL
BRAIN MR SLICES |
Author: |
HUMERA TARIQ, ABDUL MUQEET, AQIL BURNEY, MUHAMMAD AKHTER HAMID, HUMERA AZAM |
Abstract: |
Otsu's method is a non-parametric approach to image segmentation and an
attractive alternative to the Bayes decision rule. The Nelder-Mead method has
long been used for optimization, but its use for Otsu's objective cannot be seen
in the image segmentation literature. In this paper we address this gap in a
novel way and revive the classical literature on Otsu's image segmentation by
applying it to voxel-based tissue classification, followed by volume measurement
of MRI-based subjects. The other methods used to meet this objective include
spatial filtering, skull stripping, and binarization of brain MR slices. The
"goodness" of the thresholds lies between 0.90 < η* < 0.99 for every brain MR
slice in the volume. Significant differences were found (p < 0.01 and F >> 1)
among the mean gray levels of tissues, the mean tissue volume densities within
slices of each subject, and the average volume tissue density of all ten
subjects. |
Keywords: |
MR Brain Images, Otsu’s Segmentation, Nelder-Mead Simplex Optimization, Image
Segmentation, Volume Measurement, Skull Stripping, Tissue Classification |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
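Otsu's criterion itself is simple to state: pick the threshold that maximises the between-class variance of the two resulting pixel classes. The sketch below implements it by exhaustive search over gray levels on a toy bimodal "slice"; the paper instead optimises this same objective with Nelder-Mead, which is not reproduced here.

```python
def otsu_threshold(pixels, levels=256):
    """Exhaustive Otsu: choose the threshold t that maximises the
    between-class variance  w0 * w1 * (mu0 - mu1)^2  of the two classes
    {p < t} and {p >= t}."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    n = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0        # pixel count below the threshold
    sum0 = 0.0    # gray-level mass below the threshold
    for t in range(1, levels):
        w0 += hist[t - 1]
        sum0 += (t - 1) * hist[t - 1]
        w1 = n - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0, mu1 = sum0 / w0, (total_sum - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy "slice": dark background around 30, bright tissue around 200.
pixels = [28, 30, 32, 29, 31] * 20 + [198, 200, 202, 199, 201] * 20
t = otsu_threshold(pixels)
```

The chosen threshold falls between the two modes, so thresholding at `t` separates background from tissue, which is the binarization step the abstract lists.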
|
Title: |
PERFORMANCE ENHANCEMENT OF BLIND ALGORITHMS BASED ON MAXIMUM ZERO ERROR
PROBABILITY |
Author: |
NAMYONG KIM, KIHYEON KWON |
Abstract: |
In this paper, the error-Gaussian-kernelled input of the algorithm developed by
maximization of zero-error probability of constant modulus error (MZEP-CME) is
studied for developing a method to reduce the weight perturbation of the
MZEP-CME under impulsive noise. The proposed method is to normalize the input of
MZEP-CME with the norm of the error-Gaussian-kernelled input (EGKI) in order to
reduce weight perturbation. Then the denominator of the step size can make the
algorithm unstable when it has a very small value or wide fluctuations. To
prevent these incidents, a balanced power of EGKI between the current power and
the past one is employed. This normalization with balanced power provides an
additional function of further reducing the weight perturbation in impulsive
noise environments. Simulation results show that the weight fluctuation after
convergence of the proposed algorithm is below half that of the MZEP-CME. Also,
compared with the MZEP-CME, the proposed approach lowers the steady-state MSE
(mean squared error) by about 1 dB under impulsive noise. |
Keywords: |
Impulsive Noise, Maximization Of Zero-Error Probability, Constant Modulus,
Error-Gaussian-Kernelled Input, Weight Perturbation |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
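The balanced-power normalisation described above is analogous to the smoothed power estimate used in normalised adaptive filters: blend the current input power with the past estimate, then divide the step size by the blend, so a single small or spiky sample cannot destabilise the update. The sketch below is a generic illustration of that idea with hypothetical names (`gamma`, `base_mu`), not the MZEP-CME algorithm itself.

```python
def balanced_power_step(egki_norms, base_mu=0.01, gamma=0.9, eps=1e-8):
    """Illustrative balanced-power normalisation: blend the current EGKI
    power with its running past estimate, then divide the step size by the
    blend, so one small or wildly fluctuating sample cannot blow up the
    effective step size."""
    power = egki_norms[0] ** 2
    steps = []
    for norm in egki_norms:
        # Balance between the past power estimate and the current sample power.
        power = gamma * power + (1 - gamma) * norm ** 2
        steps.append(base_mu / (power + eps))
    return steps

# A burst (an impulsive-noise-like spike) barely moves the balanced power,
# so the effective step size stays in a narrow band instead of collapsing.
norms = [1.0, 1.0, 5.0, 1.0, 1.0]
steps = balanced_power_step(norms)
```

Dividing by the raw instantaneous power instead would cut the step size by a factor of 25 at the spike; the blended estimate keeps the variation within about a factor of 3.5 here.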
|
Title: |
RESOURCES OPTIMIZATION METHODOLOGY FOR HETEROGENEOUS COMPUTING SYSTEM |
Author: |
MAHENDRA VUCHA, DVS CHANDRA BABU, ARVIND RAJAWAT, KARTHIK R |
Abstract: |
Nowadays, embedded systems are being equipped with one or more processing cores
to support parallel processing of applications and to meet design goals with
optimized resource utilization. Optimization of resource utilization can be
achieved by estimating application requirements in terms of many aspects, such
as performance, resource usage, memory usage, energy consumption, and cache
performance. Among these aspects, resource estimation is important for boosting
the execution speed of an application with optimal resource utilization, and
ever-increasing system and application complexity makes resource estimation
necessary to optimize resource utilization for an application. This paper
addresses an optimized heterogeneous computing platform called the Heterogeneous
Reconfigurable Computing System (HRCS), as well as a resource profiling
methodology to estimate the HRCS resources required for multifarious real-life
applications. The HRCS has been modeled on a single-chip Virtex-5 FPGA device
and has been equipped with multiple Reconfigurable Logic Units (RLUs) in
combination with a softcore processor as Processing Elements (PEs). |
Keywords: |
Reconfigurable Logic Unit, Heterogeneous Reconfigurable Computing System, System
on Chip, Design Space Exploration |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
METHOD AND ALGORITHM FOR DIAGNOSTIC OF EPILEPTIC EEG SIGNALS USING THE ADAPTIVE
ORTHOGONAL TRANSFORM |
Author: |
ASMA HAFDI, ABDENBI ABENAOU, AHMED TOUMANARI |
Abstract: |
Electroencephalography (EEG) is one of the most widely used techniques for
evaluating the functional status of the brain, and it is essential for the
diagnosis of diseases such as epilepsy, a pathology that results from cerebral
dysfunction. The diagnosis of this pathology consists of detecting the
appearance of paroxysmal activities in the EEG signals, and the diagnosis of
epilepsy from EEG plays a crucial role in Computer Aided Diagnosis (CAD)
systems. In this article, we suggest an approach based on the adaptive
orthogonal transformation theory, which makes it possible to extract the
informative features of EEG signals. The vectors of informative features
obtained by this method are very short, which improves the quality of signal
analysis and increases the certainty of diagnosis. |
Keywords: |
Adaptive Orthogonal Transformation, Basis Functions, Extraction of the
Informative Features |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
MULTISET CONTROLLED GRAMMARS |
Author: |
SALBIAH ASHAARI, SHERZOD TURAEV, M. IZZUDDIN M. TAMRIN |
Abstract: |
This study focuses on defining a new variant of regulated grammars, called
multiset controlled grammars, and on investigating their computational power. In
general, a multiset controlled grammar is a grammar equipped with an arithmetic
expression over multisets of terminals, where a multiset is assigned to every
production in the grammar, representing the number of occurrences of terminals
on the right-hand side of that production. A derivation in the grammar is then
said to be successful if and only if its multiset value satisfies a certain
relational condition. In this study, we have found that control by multisets is
a powerful and yet simple method for regulating generative processes in
grammars. We have shown that multiset controlled grammars are at least as
powerful as additive valence grammars, and at most as powerful as matrix
grammars. |
Keywords: |
Multisets, Context-free Grammars, Regulated Grammars, Generative Capacity. |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
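The control mechanism can be sketched directly from the definition above: each production carries a multiset counting the terminal occurrences on its right-hand side, the multisets of the applied productions are summed, and the derivation succeeds if and only if the sum satisfies the relational condition. The toy grammar and the equal-counts condition below are illustrative choices, not an example from the paper.

```python
from collections import Counter

# Toy productions with the multiset of terminals on their right-hand sides.
productions = {
    "p1": Counter({"a": 1}),   # e.g. S -> a S
    "p2": Counter({"b": 1}),   # e.g. S -> b S
    "p3": Counter(),           # e.g. S -> epsilon
}

def derivation_value(applied):
    """Sum the terminal multisets of the productions used in a derivation."""
    total = Counter()
    for p in applied:
        total += productions[p]
    return total

def successful(applied):
    """Example relational condition: the derivation is successful iff it
    produced exactly as many a's as b's, imposing a counting constraint on
    top of an otherwise simple grammar skeleton."""
    v = derivation_value(applied)
    return v["a"] == v["b"]

ok = successful(["p1", "p2", "p1", "p2", "p3"])   # 2 a's, 2 b's: succeeds
bad = successful(["p1", "p1", "p2", "p3"])        # 2 a's, 1 b: fails
```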
|
Title: |
UNDERSTANDING VISITOR BEHAVIOR ON SOCIAL MEDIA USAGE IN INDONESIA'S MUSEUMS |
Author: |
ARTA MORO SUNDJAJA, FORD LUMBAN GAOL, SRI BRAMANTORO ABDINAGORO, BAHTIAR S.
ABBAS |
Abstract: |
The purpose of this study was to understand the behavior of visitors to museums
and the adoption of social media technologies by museums in Indonesia. The
growing presence of social media offers branding and marketing opportunities to
non-profit organizations that have limited resources but demand high impact. The
museum industry in Indonesia needs to preserve and communicate its culture to
the public with limited financial and human resources, and social media has
proven able to solve this problem. The data were collected from a sample of 63
respondents using a questionnaire and were analyzed using descriptive analysis.
The results show that most of the social media users are of productive age, have
at least an undergraduate education, and mostly live in Java. They do not visit
museums frequently, and their visiting purposes are to learn or study about
Indonesian culture as prompted by their school, to relax, out of personal
desire, or by accident. They learn about museums from search engines, offline
channels, and social media. Most museums in Indonesia give their visitors a bad
experience, and only some offer a positive one. The limited number of social
media users in Indonesia who are already aware of a museum's presence on social
media use it to get information about the museum (profile, exhibitions,
collections, and news), to socialize with other social media members, and to
show their existence in society. They suggest that museum management should
update their social media frequently. This study examines the characteristics of
museum Facebook Fan Page users (age group, gender, residence location, and
education level) and their behavior in using museum Facebook Fan Pages. Since
there is no research in this field in Indonesia, the findings can be used by
museum managers for managing their social media for marketing and community
education. |
Keywords: |
Online Visitor Behavior, Museum Social Media, Edutainment Marketing, Online
Visitor Expectation, Indonesia Museum. |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
A NOVEL SWARM INTELLIGENCE BASED ENERGY EFFICIENT PROTOCOL FOR WIRELESS SENSOR
NETWORKS |
Author: |
SUPREET KAUR, RAJIV MAHAJAN |
Abstract: |
Energy efficiency has recently turned out to be a primary issue in wireless
sensor networks. Sensor nodes are battery powered and therefore die after a
certain period of time. Thus, improving data dissemination in an
energy-efficient way becomes a more challenging problem when trying to improve
the lifetime of sensor devices. Clustering and tree-based data aggregation in
sensor networks can enhance the network lifetime of wireless sensor networks. A
hybrid Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) based
energy-efficient clustering and tree-based routing protocol is proposed.
Initially, clusters are formed on the basis of remaining energy; then, hybrid
ACO-PSO based data aggregation comes into action to further improve
inter-cluster data aggregation. Extensive analysis demonstrates that the
proposed protocol considerably enhances network lifetime over other
techniques. |
Keywords: |
Wireless Sensor Networks, Ant Colony Optimization, Energy Efficient, Particle
Swarm Optimization |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
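The energy-based cluster formation of the first stage can be illustrated with a minimal sketch: elect the nodes with the most remaining energy as cluster heads and attach every other node to its nearest head. The node table is invented, and the ACO-PSO aggregation stage is not modelled here.

```python
from math import dist

def form_clusters(nodes, n_heads):
    """Elect the n_heads nodes with the most residual energy as cluster
    heads, then attach every remaining node to its nearest head."""
    heads = sorted(nodes, key=lambda n: n["energy"], reverse=True)[:n_heads]
    clusters = {h["id"]: [] for h in heads}
    for node in nodes:
        if any(h["id"] == node["id"] for h in heads):
            continue
        nearest = min(heads, key=lambda h: dist(h["pos"], node["pos"]))
        clusters[nearest["id"]].append(node["id"])
    return clusters

# Hypothetical sensor field: two well-separated groups of nodes.
nodes = [
    {"id": "a", "energy": 0.9, "pos": (0, 0)},
    {"id": "b", "energy": 0.2, "pos": (1, 0)},
    {"id": "c", "energy": 0.8, "pos": (10, 10)},
    {"id": "d", "energy": 0.3, "pos": (9, 10)},
]
clusters = form_clusters(nodes, n_heads=2)
```

Rotating the head role as energies drain is what spreads the battery load over time; the swarm-based stage would then optimise the routes between these clusters.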
|
Title: |
EEG BASED USER IDENTIFICATION METHODS USING TWO SEPARATE SETS OF FEATURES BASED
ON DCT AND WAVELET |
Author: |
HEND A. HADI, DR. LOAY E. GEORGE |
Abstract: |
Brain activities may be represented by EEG signals, which are sets of
measurements taken using electrodes along the scalp; they are secret, sensitive,
and hard to steal or reproduce, and they hold great potential to provide a
robust and secure biometric system for user identification and verification.
This paper presents a comparison between our previously proposed feature set,
based on partitioned Fourier spectra, and some new feature sets proposed in this
work, established as simple, fast, and promising sets of features for an
EEG-based identification system. The first introduced feature set is based on
the energy distribution of the DCT AC components, and the second set comprises
the statistical moments of three types of wavelet transforms. Each set of
features is tested using normalized distance measures in the matching stage.
Each proposed method was tested on the publicly available EEG CSU dataset,
collected from seven healthy volunteers, and on the publicly available EEG Motor
Movement/Imagery dataset, a relatively large dataset collected from 109 healthy
subjects. The attained identification results are encouraging, with a best
recognition rate of 100% for all proposed methods on both datasets. All tested
feature sets were extracted under the condition adopted in our previous work,
namely that "they should be extracted from EEG data belonging to a single task
and a single channel". All achieved results are competitive when compared with
the results of other recently published works. The adopted condition reduces the
computational complexity and thus the required processing time. Also, the
wavelet-based methods have lower computational complexity than DFT and DCT, with
recognition rates that are competitive with them. |
Keywords: |
Signal Processing, Wavelet Transforms, DCT, Energy Features, And Statistical
Moments |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
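The first feature set can be illustrated with a small sketch: compute a DCT, drop the DC coefficient, and use the energy of each band of AC coefficients as one feature. The plain DCT-II, the equal band split, and the toy signal below are illustrative simplifications, not the authors' exact partitioning.

```python
from math import cos, pi

def dct2(signal):
    """Plain (unnormalised) DCT-II of a 1-D signal."""
    n = len(signal)
    return [sum(x * cos(pi / n * (i + 0.5) * k) for i, x in enumerate(signal))
            for k in range(n)]

def ac_band_energies(signal, n_bands=4):
    """Drop the DC coefficient, split the AC coefficients into equal bands,
    and use each band's energy as one feature."""
    ac = dct2(signal)[1:]
    size = len(ac) // n_bands
    return [sum(c * c for c in ac[b * size:(b + 1) * size])
            for b in range(n_bands)]

# A toy "EEG channel": one slow oscillation sampled at 17 points, giving
# 16 AC coefficients that split evenly into 4 bands.
samples = [cos(2 * pi * t / 17) for t in range(17)]
features = ac_band_energies(samples, n_bands=4)
```

Because the toy signal is a slow oscillation, its energy concentrates in the lowest-frequency AC band, which is exactly the kind of energy-distribution signature such features capture.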
|
Title: |
TRACKING COMMUNITY EVOLUTION IN SOCIAL NETWORKS |
Author: |
LOUBNA BOUJLALEB, ALI IDARROU, DRISS MAMMASS |
Abstract: |
Recently, social network analysis has been gaining importance and bringing
several challenges to the computer science discipline. Most social networks are
dynamic and evolve gradually, and the communities in these dynamic networks
usually have changing members and can grow and shrink over time. The analysis of
communities and their evolution is a relevant research domain that attracts
researchers from a variety of fields; with suitable information and methods for
dynamic analysis, one may attempt to forecast the future of the communities, and
then steer it appropriately in order to attain or modify this predicted future
according to precise requirements. This capability would be a strong mechanism
for marketing, human resource managers, personnel recruitment, etc. In this
paper, we analyze the changes in a dynamic network by tracking and examining the
evolution of communities within a sequence of snapshots. We start by describing
some basic dynamic features of social networks. Then, we propose a new technique
called CED (Community Evolution Detection), developed in order to detect
community evolution in social networks. The central elements of this technique
are that it greatly depends on key nodes and on a QuantityInsertion metric, and
that it focuses on both efficiency and being parameter-free. We demonstrate the
abilities and potential of our approach by testing it on real datasets and
comparing it with a well-known algorithm with regard to complexity, accuracy and
flexibility. |
Keywords: |
Community Evolution, Dynamic Network Analysis, Dynamic Social Network,
Evolutionary Analysis, Community Dynamics |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
NANOTECHNOLOGY THEORY USED FOR SIMULATION OF EMERGING BIG DATA SYSTEMS ON HIGH
PERFORMANCE COMPUTING: A CONCEPTUAL FRAMEWORK |
Author: |
NORMA ALIAS, MAI HAFIZAH MUSA, VICTOR RIYADH SERGEY, NORHAFIZA HAMZAH, WALEED
MUGAHED AL-RAHMI |
Abstract: |
The implications of big data analytics for current trends in nanotechnology
theory, modeling, and simulation are becoming pressing issues. The potential
applications of nanotechnology in the industrial sector, and the identification
and prioritization of research across this emerging technology, motivate
integrating the conceptual framework of nanotechnology with big data system
development. This paper presents six variations to meet the contexts of a
conceptual framework for modeling complex systems involving nanotechnology
theory, modeling, and large-scale numerical simulation of real-world problems.
Integrated mathematical modeling and large-scale numerical simulations are the
tools for solving these complex systems. The conceptual framework is a
comprehensive concept covering theory, ordinary differential equation (ODE) or
partial differential equation (PDE) modeling, and simulation based on high
performance computing (HPC). The main objective is to improve the heavy
computation involved in big data modeling and to increase the performance of
parallel programming on an HPC platform. The framework organizes the ideas and
steps to be considered for integrating theory, mathematical modeling with fast
numerical simulation, a specific parallel computing strategy, communication
software, and an HPC hardware system, which are applicable to solving
large-scale nanotechnology applications. |
Keywords: |
Nanotechnology Theory, Big Data, High Performance Computing |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
IMPACT OF LEAN SOFTWARE DEVELOPMENT INTO AGILE PROCESS MODEL WITH INTEGRATION
TESTING PRIOR TO UNIT TESTING |
Author: |
SHAIK MOHAMMAD SHAHABUDDIN, DR.PRASANTH YALLA |
Abstract: |
The current academic thinking on integration testing prior to unit testing using
agile methodology shows that it is an innovative approach, little understood and
practiced formally. However, according to Brown et al., this approach
contributes to economic governance, disciplined delivery, and measured
improvement for achieving agility at scale in the software industry. This
motivated us to investigate and propose a conceptual model and to make an
empirical study in our previous work. In this paper, we reinforce that study
with a case-study-based approach and quantify the real benefits of the new
cultural shift in testing known as integration testing prior to unit testing. In
addition, we studied lean software development in terms of testing and
integrated it with the phenomenon of integration testing prior to unit testing.
We identified many aspects of lean principles; nevertheless, we found that mind
mapping and the identification of infeasible test cases are two important
aspects. They are associated with lean principles, such as the removal of waste,
to further improve productivity in agile and lean software development
environments. The empirical results revealed that productivity increases with
this paradigm shift in the testing arena. The quantification of benefits in
terms of productivity shows significant performance differences between the
traditional approach and the leagile (a new term referring to lean and agile)
approach in software testing. |
Keywords: |
Lean Software Development, Agile Process Model, Software Testing, Integration
Testing Prior To Unit Testing |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
MOTION ESTIMATION AND CONVOLUTIONAL CODING FOR VIDEO STREAMING OVER WIRELESS
CHANNELS |
Author: |
SALAH A. ALIESAWI, SALAH S. MUSTAFA, SAEED A. GATHBAN |
Abstract: |
In this paper, a source and channel coding scheme is proposed for lossy video
transmission over wireless channels. Source coding is used to reduce
redundancy so that the bit streams/video can be stored or sent efficiently
over a network. Block matching (BM) with the Quintet Search (QS) algorithm for
motion estimation (ME) is used as the source coding or video compression
technique. In the new QS algorithm, the number of search points is reduced to
five, instead of the eight points in the standard algorithm. The
channel/convolutional coding (CC) adds useful redundancy to combat the channel
effects of wireless environments and improve system performance. The
compression results for different search algorithms and different settings of
the H.261 encoder parameters are computed for several videos of 176×144
dimension, in order to select the proper settings that can be applied to input
video streams transmitted over wireless links. The results show that the
proposed system can strike a balance between compression performance and
preserving video quality in video conferencing applications. Further, the
performance of the coded systems over additive white Gaussian noise (AWGN)
channels is also computed and compared with the system without channel
coding. |
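The five-point search in the abstract can be sketched as a logarithmic block-matching loop: evaluate the centre plus four neighbours at the current step size, move to the cheapest point, and halve the step when the centre wins. This is a minimal illustration under assumptions; the paper's exact quintet pattern and cost function are not specified in the abstract, so the axial layout and sum-of-absolute-differences cost below are placeholders.

```python
import numpy as np

def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized blocks."""
    return np.abs(block_a.astype(int) - block_b.astype(int)).sum()

def quintet_search(ref, cur, top, left, block=16, max_step=7):
    """Estimate the motion vector of one block with a five-point search.

    The pattern (centre + 4 axial neighbours) is an assumption; only five
    candidates are costed per iteration instead of the eight of a
    standard search pattern.
    """
    target = cur[top:top + block, left:left + block]
    cy, cx = top, left
    step = max_step
    while step >= 1:
        best, best_pos = None, (cy, cx)
        for dy, dx in [(0, 0), (-step, 0), (step, 0), (0, -step), (0, step)]:
            y, x = cy + dy, cx + dx
            if 0 <= y <= ref.shape[0] - block and 0 <= x <= ref.shape[1] - block:
                cost = sad(target, ref[y:y + block, x:x + block])
                if best is None or cost < best:
                    best, best_pos = cost, (y, x)
        if best_pos == (cy, cx):
            step //= 2          # centre won: refine with a smaller step
        cy, cx = best_pos
    return cy - top, cx - left  # motion vector (dy, dx)
```

Reducing the candidate set from eight points to five cuts the per-iteration cost of motion estimation, which is the compression-speed trade-off the abstract refers to.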
Keywords: |
Block-Matching, Motion Estimation, Convolutional Coding, Video Transmission. |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
AN EFFICIENT ALGORITHM FOR DATA CLEANSING |
Author: |
Saleh Rehiel Alenazi, Kamsuriah Ahmad |
Abstract: |
Data collected from different sources may be redundant and duplicated, and
need to be cleaned before they can be used for further processing. The data
should undergo a detection process for any occurrence of duplication in the
datasets. Two strategies are used to identify duplicates: windowing and
blocking. The aims of this paper are to review, analyze, and compare
algorithms in order to find the most efficient one in terms of better accuracy
and fewer comparisons. A comparison was made among the five most popular
algorithms: DYSNI, PSNM, Dedup, InnWin, and DCS++. Two benchmark datasets were
used for the experiment, Restaurant and Cora. The results reveal that the
DYSNI algorithm gives high accuracy with respect to the number of comparisons
on both datasets. It is hoped that the results obtained from this study
provide a sound review and comparison of the existing algorithms for producing
high-quality data, and serve as guidance for implementing a better data
storage system. |
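The windowing strategy the abstract contrasts with blocking is the classic sorted neighborhood method: sort records on a key, then compare only records that fall within a fixed-size sliding window. A minimal sketch, with an illustrative `name` field and exact-match similarity test that are assumptions, not details from the paper:

```python
def sorted_neighborhood(records, key, window=3):
    """Windowing-based duplicate detection (sorted neighborhood method).

    `records` is a list of dicts; `key(record)` builds the sorting key.
    Only records within `window` positions of each other are compared,
    which is what keeps the number of comparisons far below the
    all-pairs count.
    """
    ordered = sorted(records, key=key)
    pairs = []
    for i, rec in enumerate(ordered):
        for j in range(i + 1, min(i + window, len(ordered))):
            if similar(rec, ordered[j]):
                pairs.append((rec, ordered[j]))
    return pairs

def similar(a, b):
    """Toy similarity test: exact match on a normalized name field.
    A real cleanser would use an approximate string distance here."""
    return a["name"].lower().strip() == b["name"].lower().strip()
```

For n records the window bounds the comparisons at roughly n·(window−1) instead of n·(n−1)/2, which is the accuracy-versus-comparisons trade-off the paper evaluates.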
Keywords: |
Data Cleansing, Record Deduplication, Deduplication Algorithm, Windowing-based,
Efficiency, Accuracy, Data Quality |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
SAGG: A NOVEL LINKED DATA VISUALIZATION APPROACH |
Author: |
NOEMI SCARPATO, GIANFRANCO ALESSIO |
Abstract: |
In this paper, we describe the Semi-automatic GUI Generator (SAGG), a
knowledge-based visualization system able to create GUIs in a semi-automatic
way. Since linked data were introduced, their expressiveness has made it
possible to provide users with a wealth of useful information; visualizing
this information is a crucial issue in realizing the semantic web principles.
The objective of our approach is the induction of tailored GUIs able to show
the considered linked data in the best way. The key idea behind our approach
is the exploitation of existing web pages to deduce visualization patterns for
linked data. Our intent is to provide common users with a semi-automatic GUI
generator: the system is able to visualize linked data without requiring those
users to know semantic technologies for data visualization. In the following,
we present an introduction to linked data visualization systems and our
solution to the linked data visualization issue: the SAGG system, its
architecture, the algorithms implemented to realize it, and its user
interaction mechanisms. |
Keywords: |
Linked Data Visualization, Semantic Web, Information Visualization, Knowledge
Visualization, OWL, SPARQL |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
HYBRID GENETIC VARIABLE NEIGHBORHOOD SEARCH BASED JOB SCHEDULING WITH
DUPLICATION FOR CLOUD DATA CENTERS |
Author: |
RACHHPAL SINGH, KARANJIT SINGH KAHLON, GURVINDER SINGH |
Abstract: |
Background/Objectives: Scheduling is one of the important ways to provide high
availability of processors to cloud users. The majority of scheduling problems
are NP-hard; therefore, meta-heuristic techniques are required to schedule
jobs on virtual machines (VMs). Meta-heuristic techniques usually suffer from
inter-processor communication issues as well as premature convergence short of
the global optimum. Methods: To handle these issues, a hybrid scheduling
technique is proposed using a Genetic Algorithm (GA) and Variable Neighborhood
Search (VNS) with Task Duplication (TD). The proposed technique can thus
reduce the inter-processor scheduling overheads among high-end servers.
Results: To attain the objectives of the proposed approach, a cloud-based
model was designed by considering the well-known Fast Fourier Transformation
(FFT) problem expressed as a Directed Acyclic Graph (DAG). A simulation
environment was designed to implement the proposed technique. Extensive
experiments have shown that the proposed technique outperforms the available
techniques regarding makespan, speedup, and efficiency. Conclusion: From a
comparative analysis of the existing and proposed scheduling techniques, it
has been found that the mean reduction in makespan is 7.07%. The comparative
studies have demonstrated that the mean improvement of the proposed technique
over other techniques concerning efficiency is 0.031%. |
Keywords: |
Cloud Environment, Task Duplication, Variable Neighborhood Search, Genetic
Algorithm, Directed Acyclic Graph |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
DETECTION AND CLASSIFICATION OF POWER TRANSFORMER FAULTS USING FFA BASED RNN
TECHNIQUE |
Author: |
P.LAKSHMI SUPRIYA, P.SUJATHA |
Abstract: |
In this paper, an intelligent technique is proposed for diagnosing internal
fault conditions in power transformers. The proposed technique is a composite
of the wavelet transform and an RNN combined with FFA-based optimization.
Initially, the normal signals are analyzed at a particular time instant; after
that, the proposed technique investigates whether any fault has occurred in
the power transformer. Using the proposed technique, the current signals of
the power transformer are monitored and faults detected. First, the MWT is
utilized to extract the features of the signal. In the wavelet transform, the
feature approximation of the signal depends on the decomposition levels of the
high and low frequency components. The extracted features are applied to the
input of the FFA, which selects the optimized training dataset for training
the RNN. The RNN testing process then evaluates the signal and classifies the
fault signal type. The effectiveness of the proposed technique is evaluated
using statistical measures such as accuracy, sensitivity, and specificity. The
proposed method is implemented on the MATLAB/Simulink platform and compared
with existing techniques. |
Keywords: |
Power Transformer, Fault Detection and Classification, Multi-Wavelet Transforms
(MWT), Recurrent Neural Network (RNN) and Firefly Algorithm (FFA) |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
THE IMPACT OF SOCIAL NETWORKS ON INDIVIDUAL’S BEHAVIORAL CHANGE IN KINGDOM OF
BAHRAIN |
Author: |
JAFLAH ALAMMARY |
Abstract: |
The growth of Social Networks in Arab countries has played a crucial role in
civil mobilization, empowerment, shaping opinion, and influencing change. The
popularity of Social Network tools has continued to grow; however, the extent
to which communication via Social Networks can lead to social or individual
change is still debatable in these countries. The present study proposed and
tested a holistic model that extends the Theory of Planned Behavior (TPB) and
redefines behavioral intention toward using technology as behavioral change
resulting from the use of new technology, by exploring the impact of Social
Networks on individual behavioral change. To achieve the objectives of the
current research, a quantitative research method was adopted: three hundred
surveys were distributed to residents of the different governorates in the
Kingdom of Bahrain. The research indicates that action needs to be taken by
decision makers and governance bodies in Arab countries towards Social
Networks, which have essentially become a weapon of mass persuasion. This
weapon can be used by unconcerned individuals to change society socially,
politically, or even economically. The impact of Social Networks on changing
behavior should be considered strategically, and more attention needs to be
paid to controlling the information and material exchanged on Social Networks
so as to redirect their impact positively. |
Keywords: |
Social Networking, Behavioral Change, Perceived Privacy, Trust, Kingdom Of
Bahrain |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
NEAREST DRIVER-FIFO COMBINATION MODEL IN ONLINE MOTORCYCLE TAXI DISPATCH SYSTEM |
Author: |
PURBA DARU KUSUMA |
Abstract: |
The dispatch system is a critical aspect of an online motorcycle taxi system,
affecting both customer satisfaction and driver productivity. In online taxi
systems, the nearest driver method is applied in most dispatch systems, while
the FIFO method is applied at most taxi stands, such as airports and railway
stations. In this paper, we propose a new dispatch system that combines the
nearest driver method and the FIFO method. There are three combination
methods. In the first, both the FIFO and nearest driver factors are weighted
and then summed; the pickup request is allocated to the driver with the
highest value of the summation. In the second, the pickup request is allocated
to the available driver with the highest idle time in a certain broadcast
area; if there is more than one such driver, the request is allocated to the
nearest driver among those with the same idle time. In the third, both the
driver's idle time and the driver-passenger distance are divided into classes,
each class is scored, and the driver's idle time score is summed with the
driver-passenger distance score; the pickup request is allocated to the driver
with the highest score. Based on the simulation results, when the
driver-passenger distance factor is dominant, the weighted nearest driver-FIFO
method is better than the existing nearest driver method. When the broadcast
range is 0.5 kilometers, the serial FIFO-nearest driver method is better than
the nearest driver method. |
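The first combination method (weight both factors, sum, pick the maximum) can be sketched in a few lines. The normalization of idle time and distance to [0, 1] below is an assumption made for illustration; the abstract only states that the two methods are weighted and summed, and the field names `idle` and `dist` are placeholders:

```python
def dispatch(drivers, weight_fifo=0.5, weight_near=0.5):
    """Weighted nearest driver-FIFO dispatch (the paper's first method).

    Each driver dict carries `idle` (seconds waited, the FIFO factor)
    and `dist` (km to the passenger, the nearest-driver factor). Both
    are normalized to [0, 1], weighted, and summed; the request goes to
    the driver with the highest combined score.
    """
    max_idle = max(d["idle"] for d in drivers) or 1
    max_dist = max(d["dist"] for d in drivers) or 1

    def score(d):
        fifo = d["idle"] / max_idle        # longer wait -> higher score
        near = 1 - d["dist"] / max_dist    # shorter distance -> higher score
        return weight_fifo * fifo + weight_near * near

    return max(drivers, key=score)
```

Shifting `weight_near` above `weight_fifo` reproduces the "distance factor is dominant" regime in which the abstract reports the combined method beating plain nearest-driver dispatch.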
Keywords: |
Motorcycle Taxi, Dispatching System, FIFO, Nearest Driver |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
NEW TEXT STEGANOGRAPHY TECHNIQUE BASED ON A SET OF TWO-LETTER WORDS |
Author: |
SALWA SHAKIR BAAWI, MOHD ROSMADI MOKHTAR, ROSSILAWATI SULAIMAN |
Abstract: |
Steganography is secret writing in which one person communicates with another
without drawing suspicion to the secret communication through the medium. Text
is regarded as the most difficult carrier in which to conceal secret data,
because of its insufficient redundant information compared to image, audio, or
video files. In this paper, we propose a new method for concealing information
in English writing using non-printing characters, such as the zero width
non-joiner (ZWNJ) and zero width joiner (ZWJ). The approach applies text
steganography to text files: secret information is embedded inside the English
script using two-letter words, based on their locations. Results show that the
technique satisfies perceptual transparency and improves information hiding
capacity in the cover file, compared with two previously developed methods.
However, the size of the cover/stego document increases by approximately
22.61% from the original size. |
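The core mechanism, hiding bits in invisible Unicode characters attached to two-letter words, can be illustrated briefly. This is a simplified sketch: the paper's actual placement rule based on word locations is more elaborate, and appending one zero-width character per two-letter word is an assumption for demonstration only:

```python
ZWNJ, ZWJ = "\u200c", "\u200d"   # zero-width non-joiner / joiner

def embed(cover, bits):
    """Hide a bit string by appending ZWNJ (bit 0) or ZWJ (bit 1)
    after each two-letter word in the cover text."""
    out, i = [], 0
    for word in cover.split(" "):
        if i < len(bits) and len(word) == 2 and word.isalpha():
            word += ZWJ if bits[i] == "1" else ZWNJ
            i += 1
        out.append(word)
    if i < len(bits):
        raise ValueError("cover text has too few two-letter words")
    return " ".join(out)

def extract(stego):
    """Recover the hidden bit string from the zero-width characters."""
    bits = []
    for word in stego.split(" "):
        if word.endswith(ZWJ):
            bits.append("1")
        elif word.endswith(ZWNJ):
            bits.append("0")
    return "".join(bits)
```

Because ZWNJ and ZWJ render with zero width, the stego text looks identical to the cover text, which is the perceptual transparency property; the extra characters are also what make the stego file larger than the cover, as the abstract's 22.61% figure reflects.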
Keywords: |
Data Security, Information Hiding, Text Steganography, Carrier File, Unicode |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
DETECTION OF BREAST CANCER IN MAMMOGRAMS THROUGH A NEW FEATURES AND DECISION
TREE BASED, CLASSIFICATION FRAMEWORK |
Author: |
ANWAR YAHYA EBRAHIM |
Abstract: |
This research proposes a new framework for the detection of breast cancer.
Currently, mammography is the primary tool for early detection and diagnosis.
The use of computer systems to assist clinicians in digital mammography image
screening has advantages over traditional methods: computer-aided techniques
can enhance the appearance of mammogram images, highlight suspicious areas,
and extract dynamic features to distinguish between benign and malignant
mammograms. Although great efforts have been made to develop effective
methods, their performance, especially in terms of accuracy, falls short due
to poor image resolution, noise, and the very subtle distinction between
cancerous and non-cancerous tumours. Thus, this study presents an automatic
classification scheme to classify breast cancer into normal, benign, and
malignant, covering background detection, image enhancement, pectoral muscle
separation, feature selection, and classification. To this aim, the framework
uses a set of techniques. In the first step, the mammogram is enhanced to
improve image quality; a new method is then used for feature extraction. A new
method named Weighted Sparse Principal Component Analysis (WSPCA) is applied
to select the distinctive features of the mammogram images. The analysed
mammogram images are then identified as benign or malignant through a decision
tree, comparing the performance of the Decision Tree with Support Vector
Machine (SVM) and Bayesian classification on the MIAS dataset. The Decision
Tree classifier is chosen to classify the mammograms using the above features
as its input. The evaluations are carried out on the entire Mammography Image
Analysis Society (MIAS) standard dataset. The proposed framework, tested on
the MIAS dataset, achieved an overall accuracy of 90% with the Decision Tree
classifier and an accuracy of 97.8% using WSPCA features with the Decision
Tree classifier for sequential selection of benign versus malignant
mammograms. The suggested method achieves good results when verified on
various mammograms. |
Keywords: |
Breast Cancer, Mammograms, Feature Extraction, Weighted Features,
Classification Techniques |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
SOCIAL DAMAGE COST ESTIMATION MODEL FOR MOBILE DIGITAL DIVIDE |
Author: |
GYOO GUN LIM, JOONGHO SEOL |
Abstract: |
Previous studies on the digital divide have mainly consisted of comparative
studies among groups according to education, age, occupation, and region.
However, the digital divide can be expected to vary depending on the
individual situation. Therefore, this study suggests a model that calculates
the social damage cost of the mobile divide at the individual level, based on
individual cognitive value. Using this model, we conduct empirical
verification by social damage type. For the empirical analysis of this study,
800 questionnaires were collected. Based on the collected data, the social
damage cost caused by the individual mobile digital divide was estimated at
about 120,000 KRW per person, and a social cost of about 5.6 trillion KRW was
calculated by applying this figure to the population of South Korea. In the
detailed analysis, males felt more harm than females, and people in their 40s
or older felt more vulnerable than those in their 30s or younger. Also,
residents of non-metropolitan areas felt more damage than residents of
metropolitan areas. In addition, the social damage analysis showed that the
mobile digital divide was more significant in social participation activities,
family issues and interpersonal relationships, and medical services. The
results of this study are expected to contribute to the study of the mobile
digital divide and to the development of meaningful policies. |
Keywords: |
Mobile Digital Divide; Digital Divide; Mobile Divide; Social Damage Cost; Cost
Estimation |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
INFORMATION MODEL TO SUPPORT SUSTAINABLE PROCUREMENT |
Author: |
EMELIA AKASHAH P. AKHIR, ROBERT T. HUGHES, KARL COX |
Abstract: |
Sustainable practices that raise awareness of environmental issues among
people in industry need to be implemented. However, the information that needs
to be considered to make an informed decision on sustainable procurement is
located in different places and comes in different forms, either internal or
external to the organization. Thus, there is a strong need for this
information to be interrelated and gathered in one place for easy access. The
aim of this paper is to consider all of these concerns and develop an
information model to support sustainable procurement. It is hoped that this
information model can guide buyers in making the most informed decisions. |
Keywords: |
Sustainable Procurement, Information Model, Knowledge Management, Universities,
Informed Decision Making |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
CUSTOMER ENGAGEMENT IN SOCIAL COMMERCE: A THEORETICAL REVIEW |
Author: |
ABDELSALAM H H BUSALIM, AB RAZAK CHE HUSSIN, AHMAD FADHIL YUSOF |
Abstract: |
Social commerce (s-commerce) has changed both businesses and customers. The
role of the customer has changed dramatically with the rapid growth of
s-commerce. Customer engagement behavior in the s-commerce context has become
a key competitive advantage for companies that aim to build a customer-centric
business and utilize the power of social media. Numerous studies have been
conducted to understand the customer's behavior in engaging with s-commerce,
but little effort has been made to consolidate the previous studies into a
theoretical foundation of customer engagement in the s-commerce context. The
aim of this study is to identify the well-utilized theories and factors that
influence customer engagement behavior in s-commerce. The study used a
Systematic Literature Review (SLR) as the method to identify and analyze the
theories and factors related to customer engagement. The results show that
the theories constituting the foundation of customer engagement studies can be
classified into social-related theories, technology-related theories,
behavioral theories, and motivational theories, and that most of the
well-utilized factors fall under these classifications. This study provides a
comprehensive view of the theoretical foundation of customer engagement
studies in s-commerce, and provides a theoretical basis for IS research
towards the development of empirical research on customer engagement and
s-commerce. |
Keywords: |
Social Commerce, Social Media, Customer Engagement Behavior, Systematic
Literature Review |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
A MOBILE BASED APPLICATION ON ENVIRONMENTAL EDUCATION FOR PRIMARY SCHOOLCHILDREN
IN MALAYSIA |
Author: |
K.S. SAVITA, NUR AIN ZAINUDDIN, MANORANJITHAM MUNIANDY, MAZLINA MEHAT |
Abstract: |
Ever-rising environmental issues from industrial pollution and daily life
activities are affecting our environment and bringing negative implications
globally. The Malaysian government has taken steps to overcome these problems
through the implementation of the 11th Malaysia Plan, with a focus on creating
a green environment and a sustainable nation. To do so, the Malaysian
government needs the cooperation and support of society. An environmentally
caring society can only be established if environmentally caring habits are
cultivated at an early age, and one of the methods is to nurture awareness of
environmentally friendly activities through early education. However,
environmental education is still limited in the current syllabus of primary
schools in Malaysia. Environmental content is included in language textbooks,
but with the focus of educating students on the languages rather than on the
importance of caring for the environment. Thus, teachers experience a shortage
of appropriate teaching materials for delivering environmental education. As
an intervention to assist in the teaching and learning of environmental
education, a mobile-based application incorporating basic environmental
education topics has been developed. The first version of the prototype,
"LOVE2GreenMY", has been tested among Standard 4 primary schoolchildren from a
few schools in Kuala Lumpur and Perak. The schoolchildren and teachers
provided positive feedback on the proposed solution, and further improvements
to the content and design will be carried out in the second version of the
prototype, which will include a greater scope of environmental topics and
game-based activities. LOVE2GreenMY aims to be a platform that delivers
environmental education, not only at schools but also at home. |
Keywords: |
Environmental Education; Education for Sustainable Development; Malaysia; Mobile
Application; Schools; Children; Teachers |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
THE CHALLENGES OF EXTRACT, TRANSFORM AND LOAD (ETL) FOR DATA INTEGRATION IN NEAR
REAL-TIME ENVIRONMENT |
Author: |
ADILAH SABTU, NURULHUDA FIRDAUS MOHD AZMI, NILAM NUR AMIR SJARIF, SAIFUL ADLI
ISMAIL, OTHMAN MOHD YUSOP, HASLINA SARKAN, SURIAYATI CHUPRAT |
Abstract: |
For organizations with considerable investments in data warehousing, the
influx of various data types and forms requires certain ways of prepping data,
and a staging platform that supports fast, efficient, and volatile data so
that it reaches its targeted audiences or users with different business needs.
The Extract, Transform and Load (ETL) system has proved to be the standard of
choice for managing and sustaining the movement and transactional processing
of valued big data assets. However, traditional ETL systems can no longer
accommodate and effectively handle streaming or near real-time data in an
environment which demands high availability, low latency, and horizontal
scalability. This paper identifies the challenges of implementing an ETL
system for streaming or near real-time data, which needs to evolve and
streamline itself to meet these different requirements. Current efforts and
solution approaches to address the challenges are presented. The
classification of ETL system challenges is organized by near real-time
environment features and ETL stages, to encourage different perspectives for
future research. |
Keywords: |
ETL, Near Real-Time Environment, High Availability, Low Latency, Horizontal
Scalability |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
CO-DEPENDENCE RELATIONSHIP BETWEEN MASTER DATA MANAGEMENT AND DATA
QUALITY: A REVIEW |
Author: |
FAIZURA HANEEM, AZRI AZMI, NAZRI KAMA |
Abstract: |
Master Data Management (MDM) refers to the consolidation, integration, and
standardization of master data from multiple data sources into a centralized
system to support data quality improvement in an organization. Nevertheless,
while Master Data Management has come into prominence in the information
systems field of study, few review papers on this topic have been published.
Hence, this paper reports the results of a systematic literature review on the
Master Data Management research topic. It aims to summarize the research
progress of Master Data Management from 2000 to July 2016 and to review the
association between Master Data Management and Data Quality. Search strategies
with relevant keywords were used to identify literature from seven prestigious
academic databases, namely 1) ACM Digital Library; 2) Emerald; 3) IEEE; 4)
Science Direct; 5) Scopus; 6) Springer Link; and 7) Web of Science, and one
industry research database, namely Gartner. Additionally, the study made use
of Google Scholar to find more related literature on the MDM research topic.
From the review, 777 articles were found during the initial search, and 347
relevant articles were filtered for the analysis of MDM research progress.
Then, out of the relevant articles, 49 were selected to discuss the
association between MDM and Data Quality. This paper is the first academic
systematic literature review on the progress of Master Data Management and its
association with Data Quality. The result of the review shows that Master Data
Management came into prominence from 2009, in parallel with the Big Data
movement. Most researchers describe Master Data Management as a means to
resolve data quality issues encountered during the management of multiple data
sources. It ensures better data quality in the organization by combining a set
of processes, data governance, and technology implementations. |
Keywords: |
MDM, Data Quality, Systematic Literature Review |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|
Title: |
TAG BASED NOISE REMOVAL FROM WEB PAGES |
Author: |
MAYA JOHN, JAYASUDHA J S |
Abstract: |
Over the years the web has evolved into the largest repository of information
available to mankind. Web pages contain pieces of information that degrade the
performance of data mining. This type of information is termed web noise,
which may be local or global in nature. A tag-analysis-based technique to
eliminate local noise from a web page is proposed in this paper. Irrelevant
images and links in a web page can be removed by analyzing the attributes and
content of tags. Noisy information is eliminated either by filtering the tags
representing noise or by modifying the attributes of tags. The contents of a
web page considered noise by the proposed work include image advertisements,
background images, unimportant links, search panels, copyright information,
etc. The efficiency in removing image advertisements was analyzed in terms of
precision, recall, and F-score. The web pages after noise removal were found
to have a good compression ratio and showed a significant decrease in load
time. As a result of removing noise tags from web pages, the size of the
source code of the web pages also decreased considerably. |
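The idea of filtering tags by analyzing their names and attributes can be sketched with Python's standard-library HTML parser. The tag names and substring hints below (`ad`, `banner`, `sponsor`, `copyright`) are illustrative assumptions, not the paper's actual rules, and the naive substring match would over-trigger on classes like `header` in real pages:

```python
from html.parser import HTMLParser

NOISY_TAGS = {"script", "style", "iframe"}
VOID_TAGS = {"br", "hr", "img", "input", "meta", "link"}
AD_HINTS = ("ad", "banner", "sponsor", "copyright")

class NoiseFilter(HTMLParser):
    """Collects text while skipping subtrees whose tag name or class/id
    attributes suggest local noise (ads, copyright blocks, scripts)."""

    def __init__(self):
        super().__init__()
        self.depth = 0       # > 0 while inside a noisy subtree
        self.text = []

    def _noisy(self, tag, attrs):
        if tag in NOISY_TAGS:
            return True
        joined = " ".join(v or "" for k, v in attrs if k in ("class", "id"))
        return any(hint in joined.lower() for hint in AD_HINTS)

    def handle_starttag(self, tag, attrs):
        if tag in VOID_TAGS:
            return               # void tags never get a closing tag
        if self.depth or self._noisy(tag, attrs):
            self.depth += 1      # entering (or nested inside) noise

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1      # leaving one level of the noisy subtree

    def handle_data(self, data):
        if not self.depth and data.strip():
            self.text.append(data.strip())

def clean(page):
    """Return the text of a page with noisy subtrees filtered out."""
    parser = NoiseFilter()
    parser.feed(page)
    return " ".join(parser.text)
```

Dropping whole noisy subtrees rather than individual strings is what shrinks the source code and improves load time, as the abstract reports.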
Keywords: |
Advertisements, Block, Noise, Tag, Web page |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2017 -- Vol. 95. No. 22 -- 2017 |
Full
Text |
|