Submit Paper / Call for Papers
The journal receives papers in continuous flow and will consider articles from a wide range of Information Technology disciplines, from the most basic research to the most innovative technologies. Please submit your papers electronically to our submission system at http://jatit.org/submit_paper.php in MS Word, PDF, or a compatible format so that they may be evaluated for publication in the upcoming issue. This journal uses a blinded review process; please remember to include all your personally identifiable information in the manuscript before submitting it for review, and we will edit out the necessary information on our side. Submissions to JATIT should be full research / review papers (properly indicated below the main title).
Journal of Theoretical and Applied Information Technology
June 2019 | Vol. 97 No. 11
Title: |
LANGUAGE MODEL FOR DIGITAL RESOURCE OBJECTS RETRIEVAL |
Author: |
WAFA ZAAL ALMAAITAH, ABDULLAH HJ TALIB, MOHD AZAM OSMAN, ADDY ALQURAAN |
Abstract: |
Language models have been successfully applied in information retrieval to retrieve both structured and unstructured information. Typically, a language model involves three basic components: n-gram language models, a smoothing model, and an estimation model. Language models have been shown to outperform other retrieval models such as the vector space model and the probabilistic model. Problems arise when a language model is used to retrieve Digital Resource Objects, which use metadata to describe their content. Digital Resource Objects have three special characteristics: sparse metadata content (short documents), short queries, and heterogeneous metadata content. This paper presents a performance comparison among information retrieval models (the vector space model and the probabilistic model) using a Digital Resource Objects collection (CHiC2013). It further gives an overview of language model approaches to determine which models are suitable for Digital Resource Objects; although a traditional review, a comprehensive comparative analysis is conducted among the different language model approaches. |
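As a concrete illustration of the three components the abstract names (an n-gram model, smoothing, and estimation), here is a minimal unigram query-likelihood retriever with Dirichlet-prior smoothing; the toy "metadata records" and the value of mu are illustrative assumptions, not the paper's CHiC2013 setup.

```python
import math
from collections import Counter

def dirichlet_score(query, doc_tokens, coll_counts, coll_len, mu=2000.0):
    """Log query-likelihood P(q|d) with Dirichlet-prior smoothing.

    Short metadata documents get heavily smoothed toward the
    collection model, which is why mu matters for sparse records.
    """
    tf = Counter(doc_tokens)
    dl = len(doc_tokens)
    score = 0.0
    for term in query:
        p_coll = coll_counts.get(term, 0) / coll_len  # collection language model
        if p_coll == 0:
            continue  # term unseen everywhere: skip (or back off further)
        p = (tf.get(term, 0) + mu * p_coll) / (dl + mu)
        score += math.log(p)
    return score

# Toy "metadata records" standing in for digital resource objects.
docs = {
    "d1": "medieval manuscript digital archive".split(),
    "d2": "painting oil portrait archive".split(),
}
coll = Counter(t for d in docs.values() for t in d)
clen = sum(coll.values())

query = "digital archive".split()
ranking = sorted(docs, key=lambda d: dirichlet_score(query, docs[d], coll, clen),
                 reverse=True)
```

For the sparse metadata records the paper targets, the document length is small relative to mu, so scores lean heavily on the collection model — exactly the smoothing behavior that makes the choice of model matter.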
Keywords: |
Language Model, Information Retrieval, Digital Resource Objects |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
MULTIMODAL PALMPRINT TECHNOLOGY: A REVIEW |
Author: |
INASS SHAHADHA HUSSEIN, SHAMSUL BIN SAHIBUDDIN, MD JAN NORDIN, NILAM NUR BINTI AMIR SJARIF |
Abstract: |
To increase security and accuracy in biometrics-based systems, multimodal systems have been suggested. Multimodal biometric systems are more accurate and effective than unimodal systems and are a wide research field these days. A multimodal biometric system aims to improve recognition accuracy by minimizing the limitations of the unimodal approach. This paper focuses on the palmprint multimodal system; palmprint biometrics is considered one of the most popular biometric technologies for authenticating a person's identity. The aim of this paper is to present a comprehensive investigation of multimodal palmprint systems that focuses on feature-level fusion. Based on this review, some suggestions are made that can be considered in future research to improve multimodal palmprint systems. |
Keywords: |
Biometric System, Palmprint, Multimodal, Feature Level Fusion, Feature Selection |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
A MODIFIED BIT PLANE COMPLEXITY SEGMENTATION STEGANOGRAPHIC METHOD: INCREASING
PAYLOAD IMPERCEPTIBILITY AND ROBUSTNESS |
Author: |
GABRIEL KAMAU, WAWERU MWANGI, WILSON CHERUIYOT |
Abstract: |
Embedding of secret information in a digital image will definitely introduce
some noise or modulate the image signal in some way. A good steganographic
method ensures that such noise is not perceptible or is minimal in order to
maintain the fidelity of the vessel. The Bit Plane Complexity Segmentation
(BPCS) method uses the Canonical Gray Coded (CGC) bits of the complex bit plane
blocks of a vessel image for embedding secret information. Though this
guarantees a high payload capacity, it can potentially compromise the fidelity
of the vessel particularly in its high order bits. To ensure that the vessel is
evenly modulated and hence increase imperceptibility of the embedded data, this
paper suggests a tweaking of the BPCS embedding procedure by employing a random
selection of the CGC bits in the noisy regions of the vessel. Additionally, in
order to boost the vessel’s robustness against compression and other image
processing activities, the proposed embedding procedure does not utilize the 0
(zero) CGC bit plane, which is normally targeted for removal by such activities.
Results from the experiments carried out showed that stego images from the proposed method had improved signal-to-noise ratios compared to those from the traditional BPCS method. |
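The two mechanisms the method builds on — Canonical Gray Coding and the block-complexity test that decides whether a region is "noisy" enough to carry payload — can be sketched as follows; the 8×8 block size and the roughly 0.3 complexity threshold are the conventional BPCS defaults, assumed here rather than taken from this paper.

```python
def to_cgc(value):
    """Convert a pure-binary byte to its Canonical Gray Code form."""
    return value ^ (value >> 1)

def block_complexity(block):
    """Border complexity of a square bit-plane block: the fraction of
    adjacent bit pairs (horizontal + vertical) that differ. BPCS
    treats blocks above roughly 0.3 as 'noisy' and fit for embedding."""
    n = len(block)
    changes = 0
    for r in range(n):
        for c in range(n):
            if c + 1 < n and block[r][c] != block[r][c + 1]:
                changes += 1
            if r + 1 < n and block[r][c] != block[r + 1][c]:
                changes += 1
    max_changes = 2 * n * (n - 1)  # all horizontal + vertical borders
    return changes / max_changes

flat = [[0] * 8 for _ in range(8)]                              # uniform: complexity 0
checker = [[(r + c) % 2 for c in range(8)] for r in range(8)]   # checkerboard: maximal
```

The proposed tweak then amounts to choosing randomly among the CGC bit planes of such noisy blocks, excluding plane 0, instead of always filling the same planes.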
Keywords: |
Steganography, Fidelity, Floor Noise, Payload, Compression, Embedding Rate |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
USER ENGAGEMENT MODEL IN INFORMATION SYSTEMS DEVELOPMENT |
Author: |
GHASSAN A. O. ABUSAMHADANA, NUR FAZIDAH ELIAS, MURIATI MUKHTAR, UMI ASMA MOKHTAR |
Abstract: |
User engagement combines the factors of user participation and user involvement. It denotes the behavioral and psychological activities of the users involved during information systems development. Previous research has claimed that these two factors contribute positively to information systems success. Nonetheless, a common understanding of how to measure and validate the effect of these two factors is still not tangible in the literature. This paper proposes an integrated model for measuring and validating the factors of user engagement success in information systems development. A qualitative method is applied to verify the user engagement success factors and demonstrate their interrelationships. An a-priori model of user engagement success in information systems development is then proposed and validated via a quantitative method. Questionnaires were distributed to users, developers, system analysts, and managers who have engaged in information systems development projects at seven higher learning institutions in Malaysia. Findings show that eight out of twelve user engagement critical success factors are accepted, while the rejected factors were Identifying User for Engagement, Top Management Support, Type of User Participation, and User-Developer Attitude. |
Keywords: |
User Engagement; User Participation; User Involvement; Information System
Success |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
A REVIEW OF BLACK HOLE ATTACK STUDIES IN WIRELESS SENSOR NETWORK |
Author: |
ALI ALDUJAILI, NAJMULDEEN HASHIM, NURHIZAM SAFIE |
Abstract: |
A wireless sensor network is a specific application of the ad-hoc network that performs "smart sensing work", where the nodes are "smart sensors". This modern technology has been used innovatively in many sensing and observation applications across various domains, such as medicine, banking security, and industry. MANET, the mobile ad-hoc network, is a kind of ad-hoc network that depends on a set of auto-configuring mobile wireless nodes that remain linked while changing location. Because of this large range of applications, it has been a promising and innovative research scope. The objective of the present study is to analytically review previous studies and propose a path for future work. The method used in this paper to analyze previous studies is the content analysis approach. Furthermore, the limitations of wireless sensor networks include memory storage, battery power, computational capacity, and communication range; these system resources shape their defense against threats and their performance. Based on the results, the present finding is that eleven studies discussed the black hole attack in wireless sensor networks in different case studies. |
Keywords: |
Black Hole Attack, Wireless Sensor Network, MANET, Ad-Hoc Network, System
Resources, Review. |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
QUERY OPTIMIZATION ON DISTRIBUTED HEALTH DATABASE DBD FOR SUPPORTING DATA CENTER
WITH MATERIALIZED VIEW AND MINIMIZING ATTRIBUTE INVOLVEMENT |
Author: |
SUDARYANTO, SLAMET SUDARYANTO N, FIKRI B, MARYANI S |
Abstract: |
The integration of data from various sources is an important step in establishing a data warehouse to support decision applications. The problem is how to find and optimally integrate the various data from distributed, heterogeneous database sources. The heterogeneity of data sources stems from a number of factors, including databases stored in various formats, different software and hardware used for the database storage systems, and designs based on different semantic data models. There are currently two approaches to data integration, Global as View (GAV) and Local as View (LAV), but both have performance limitations, and ways to optimize them are needed. Key factors in making data integration optimal are the query response time and an understanding of the structure of the data source (the source schema). Query response time plays an important role, as timely access to information is a basic requirement of successful business applications. A data warehouse uses multiple materialized views (MVs) to efficiently process a given set of queries. Query processing requires careful attention, especially to the source schema, because the results of cost-based query processing (access costs and stored costs) are influenced by the number of attributes involved and the sites visited. This paper presents a proposed MV selection algorithm for query processing that minimizes attribute involvement. First, MVs are selected by clustering the query workload. A query is decomposed into sub-queries that require operations on separate databases, and the exact order of site access can be determined; with this query sequence, the operating cost of the query process is minimal. When a query is processed in a distributed database, query operations look up data from various attributes in scattered database tables, whereas queries often do not require all the attributes of those tables. Therefore, optimizing a query requires minimal operating costs (access costs and stored costs), achieved by separating out the unnecessary attributes. Second, a join index is built that is specifically adapted to the multidimensional architecture of warehouses; it eliminates join operations while preserving the information contained in the original warehouse. This approach also minimizes the cost of a request by separating the attributes it does not need, thereby reducing storage and access time. Attributes must not be separated indiscriminately, because doing so results in greater access costs and ultimately reduces the performance of the query process. Such attribute separation can be performed with the vertical fragmentation method. To validate this study, we measured the response times of a set of decision-support queries over the DBD data warehouse, with and without our optimization techniques. Our experimental results show the efficiency of these techniques, even when queries are complex and the data is relatively large. |
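The attribute-separation step attributed to the vertical fragmentation method can be sketched by grouping attributes according to which workload queries reference them; the table, workload, and grouping rule below are illustrative assumptions, not the paper's algorithm.

```python
def vertical_fragments(table_attrs, query_attr_usage):
    """Group attributes by their query-usage signature.

    Attributes referenced by exactly the same set of queries land in
    the same vertical fragment; in practice the primary key would be
    replicated into every fragment (omitted here for brevity).
    """
    signature = {}
    for attr in table_attrs:
        used_by = frozenset(q for q, attrs in query_attr_usage.items()
                            if attr in attrs)
        signature.setdefault(used_by, []).append(attr)
    return [sorted(frag) for frag in signature.values()]

# Hypothetical health-record table and decision-support workload.
attrs = ["patient_id", "diagnosis", "cost", "ward", "admit_date"]
workload = {
    "q1": {"patient_id", "diagnosis", "admit_date"},
    "q2": {"patient_id", "cost", "ward"},
}
frags = vertical_fragments(attrs, workload)
```

A query such as q2 then needs to visit only the fragment holding cost and ward (plus the replicated key), instead of scanning every attribute of the table, which is the access-cost saving the abstract describes.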
Keywords: |
Global as View, Local as View, Materialized View, Access Costs , Stored Costs,
Data Warehouse |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
IMPLEMENTATION OF RISK CONTROL SELF ASSESSMENTS USING RAPID APPLICATION
DEVELOPMENT MODEL IN BANK OPERATIONAL RISK MANAGEMENT PROCESS |
Author: |
AWAN SETIAWAN, ERWIN YULIANTO |
Abstract: |
Risk management is an important factor in running a banking business, because business growth and the increasing complexity of bank activities are naturally accompanied by increasing risk levels faced by the bank. POJK No. 18/POJK.03/2016 concerning the Application of Risk Management for Commercial Banks states that "Banks must implement an internal control system effectively for the implementation of operational activities at all levels of the Bank's organization". In particular, the Basel II Capital Accord defines operational risk as the risk of losses arising from failed or inadequate internal processes, human resources, and systems, and from external events that affect the bank's operations. In carrying out its business, a bank offers financial services and, in doing so, receives and manages various types of risk, which must be controlled effectively so that large losses can be avoided. The Financial Services Authority Regulation (POJK) defines risk management as a series of methodologies and procedures used to identify, measure, monitor, and control risks arising from all bank business activities. One important issue in operational risk management is the need for assessment data produced by the risk owner in managing daily banking operational risk, to facilitate the management of data related to risk mitigation so that similar events do not recur in the future. To achieve this, it is necessary to develop a system that manages assessment data sourced from the risk owner using a tool called Risk Control Self Assessment. The Rapid Application Development (RAD) model, an adaptive software development approach, is used; RAD is well suited to developing software driven by Graphical User Interface requirements and a short time frame. |
Keywords: |
Operational Risk Management, Risk Control Self Assessment, Rapid Application
Development |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
A REVIEW STUDY OF METHODS UTILIZED FOR IDENTIFYING AND SEGMENTING THE BRAIN
TUMOR FROM MR IMAGERIES |
Author: |
AHMED SAIFULLAH SAMI, MOHD SHAFRY MOHD RAHIM, FALAH Y H AHMED, AND GHAZALI BIN SULONG |
Abstract: |
This paper provides a detailed analysis of the existing methods and approaches used in medical image segmentation. It also integrates a comparative study of automated brain tumor detection techniques. Additionally, the paper analyzes the process of retrieving brain images through the identification of the specific data sets selected to extract the stipulated features. [1] maintains that the use of computer-aided diagnosis in medical imaging improves the decision-making capacity of specialists by providing accurate images of the existing volume and distance, and that the detection of brain ailments remains difficult for radiologists and neurologists due to the existence of similar brain abnormalities. These variances make the task more difficult because of differential diagnosis, coupled with the complexity of manual segmentation. Such processes limit accuracy and demand additional time to meet the stipulated needs of the process. Extensive work is therefore needed to identify reliable and accurate algorithms that can answer the existing questions. This part of the study presents an illustrative analysis identifying the existing needs relating to automatic detection. The approaches reviewed inform the evaluation of human brain abnormalities, including injuries, tumors, and edema, and remain instrumental in identifying additional abnormalities that can be extracted from computerized images of the human brain. |
Keywords: |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
ENTERPRISE RESOURCE PLANNING IMPLEMENTATION SUCCESS FACTOR (A case study in Atma
Jaya Catholic University of Indonesia) |
Author: |
WELI |
Abstract: |
This research aims to analyze the systems implementation success factors,
benefits to users, and satisfaction of users in the implementation of Enterprise
Resource Planning (ERP) at Atma Jaya Catholic University of Indonesia, which
went live in June 2016. This new system was initiated by Atma Jaya Foundation to
improve the preparation of consolidated financial statements, which the previous
system could not accommodate. This change was executed despite satisfaction with the previous system. A problem occurred at the beginning of the new system
implementation, where users were challenged with the new interface and English
instruction. To avoid further delays, the work integration between all units was
postponed. The Atma Jaya Foundation decided to centralize data processing in two
units: Bureau of Accounting and Finance and Bureau of Facilities and
Infrastructure Management. This action caused further challenges that deserve an
investigation. We employ a case study method to obtain a description of user responses and of how university leaders managed the whole implementation process.
Data were collected through interviews with end-users and key system
implementers. We find that strong support from leaders, favorable project
management, and appropriate change management may anticipate user resistance.
Users also felt involved in the implementation project. Our analysis shows that
users are quite satisfied with the ERP performance. Problems at the beginning of
use were resolved with assistance from the vendor and university leaders. Finally,
users believe that ERP can be implemented properly and will benefit them in
daily operations. This research contributes to empirical and case studies on ERP
system implementation in educational institutions. The findings from this
investigation are expected to complement current empirical literature on ERP
system implementation success. |
Keywords: |
ERP, Implementation, IS Success Factor, Benefits, Satisfaction |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
IMPROVING AUDIO FILES SECURITY BY USING RIVEST SHAMIR ADLEMAN ALGORITHM AND
MODIFIED LEAST SIGNIFICANT BIT ON THE RED CHANNEL METHOD |
Author: |
DIAN RACHMAWATI, FEBRIYANA PRATIWI, SRI MELVANI HARDI |
Abstract: |
Audio is a digital medium widely used in message exchange, as it is easy to use and can be accessed anywhere. The security aspect is very important if the audio to be delivered contains secret information. Using cryptography, message content can be hidden so that delivery to a recipient is more secure. The security of RSA (Rivest Shamir Adleman) lies in the difficulty of recovering the private key, because finding the prime factors of large integers is hard; with a longer key, it becomes even more difficult. In practice, such cryptographic algorithms have a weakness: they easily raise suspicion, because the characters of the message are changed or scrambled into another, meaningless form. To make the confidentiality of the message safer and to conceal the message produced by the cryptographic algorithm, the authors combine the cryptographic method with a steganographic method. The MLSB (Modified Least Significant Bit) on the Red Channel method is an improvement over the LSB method, in which only one selected color channel is used to store the information; the bit used is the last bit of the binary red-channel value. The results show that the combination of the two algorithms works successfully and that the possibility of detecting the existence of an audio message in the picture is very small. |
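A minimal sketch of the embedding side of the MLSB-on-red-channel idea: only the least significant bit of the red value changes, leaving green and blue untouched. Pixel tuples stand in for a real image, and the RSA encryption step that would precede embedding is omitted.

```python
def embed_red_lsb(pixels, payload_bits):
    """Embed bits into the LSB of each pixel's red channel.

    pixels: list of (r, g, b) tuples; payload_bits: iterable of 0/1.
    Only the red channel is touched, as in the MLSB-on-red scheme.
    """
    out = list(pixels)
    for i, bit in enumerate(payload_bits):
        r, g, b = out[i]
        out[i] = ((r & ~1) | bit, g, b)  # clear, then set, the last red bit
    return out

def extract_red_lsb(pixels, n_bits):
    """Recover n_bits from the red-channel LSBs."""
    return [pixels[i][0] & 1 for i in range(n_bits)]

cover = [(120, 30, 200), (121, 31, 201), (122, 32, 202), (123, 33, 203)]
secret = [1, 0, 1, 1]
stego = embed_red_lsb(cover, secret)
```

Each red value changes by at most 1, which is why the distortion is hard to perceive; a real payload would first carry a length header so the extractor knows how many bits to read.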
Keywords: |
Rivest Shamir Adleman, Modified Least Significant Bit, Audio, Cryptography |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
ANALYSIS AND IMPLEMENTATION OF KNOWLEDGE MANAGEMENT SYSTEM (A CASE STUDY
APPROACH) |
Author: |
YOHANNES KURNIAWAN, FREDY JINGGA, NATALIA LIMANTARA, AMIR HAMZAH, HENDRA
FEBRIANSYAH, REYHAN SANDHYAJATI, DIEGO FERNANDO CABEZAS TAPIA |
Abstract: |
A Knowledge Management System (KMS) is a system that applies and uses Knowledge Management (KM) principles. The KM implementation at XYZ Indonesia started with a requirements analysis, after which the system was built using a closed-source application service. The method used for the KM analysis is the SECI method. The analysis and implementation of the KM system show a positive result in the creation of a knowledge culture built on knowledge sharing. From this result, it is expected that companies intending to implement a KM system can use this work as a reference. |
Keywords: |
Implementation, Knowledge, Knowledge Management, System, Analysis |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
AUTOMATIC TEXT SUMMARIZATION OF INDIAN LANGUAGES: A MULTILINGUAL PROBLEM A
REVIEW OF MULTILINGUAL SUMMARIZATION TECHNIQUES |
Author: |
JOVI DSILVA, Dr. UZZAL SHARMA |
Abstract: |
Automatic text summarization is a highly researched field, but much of this research is limited to popular languages such as English. In a nation like India, 22 languages are spoken, written in 13 different scripts, with about 720 dialects. Given this, developing a nationwide summarization tool for India is a very difficult problem. In this paper we examine approaches to this problem and also highlight some existing research on Indian languages. |
Keywords: |
Automatic Text Summarization, Indian Languages, Multilingual, Language
Independent |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
STUDYING OPEN BANKING PLATFORMS WITH OPEN SOURCE CODE |
Author: |
ANDREY KOLYCHEV, KONSTANTIN ZAYTSEV |
Abstract: |
Intensive growth of public web interfaces started early in the 2010s; while an API was initially just a procedure for the interaction of various software tools, today web interfaces are genuine digital products from which companies, especially major ones, can derive profits by providing their internal services to third parties via open APIs. Banks are no exception: they too can derive profits by providing third-party developers with access to their internal services. The advantage of banking enterprises is that they possess unique data and services that competitors can hardly replicate. As a consequence, a software market has emerged for developing open APIs and providing access to them with monetization capabilities. An API management platform generally comprises three components: a developer site, API development tools, and an API gateway. The API gateway is the most important component, since it is responsible for interface operation; hence, this work aims to determine the most efficient API gateways. Three software variants were considered: Gravitee API Platform, APIMan, and WSO2 API Manager, which meet two preset criteria: implementation in Java and open source code. The study compared them along three coordinates: the richness of the functions provided for API development, the labor intensity of API implementation, and the performance of the API gateway. In the experiments, Gravitee.io API Platform was the best software on each coordinate. |
Keywords: |
API Management, API Management System, API Platform, API Manager, API
Gateway, Open API, Software Functionality, Performance. |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
PRELIMINARY INSIGHTS INTO THE CONCERNS OF ONLINE PRIVACY AND SECURITY AMONG
MILLENNIALS IN A DEVELOPING ECONOMY |
Author: |
ACHEAMPONG OWUSU, FREDERICK EDEM BRONI JNR, PRINCE KOBBY AKAKPO |
Abstract: |
Millennials, described as tech-savvy, are knowledgeable and use the internet for a wide range of activities, from social media and education to e-commerce and entertainment. However, there are growing concerns about online privacy and security with the emergence of Web 2.0 technologies, whose rich online multimedia is attractive to millennials. Although there is a threat in what millennials agree to and the information they provide online, only limited studies have explored the online privacy issues concerning them. Thus, in this study, we investigated millennials' awareness of online privacy threats and security, whether they are concerned about what they put online, and the measures they have put in place to mitigate this menace, from a developing-economy perspective. The study employed a quantitative approach in which survey data were gathered through self-administered questionnaires with a random sample of 700 undergraduate students at a public university in Ghana. The findings revealed that most millennials are aware of and concerned about online privacy threats. They are also concerned about the privacy effects of the information they put online, and they wish there were laws and regulations protecting consumers' online privacy in Ghana. The originality of the paper stems from the paucity of research on online privacy among millennials in sub-Saharan African countries; our paper is the first conducted in the Ghanaian context. |
Keywords: |
Online privacy, Millennials, Internet, Developing economies, Ghana |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
INCREMENTAL PARALLEL CLASSIFIER FOR BIG DATA WITH CASE STUDY: NAIVE BAYES USING
MAPREDUCE PATTERNS |
Author: |
VERONICA S. MOERTINI, MOHAMAD F. SEPTRIANTO, LIPTIA VENICA |
Abstract: |
Classification methods can be used to derive value from big data in the form of models, which can then be utilized to predict new cases. Several parallel classification methods for big data have been developed based on Hadoop MapReduce as well as for the Spark system. As big data keeps arriving, the models must be updated from time to time to represent the old as well as the new data, and the computations must be efficient and scalable. This research aims to enhance existing parallel classifiers so that they perform as incremental classifiers handling batches of big data. The research results are presented as follows. First, the architecture and main concept of the enhancement are presented. Second, the proposed incremental parallel Naïve Bayes classifier (NBC) based on MapReduce, which handles datasets with discrete attributes, is discussed in detail. Two series of experiments were performed on Hadoop clusters with 5 and 10 nodes. The results show that the incremental parallel NBC has acceptable accuracy and is efficient and scalable. |
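The incremental idea rests on the observation that a Naïve Bayes model over discrete attributes is fully determined by counts, which can be produced per batch (a map-style step), merged into the running totals (a reduce-style step), and reused when the next batch arrives. The data, attribute names, and simplified smoothing below are illustrative, not the paper's Hadoop implementation.

```python
import math
from collections import defaultdict

def count_batch(records):
    """'Map' side for one batch: count class priors and
    (class, attribute, value) co-occurrences."""
    counts = defaultdict(int)
    for features, label in records:
        counts[("class", label)] += 1
        for attr, value in features.items():
            counts[(label, attr, value)] += 1
    return counts

def merge_counts(old, new):
    """'Reduce'/update side: fold a batch's counts into the running
    model -- this merge is the whole incremental step."""
    merged = defaultdict(int, old)
    for key, n in new.items():
        merged[key] += n
    return merged

def predict(counts, features, classes, alpha=1.0):
    """Classify from the accumulated counts with Laplace smoothing
    (the smoothing denominator is simplified for this sketch)."""
    total = sum(counts[("class", c)] for c in classes)
    best, best_lp = None, -math.inf
    for c in classes:
        lp = math.log((counts[("class", c)] + alpha) / (total + alpha * len(classes)))
        for attr, value in features.items():
            lp += math.log((counts[(c, attr, value)] + alpha)
                           / (counts[("class", c)] + 2 * alpha))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# Two arriving batches of a toy discrete-attribute dataset.
batch1 = [({"outlook": "sunny"}, "no"), ({"outlook": "rain"}, "yes")]
batch2 = [({"outlook": "sunny"}, "no"), ({"outlook": "overcast"}, "yes")]
model = merge_counts(count_batch(batch1), count_batch(batch2))
```

When a third batch arrives, only count_batch and merge_counts run again; nothing is recomputed over historical data, which is what makes the scheme efficient and scalable in the MapReduce setting.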
Keywords: |
Big Data Classification Method, Incremental Parallel Classifier, Mapreduce
Patterns |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
INVESTIGATING THE PIECE-WISE LINEARITY AND BENCHMARK RELATED TO KOCZY-HIROTA
FUZZY LINEAR INTERPOLATION |
Author: |
MAEN ALZUBI, SZILVESZTER KOVACS |
Abstract: |
Fuzzy Rule Interpolation (FRI) reasoning methods have been introduced to handle sparse fuzzy rule bases and reduce complexity. The first FRI method was the "linear interpolation" proposed by Koczy and Hirota (KH). Several conditions and criteria have since been suggested to unify the common requirements FRI methods have to satisfy. One of the most important conditions requires that the fuzzy set of the conclusion preserve piece-wise linearity (PWL) if all antecedents and consequents of the fuzzy rules are PWL sets at the α-cut levels. The KH FRI is one of the FRI methods that cannot satisfy this condition. Therefore, the goal of this paper is to investigate the equations and notation related to the PWL property, in order to highlight the problematic properties of the KH FRI method and to examine its behavior under the PWL condition. In addition, this paper focuses on constructing benchmark examples to serve as a baseline for testing other FRI methods against situations in which the linearity condition is not satisfied by the KH FRI. |
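The core of the KH method the paper analyzes is inverse-distance-weighted interpolation applied independently to each α-cut endpoint; a single-endpoint sketch, with illustrative numbers:

```python
def kh_endpoint(x, a1, a2, b1, b2):
    """Koczy-Hirota interpolation of a single α-cut endpoint.

    x: the observation's endpoint; a1, a2: the corresponding endpoints
    of the two flanking rule antecedents; b1, b2: the endpoints of
    their consequents. Classic inverse-distance weighting.
    """
    d1, d2 = abs(x - a1), abs(x - a2)
    if d1 == 0:
        return b1
    if d2 == 0:
        return b2
    w1, w2 = 1.0 / d1, 1.0 / d2
    return (w1 * b1 + w2 * b2) / (w1 + w2)

# Observation exactly midway between the two antecedents' endpoints.
mid = kh_endpoint(5.0, a1=2.0, a2=8.0, b1=10.0, b2=20.0)
```

Because the lower and upper endpoints of each α-cut are interpolated independently, the resulting endpoints can cross, yielding a non-fuzzy "abnormal" conclusion; this is the kind of problematic behavior that motivates checking KH against conditions such as PWL preservation.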
Keywords: |
Sparse fuzzy rules, FRI reasoning, Koczy-Hirota fuzzy interpolation, Preserving
piece-wise linearity, PWL benchmark |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
A MULTILEVEL PRINCIPAL COMPONENT ANALYSIS BASED QOS AWARE SERVICE DISCOVERY AND
RANKING FRAMEWORK IN MULTI-CLOUD ENVIRONMENT |
Author: |
A V L N SUJITH, Dr. A. RAMA MOHAN REDDY, Dr. K MADHAVI |
Abstract: |
With the rapid increase in the utilization of cloud services, cloud service providers are focusing their efforts on the design and development of Quality of Service (QoS) aware composite services that satisfy user preferences. QoS-aware cloud service discovery and selection is considered an NP-hard problem due to the existence of similar cloud services in different cloud environments. Existing cloud service selection mechanisms compute a weighted summation of the QoS attributes to select cloud services, but because they ignore the correlations between the QoS preferences of the cloud services, these approaches may produce inaccurate results. In this paper, a multilevel principal component analysis (PCA) based service selection mechanism is proposed to discover and rank services based on user preferences in a multi-cloud environment. A modified PCA-based service agent is deployed to select services by analyzing the QoS correlations of each service. Finally, the experimental results show that the proposed mechanism outperforms existing service selection techniques in terms of computation time and reduced discovery overhead. |
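One plausible reading of PCA-based ranking — a sketch, not the paper's modified multilevel algorithm, whose details the abstract does not give — is to standardize the QoS matrix, project it onto principal components, and score services by explained-variance-weighted projections:

```python
import numpy as np

def pca_rank(qos, benefit_mask):
    """Rank services by a PCA-weighted composite QoS score.

    qos: (n_services, n_attrs) matrix. benefit_mask: True where larger
    is better (e.g. availability), False where smaller is better
    (e.g. response time); cost attributes are negated so that
    'larger is better' holds uniformly before standardization.
    """
    x = qos.astype(float).copy()
    x[:, ~benefit_mask] *= -1.0                          # flip cost attributes
    x = (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-12)   # standardize columns
    eigvals, eigvecs = np.linalg.eigh(np.cov(x, rowvar=False))
    eigvecs = eigvecs * np.sign(eigvecs.sum(axis=0) + 1e-12)  # fix sign ambiguity
    order = np.argsort(eigvals)[::-1]                    # strongest component first
    weights = eigvals[order] / eigvals.sum()             # explained-variance weights
    scores = (x @ eigvecs[:, order]) @ weights
    return list(np.argsort(scores)[::-1])                # best service first

# Hypothetical services; columns = (response time in ms, availability %).
qos = np.array([[120.0, 99.9],
                [300.0, 97.0],
                [150.0, 99.5]])
ranking = pca_rank(qos, benefit_mask=np.array([False, True]))
```

Unlike a fixed weighted sum, the weights here come from the correlation structure of the QoS data itself, which is the property the abstract argues the existing weighted-summation mechanisms lack.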
Keywords: |
Cloud Computing, Service Ranking, Principal component analysis, cloud service
selection, Quality of service |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Title: |
LOGS ANALYSIS TO SEARCH FOR ANOMALIES IN THE FUNCTIONING OF LARGE TECHNOLOGY
PLATFORMS |
Author: |
MAXIM DUNAEV, KONSTANTIN ZAYTSEV |
Abstract: |
Today, despite the widespread use of machine learning methods in various fields of human activity, the detection of rare events remains one of the most challenging tasks. This is because there is very little information from which computers can learn to detect deviations from normal operation, even though they have to process very large amounts of data characterizing the ongoing processes. This occurs, for example, in high-energy physics when searching for and studying new particles. A similar situation occurs when detecting pre-anomalous situations in the operation of complex high-tech equipment. Logs are the only source of information about the processes running on such equipment, so many IT companies use them to analyze the functioning of their software and hardware. Logs allow an incident to be viewed from the very beginning to the point of failure, consistently figuring out its possible causes. In most companies, this process is not automated, because there is no single established approach to analyzing logs with different configurations of stored metric values and different filling intensities. In addition, historical logs are not used to predict the sequences of events that lead to anomalies in the operation of software. The present article deals with the problem of detecting the states, and predicting the near-term behavior, of large technological platforms by directed analysis of their logs. Logs of large technology platforms usually form data sets of very high dimensionality, which does not allow modern algorithms, within allowable time limits, to draw the necessary conclusions about the behavior of the platforms and to form sequences of control actions when necessary. To solve this problem, the article compares the effectiveness of existing algorithms, traditionally using unsupervised learning because the available labeled data are too scarce, as well as algorithms working with big data. Pilot implementations of all the algorithms involved, written in the Python programming language, were studied in a single environment. Based on their comparison, the most efficient algorithm was chosen for recognizing different types of events on real data. The solution based on the chosen algorithm was implemented using the Apache Spark framework. Additional investigation showed that the selected algorithm can work in real-time mode. |
Keywords: |
Logs, Technological Platform, Anomaly Detection, Machine Learning, Deep
Learning, Apache Software Foundation, K-Means, Clustering, Apache Spark,
One-Class SVM, Isolation Forest, Elliptic Envelope. |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Full
Text |
|
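A minimal sketch of the unsupervised setting the abstract describes, using one of the algorithms named in its keywords (Isolation Forest via scikit-learn); the "log window" feature vectors below are synthetic, not the authors' data:

```python
# Illustrative only: unsupervised anomaly detection on numeric features
# extracted from logs, with synthetic data standing in for real log windows.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Pretend each row is a feature vector derived from a log window
# (e.g. event counts, error rates) -- invented for the sketch.
normal = rng.normal(loc=0.0, scale=1.0, size=(200, 3))
anomalies = rng.normal(loc=6.0, scale=0.5, size=(5, 3))
X = np.vstack([normal, anomalies])

# Fit on all data; contamination sets the expected anomaly fraction.
model = IsolationForest(contamination=0.03, random_state=0).fit(X)
labels = model.predict(X)  # +1 = normal, -1 = anomaly
print("flagged windows:", int((labels == -1).sum()))
```

In the unlabeled regime the abstract describes, `contamination` is the main knob: it encodes a prior guess of how rare the rare events are.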
Title: |
IMPACT OF EFFECTIVE PERFORMANCE EXPECTANCY, EFFORT EXPECTANCY AND SOCIAL
INFLUENCE ON STUDENTS' BEHAVIOURAL INTENTION TO USE BLACKBOARD |
Author: |
AMJAD ALHARBI, NAHLA ALJOJO, AZIDA ZAINOL, ASMAA MUNSHI |
Abstract: |
Learning Management Systems (LMS), such as Blackboard, are widely used in
universities. However, many universities employing Blackboard have encountered
various difficulties during implementation, specifically in the domains of
effectiveness, acceptance of the LMS-based delivery of teaching courses, and
students’ Behavioural Intention (BI) to use the LMS. Conducted at the Faculty of
Computing and Information Technology (FCIT), King Abdulaziz University, Jeddah,
Saudi Arabia, this empirical study aims to investigate the impact of Performance
Expectancy (PE), Effort Expectancy (EE) and Social Influence (SI) on students'
BI to use Blackboard. This study identifies three hypotheses to validate the
effects of PE, EE and SI on the students' BI to use Blackboard. The participants
of this study were students at FCIT, King Abdulaziz University. The pilot study
sample consisted of 31 students. A questionnaire containing Arabic-language
questions adapted from the Unified Theory of Acceptance and Use of Technology
(UTAUT) instrument was distributed amongst the participants. The result
exhibited both validity and internal reliability; furthermore, a correlation
analysis tested the hypotheses, showing that EE and SI do not have a
significant effect on BI. Conversely, PE was observed to positively influence
users’ BI. Although this is a pilot study, the obtained results are promising
enough to draw conclusions about the hypotheses. |
Keywords: |
Learning Management System, Unified Theory Of Acceptance And Use Of Technology,
Behavioural Intention To Use. |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Full
Text |
|
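As an illustrative sketch of the kind of correlation analysis the abstract reports (the Likert-style scores and the PE-to-BI relationship below are synthetic, not the study's survey data):

```python
# Illustrative only: a Pearson correlation of the sort used to test whether
# a UTAUT construct (e.g. PE) relates to behavioural intention (BI).
import numpy as np

rng = np.random.default_rng(3)
n = 31  # pilot-sample size mentioned in the abstract
pe = rng.normal(4.0, 0.6, size=n)             # synthetic Likert-style PE scores
bi = 0.8 * pe + rng.normal(0.0, 0.4, size=n)  # BI tracking PE, plus noise

# Pearson correlation coefficient between the two constructs.
r = np.corrcoef(pe, bi)[0, 1]
print(f"Pearson r(PE, BI) = {r:.2f}")
```

A hypothesis like "PE positively influences BI" is supported when r is positive and statistically significant at the chosen sample size; a near-zero r is how EE and SI would fail the test.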
Title: |
FRAUD PREDICTION IN BANK CREDIT ADMINISTRATION: A SYSTEMATIC LITERATURE REVIEW |
Author: |
IBUKUN EWEOYA, AYODELE ADEBIYI A., AMBROSE AZETA, OKESOLA OLATUNJI |
Abstract: |
Any business or organization that intends to stay clear of bankruptcy or crime
strives daily to ensure that crime does not occur unabated within it.
Traditional methods of fraud detection in credit administration are available,
but they are limited in their capacity to counter the current sophistication of
fraud perpetration and do not offer the best trade-off between time consumption
and efficiency; moreover, fraud is better predicted than detected after the deal
is done. This work presents an extensive review of the literature and related
work on fraud prediction in credit administration. The primary focus of this
research is to identify and dwell on the major concepts and techniques used for
financial fraud prediction in credit administration, as well as related work in
this domain of study; the work recommends the ensemble approach as a better
alternative in this domain. Existing systematic literature reviews in this
domain do not address credit fraud prediction alone. |
Keywords: |
Fraud, Supervised learning, Credit, Ensemble, Machine learning |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Full
Text |
|
Title: |
USE OF PEAK-TO-AVERAGE POWER RATIO IN ORTHOGONAL FREQUENCY DIVISION MULTIPLEXING
TECHNIQUES: A REVIEW OF THE DRAWBACKS |
Author: |
A. ABDALMUNAM, MS ANUAR, MN JUNITA |
Abstract: |
The goal of this study is to provide readers with knowledge of the high
peak-to-average power ratio (PAPR) that is the bottleneck of the orthogonal
frequency division multiplexing (OFDM) system. This study also highlights the
drawbacks of the PAPR reduction techniques, each of which is explained briefly.
The review starts with an illustration of the OFDM system and PAPR, followed by
an introduction to several techniques, classified into three categories (signal
distortion, signal scrambling, and coding), for combating the above-mentioned
high-peak problem, which is the objective of this study. Further in this paper,
most PAPR reduction techniques are elaborated, with their drawbacks illustrated
in detail. Each technique reduces the large fluctuation of the output power
envelope, but at the expense of reduced data rate, greater transmitted signal
power, increased system complexity, and degraded bit-error-rate (BER)
performance. |
Keywords: |
MCM, OFDM, PAPR, CM, Companding, PTS, SLM, Interleaving, TR, TI, ACE, Coding,
PW, ACF |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Full
Text |
|
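A minimal sketch of the quantity the abstract is about: the PAPR of one IFFT-generated OFDM symbol. The QPSK mapping and 64-subcarrier setup are generic textbook assumptions, not taken from the reviewed paper:

```python
# Minimal sketch: PAPR of a single OFDM symbol (assumed QPSK subcarriers,
# IFFT-based modulation); illustrative, not tied to any surveyed scheme.
import numpy as np

rng = np.random.default_rng(1)
n_subcarriers = 64

# Random QPSK symbol on each subcarrier, unit average power.
bits = rng.integers(0, 2, size=(n_subcarriers, 2))
symbols = (2 * bits[:, 0] - 1 + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# Time-domain OFDM signal via IFFT (scaled to unit mean power).
x = np.fft.ifft(symbols) * np.sqrt(n_subcarriers)

# PAPR: peak instantaneous power over mean power, in dB.
power = np.abs(x) ** 2
papr_db = 10 * np.log10(power.max() / power.mean())
print(f"PAPR = {papr_db:.2f} dB")
```

The worst case occurs when all subcarriers add coherently, giving PAPR = N (about 18 dB for N = 64); the reduction techniques surveyed in the paper trade rate, power, or complexity to pull typical symbols away from that peak.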
Title: |
PREDICTION OF RESERVOIR PROPERTIES FOR BLIND WELL USING NEURAL NETWORK AND
SEISMIC KNOWLEDGE |
Author: |
S.A.BEDIER , M.A.EL-DOSUKY, M. Z. RASHAD , M.E.A.EL-MIKKAWY |
Abstract: |
Drilling a well without knowing the porosity distribution is costly.
Geoscientists use seismic waves to overcome this problem and reduce exploration
risk. The current paper proposes a system to predict the porosity of a well from
other, already drilled wells, incorporating seismic data. The proposed workflow
aims to estimate porosity values from three-dimensional seismic data and well
records from the F3-block North Sea data set. We used porosity interpretations
from two wells (F2-1 and F3-2) and three-dimensional seismic attributes for
neural network training. To assess the porosity prediction, we used data from
another well (F3-4) as a blind well. Correlations in the three stages of
training, validation, and testing are discussed. Test results indicate the
superiority of the proposed neural network for porosity prediction compared to
other techniques in current use. Applying the neural network to predict porosity
in the blind well yields a correlation of R = 0.98. |
Keywords: |
Seismic attributes, Well logging, Neural Network, Porosity, Prediction. |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Full
Text |
|
Title: |
THE IMPACT OF MANAGEMENT INFORMATION SYSTEMS ON ORGANISATIONAL PERFORMANCE WITH
TOTAL QUALITY MANAGEMENT AS THE MEDIATOR |
Author: |
REYATH THEA AZEEZ, KAMARUL BAHARI YAAKUB |
Abstract: |
The present study was conducted to bring attention to the importance of Total
Quality Management (TQM) in applying Management Information Systems (MIS) and to
their effects on organisational performance, particularly in the oil sector. The
main objective of this research is to investigate the relationship between MIS
and organisational performance, along with the mediating role of TQM, at Missan
Oil Company in Iraq. The quantitative method applied a questionnaire survey and
structural equation modelling (SEM). A total of 250 questionnaires were
distributed and a high rate of return (87.6%) was achieved. After initial data
screening, 201 responses were utilised to analyse the final data. The results
revealed that the MIS indicators information quality, user satisfaction and net
benefits are directly linked with organisational performance. Meanwhile, TQM
mediates the relationship between five of the MIS indicators, namely system
quality, information quality, use of system, user satisfaction and net benefits,
and organisational performance. The findings of this study will be useful for
the Iraqi oil sector, as they will enhance its organisational performance
through the use of appropriate MIS indicators. |
Keywords: |
Management Information Systems (MIS), Total Quality Management (TQM),
Organisational Performance, Missan Oil Company, SEM, Iraq. |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Full
Text |
|
Title: |
5G MOBILE SYSTEMS, CHALLENGES AND TECHNOLOGIES: A SURVEY |
Author: |
ABDULSATTAR M. AHMED, SALIM ABDULLAH HASAN, SAYF A. MAJEED |
Abstract: |
The huge increase of the data traffic was exited the borne of 5th Generation
(5G) mobile communication system looking at 10 Gbps data rate and around 1ms
latency. As the cellular data demand increasing, the actual 3 GHz spectrum band
becoming so crowded. This leads to look for a new allocated mobile communication
frequency bands that can offer a broadband amount of spectrum. In 5G mobile
system the ultra-wide millimeter wave (mmWave) spectrum will be adopted. mmWave
frequency band starting from 30 GHz to 300 GHz, constitutes a substantial
portion of the unused frequency spectrum, which is an important resource for
future wireless communication systems in order to fulfill the exponential demand
of capacity. In this paper, we provide a survey on the mm-Wave frequency band
general characteristics and its main challenges; we also state the required
technologies that were necessary for making 5G system as a real and efficient
solution. |
Keywords: |
5G, Millimeter Wave Communications, Path Loss, Beamforming, Small Cells,
Massive-MIMO, FBMC, Device-To-Device (D2D) Communication |
Source: |
Journal of Theoretical and Applied Information Technology
15th June 2019 -- Vol. 97. No. 11 -- 2019 |
Full
Text |
|
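A back-of-the-envelope illustration of the main mmWave challenge the survey covers: free-space path loss (the standard Friis formula, not a result from the paper) grows with frequency, which is why mmWave links lean on beamforming gain:

```python
# Standard free-space path loss (Friis) comparison between a sub-6 GHz band
# and a mmWave band; frequencies and distance are illustrative choices.
import math

def fspl_db(freq_hz: float, dist_m: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

loss_3ghz = fspl_db(3e9, 100.0)
loss_60ghz = fspl_db(60e9, 100.0)
print(f"FSPL @ 3 GHz, 100 m:  {loss_3ghz:.1f} dB")
print(f"FSPL @ 60 GHz, 100 m: {loss_60ghz:.1f} dB")
print(f"extra loss at 60 GHz: {loss_60ghz - loss_3ghz:.1f} dB")
```

Since the loss scales as 20·log10(f), moving from 3 GHz to 60 GHz costs 20·log10(20) ≈ 26 dB, a gap that massive-MIMO beamforming and small cells are meant to recover.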