|
Submit Paper / Call for Papers
The journal receives papers in a continuous flow and considers articles from a
wide range of Information Technology disciplines, spanning the most basic
research to the most innovative technologies. Please submit your papers
electronically through our submission system at http://jatit.org/submit_paper.php
in MS Word, PDF, or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please remember to include all your personally identifiable information in the
manuscript before submitting it for review, and we will edit out the necessary
information on our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
|
Journal of Theoretical and Applied Information Technology
May 2023 | Vol. 101 No. 9 |
Title: |
FPGA AND MACHINING PROCESS - A REVIEW ON PERFORMANCE AND APPLICATIONS |
Author: |
A.A. SOMANI, R. D. KOKATE, ABHILASHA MISHRA |
Abstract: |
Machining processes must deal with distortions of workpieces, machine tools, and
cutters caused by chatter vibration, thermal effects, and cutting forces.
Excellent performance of these sophisticated processes relies on the
availability of information about process conditions. There is a need for
dependable systems that integrate signal processing techniques for gleaning
useful information. These are useful in the precise machining process, which
requires the tightest of tolerances. Machine performance control and prediction
become particularly challenging in such conditions of extreme change. Thus, for
identifying the actual tool health, a process monitoring or Tool Condition
Monitoring (TCM) system is required for various machining processes. Knowledge
of tool state or information regarding its condition during machining operations
aids in taking relevant actions for effectively performing machining tasks. This
paper explores the studies on TCM and investigates the diverse parameters and
techniques exploited for TCM in different machining operations. It establishes
the necessity and significance of FPGA-based monitoring mechanisms for TCM,
surveys studies on FPGA-based performance monitoring systems used for diverse
tasks and applications, and elucidates the role such systems can play in
developing better and more effective future TCM systems for diverse machining
tasks. |
Keywords: |
FPGA, Machining Process, Performance Monitoring, TCM, Tool Wear. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
A COMPARATIVE ANALYSIS ON THE COMBINED MULTI LEVEL FUNCTIONALITY FRAMEWORK IN
CLOUD ENVIRONMENT WITH ENHANCED DATA SECURITY LEVELS FOR PRIVACY PRESERVATION |
Author: |
K. SANTHI SRI, N. VEERANJANEYULU |
Abstract: |
Cloud Computing (CC) refers to a network of remote servers and user access via
the Internet. Distributed data centers all around the globe are responsible for
providing the infrastructure and hosting the servers that power cloud services.
Handling dynamic user groups and securing their sensitive information with a
cryptographic model is a challenging task because the user groups are
continuously growing. Encryption is a good answer to these kinds of problems,
but allowing access to users in the cloud has its own set of challenges. From
the client's point of view, when information is saved in the cloud, it should be
well encrypted to ensure that no other user can read it if they gain access to it
in any way. Using cloud storage comes with a number of potential drawbacks,
including the lack of security for critical information. This study examines the
considerations that should be made when choosing a cloud service provider, with
a focus on the client's encryption needs and how, without them, the client runs
the risk of either losing data or paying more than necessary to the cloud
service provider. To address privacy preservation and key management issues,
this research analyzes a combined framework that handles data encryption, key
management and distribution, and cloud user group management, enhancing data
security levels through a cryptographic model and a key management and
distribution process for dynamic cloud user groups. The research performs a
comparative analysis of the combined framework against traditional models, and
the proposed model exhibits better security levels than the existing methods. |
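As a rough illustration of the kind of key management and distribution such a combined framework involves (not the paper's implementation), the sketch below uses the third-party `cryptography` package and rotates a symmetric group key whenever the dynamic user group changes; the class and method names are hypothetical.

```python
# Minimal sketch of per-group key management with rotation on membership change.
# Assumes the third-party "cryptography" package; all names here are illustrative.
from cryptography.fernet import Fernet


class GroupKeyManager:
    def __init__(self):
        self.members = set()              # current members of the dynamic user group
        self.key = Fernet.generate_key()  # symmetric group key

    def _rotate(self):
        # Generate a fresh key so departed members cannot read new ciphertexts.
        self.key = Fernet.generate_key()

    def add_member(self, user_id: str) -> bytes:
        self.members.add(user_id)
        self._rotate()
        return self.key                   # in practice, distributed over a secure channel

    def remove_member(self, user_id: str) -> bytes:
        self.members.discard(user_id)
        self._rotate()
        return self.key

    def encrypt(self, plaintext: bytes) -> bytes:
        return Fernet(self.key).encrypt(plaintext)

    def decrypt(self, token: bytes) -> bytes:
        return Fernet(self.key).decrypt(token)


if __name__ == "__main__":
    mgr = GroupKeyManager()
    mgr.add_member("alice")
    blob = mgr.encrypt(b"sensitive cloud record")
    print(mgr.decrypt(blob))              # b'sensitive cloud record'
```

Rotating the key on every membership change is only one simple policy; a real deployment would also re-encrypt or re-wrap previously stored data under the new key.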
Keywords: |
Cloud Computing, Privacy Preservation, Cryptography, User Groups, Data Security,
Encryption, Quality of Service. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
MOBILITY PREDICTION-BASED SOURCE ANONYMITY ROUTING PROTOCOL (MPSARP) FOR SOURCE
LOCATION PRIVACY USING NS2 TECHNIQUES |
Author: |
CHINNU MARY GEORGE, DIVYA SHARMA, REEJA S R |
Abstract: |
Privacy is among the most crucial criteria for success in information technology
in the twenty-first century. As a result, users should keep a close check on the
vast organizations that are watching them, companies like Google, Twitter,
Facebook, AT&T, and Verizon, to make sure they are not spying on them. Every one
of these businesses has recently added location-based services to their product
or service offerings (or is doing so); Twitter, for instance, can now determine
what city and even what neighborhood a user is in. Today, consumers face the
reality of walking around with a beacon that constantly transmits information
about their location to a central server, and cell phones are not the only
gadgets capable of broadcasting exact details of our places and movements. To
address these problems, this study proposes MPSARP, a Mobility Prediction-based
Source Anonymity Routing Protocol that ensures location privacy across a
wireless network. MPSARP is an anonymous, efficient, location-based routing
system that offers good anonymity protection at a reasonable cost while
maintaining excellent performance. By dynamically partitioning the network arena
into diverse zones and arbitrarily picking nodes inside those zones to serve as
intermediary relay nodes, the approach provides an anonymous route that is not
traceable by an observer. The simulation is carried out using NS2, and the
performance outperforms the existing techniques. |
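The zone-based relay selection at the heart of the protocol can be pictured with a small Python sketch (the actual evaluation uses NS2); the arena size, zone count, and node positions below are invented for illustration.

```python
# Illustrative sketch: partition the arena into zones and pick one random relay per zone.
import random

ARENA = 1000.0          # arena is ARENA x ARENA metres (assumed)
ZONES_PER_SIDE = 4      # 4 x 4 = 16 zones (assumed)

# Hypothetical node positions: id -> (x, y)
nodes = {i: (random.uniform(0, ARENA), random.uniform(0, ARENA)) for i in range(100)}

def zone_of(pos):
    """Map a coordinate to a (row, col) zone index."""
    size = ARENA / ZONES_PER_SIDE
    return (int(pos[1] // size), int(pos[0] // size))

def build_anonymous_route(source, sink):
    """Pick one random relay in every zone crossed between the source and sink zones."""
    by_zone = {}
    for nid, pos in nodes.items():
        by_zone.setdefault(zone_of(pos), []).append(nid)

    dst_row, dst_col = zone_of(nodes[sink])
    row, col = zone_of(nodes[source])
    route = [source]
    # Walk zone-by-zone toward the sink, choosing an arbitrary relay in each zone crossed.
    while (row, col) != (dst_row, dst_col):
        row += (dst_row > row) - (dst_row < row)
        col += (dst_col > col) - (dst_col < col)
        candidates = [n for n in by_zone.get((row, col), []) if n not in (source, sink)]
        if candidates:
            route.append(random.choice(candidates))
    route.append(sink)
    return route

print(build_anonymous_route(source=0, sink=99))
```

Because each intermediate relay is chosen at random inside its zone, repeated transmissions between the same pair of endpoints follow different paths, which is what makes the route hard to trace.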
Keywords: |
Mobility, Source Location Privacy, Notify, Piggyback, Routing, Packet Delivery
Ratio, Speed, Flows, Delay, Unknown Neighboring Balance. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
EMOTION RECOGNITION IN ARABIC: A BERT-BASED TRANSFER LEARNING APPROACH
LEVERAGING SEMANTIC INFORMATION OF ONLINE COMMENTS |
Author: |
MAHA JARALLAH ALTHOBAITI |
Abstract: |
With a wide range of practical applications, such as the diagnosis of mental
health disorders and the detection of suspicious online behavior, the
recognition of emotions in textual data is a crucial task. However, Arabic
emotion recognition remains under-addressed in comparison to other languages,
primarily due to the scarcity of labeled data. Pre-trained language models and
transfer learning offer promising techniques to overcome the scarcity of labeled
data for downstream tasks, such as emotion recognition. In this paper, we
comprehensively explore the utilization of pre-trained Bidirectional Encoder
Representations from Transformers (BERT) models for Arabic fine-grained emotion
recognition. We further propose a straightforward yet effective method for
enhancing transfer learning in emotion recognition through the integration of
semantic information, specifically, information extracted based on sentiment
analysis and named entity recognition. To evaluate our proposed method, we
conduct experiments on an existing benchmark dataset using pre-trained BERT
models. The results indicate that our proposed approach of integrating the named
entity information with the BERT model yielded a high weighted-average F1 score
of 81.60%. Compared to the existing studies on Arabic emotion recognition in the
literature, our proposed method outperforms the state-of-the-art approach,
yielding improvements of 9.80%, 9.72%, and 9.60% in weighted-average F1 score,
precision, and recall, respectively. |
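For readers who want a concrete picture of the transfer-learning setup, here is a hedged Hugging Face sketch of fine-tuning a pre-trained BERT classifier on comments augmented with named-entity tags; the model name, the number of emotion labels, and the entity-injection format are placeholders, not the paper's exact configuration.

```python
# Sketch: fine-tune a pre-trained BERT model on comments augmented with NER tags.
# Model name, label count, and the entity-injection scheme are illustrative assumptions.
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)
from datasets import Dataset

MODEL_NAME = "bert-base-multilingual-cased"   # stand-in; an Arabic BERT would be used
NUM_EMOTIONS = 8                              # assumed number of fine-grained emotions

def inject_entities(text, entities):
    """Append detected entity types to the comment as extra semantic tokens."""
    return text + " [ENT] " + " ".join(entities) if entities else text

train = Dataset.from_dict({
    "text": [inject_entities("...comment...", ["PER", "LOC"])],
    "label": [0],
})

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME,
                                                           num_labels=NUM_EMOTIONS)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

train = train.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=train,
)
trainer.train()
```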
Keywords: |
Natural Language Processing, Emotion Recognition, BERT, Pre-trained Language
Models, Semantic Information. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
A FEATURE RECOGNITION SYSTEM FROM STEP AP203 NEUTRAL FILE, BASED ON STEP AP224
MANUFACTURING FEATURES |
Author: |
OUSSAMA JAIDER, NIHAD AGHBALOU |
Abstract: |
Geometric modeling has been used for many years as a design tool for CAD
(Computer Aided Design) systems. However, data definitions provided by these
systems cannot be used directly by a computer-aided process planning system. As
a result, the concept of Automatic Feature Recognition (AFR) has been introduced
to integrate CAD and downstream applications such as CAPP (Computer Aided
Process Planning). However, even though a feature recognition system is capable
of recognizing and extracting manufacturing features from a neutral file such as
STEP AP203, these features are still defined by geometric and topological
low-level data and thus lack standardization. For this reason, the STEP AP224
application protocol has been developed to provide a standard for both implicit
and explicit representations for machining features. Nevertheless, recognizing
machining features of AP224 from a neutral file such as STEP AP203 is still the
subject of much research. The main purpose of this paper is to explain the
methodology of converting manufacturing features obtained by a recognition
system we have developed previously, to machining features defined according to
ISO 14649. The EXPRESS modelling, Python language and the STEPCode library are
the main tools used to map parameters of objects representing manufacturing
features obtained by the AFR system, to geometric and dimensional attributes of
classes representing features of ISO 14649. An example part was tested to show
the results of the developed application, called CAPP-Turn. |
Keywords: |
AFR, STEP AP224, ISO 14649, CAD/CAPP/CAM |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
PERFORMANCE COMPARISON OF DECISION TREE, RANDOM FOREST, AND XGBOOST MODELS; AND
ITS INTERPRETABILITY USING SHAP FOR RECOGNIZING THE NECESSITY OF CAESAREAN
SECTION OF CHILDBIRTH |
Author: |
ASWAN SUPRIYADI SUNGE, EDI ABDURACHMAN, YAYA HERYADI, IMAN H. KARTOWISATRO |
Abstract: |
Birth is a process that every pregnant woman will go through, and it carries
many risks, from birth failure to the death of the baby and mother. One method
for reducing this risk is Caesarean section, but before performing it, medical
personnel or obstetricians need the patient's medical records to make the
decision; Machine Learning (ML) can therefore be used to analyze whether a
Caesarean is needed for the prospective mother. The data used in predicting
Caesarean birth are external data consisting of 32 attributes and 3602 records,
which are tested with the Decision Tree, Random Forest, and XGBoost models. The
models are compared to determine which has the highest performance, using both
the full set of attributes and a selected subset, and are then examined with
SHAP to see which features contribute most to the Caesarean predictions. The
test results using all features show that Random Forest achieves the highest
accuracy (0.86). However, using selected features such as Age, Duration until
the next pregnancy in days, and Obesity, XGBoost has the highest accuracy
(0.85). Finally, in the SHAP analysis, the most influential of the 32 features
are ICD10O82 (contractions but without any indication) and ICD10O80 (infection
of the female reproductive organs). These results show that the most dominant
features can be used as a guideline for decision-makers in Caesarean
prediction. |
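A minimal sketch of the model comparison and SHAP explanation workflow is shown below; synthetic data of the same shape (3602 rows, 32 attributes) stands in for the clinical dataset, which is not public.

```python
# Sketch of the model comparison and SHAP explanation described above.
# Synthetic data stands in for the 32-attribute clinical dataset.
import numpy as np
import shap
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=3602, n_features=32, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "DecisionTree": DecisionTreeClassifier(random_state=0),
    "RandomForest": RandomForestClassifier(n_estimators=200, random_state=0),
    "XGBoost": xgb.XGBClassifier(eval_metric="logloss", random_state=0),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, model.predict(X_te)))

# Explain the tree-based model with SHAP and rank features by mean |SHAP value|.
explainer = shap.TreeExplainer(models["XGBoost"])
shap_values = explainer.shap_values(X_te)
importance = np.abs(shap_values).mean(axis=0)
print("Top features:", np.argsort(importance)[::-1][:3])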
Keywords: |
Prediction, Caesar, Classification, Comparison, SHAP. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
SYSTEMATIC LITERATURE REVIEW ON IMPLEMENTATION OF WHISTLEBLOWING SYSTEM IN
PREVENTING FINANCIAL ACCOUNTING FRAUD |
Author: |
MEIRYANI, DANIEL, SHI MING HUANG, ASL LINDAWATI, DIANKA WAHYUNINGTIAS, AGUNG
PURNOMO, AGUSTINUS WINOTO, MOCHAMMAD FAHLEVI |
Abstract: |
The whistleblowing system is a part of internal control that has not been widely
discussed in accounting research in Indonesia. This study aims to determine the
factors that influence the implementation of the whistleblowing system in
supporting the prevention of financial and accounting fraud in the corporate
environment. In addition, this study examines whistleblowing
policy as a moderating variable on the relationship between the factors that
influence the whistleblowing system and the implementation of good corporate
governance and their relationship with the audit committee. This study uses a
qualitative research approach using a systematic literature review to see if
there is a pattern of relationship between the factors that influence the
effectiveness of the whistleblowing system reporting and good corporate
governance. The implication of this research is that companies need to ensure
the safety of whistleblowers in the whistleblowing system. |
Keywords: |
Whistleblowing System, Internal Control, Good Corporate Governance, Reporting
Fraud |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
PADDY GRAIN MATURITY ESTIMATION USING DEEP LEARNING APPROACH- YOLOMATURITY |
Author: |
JAY PRAKASH SINGH, CHITTARANJAN PRADHAN |
Abstract: |
The detection of crop maturity is one of the most crucial parts of precision
farming. Crop maturity estimation can help farmers harvest their crop at the
proper time, since harvesting prematurely or past maturity can lead to crop
quality degradation. We have proposed a deep learning approach that can automate
the process of paddy grain maturity detection. Our proposed approach,
YOLOMaturity (YOLOResNet70Combined), has performed well on the available
dataset. We compared our approach with modifications of the YOLO algorithms, and
it outperformed the other discussed approaches. The proposed approach yields
10.3%, 9.6%, 12.4%, 12.1%, 8.4%, 6.5%, and 9.3% better mAP (Mean Average
Precision) compared to YOLOv4RCombined, YOLOv4FPN, YOLOv5sFPN, YOLOv5sR,
YOLOResNet43HardSwish, YOLOResNet43Combined, and YOLOResNet70HardSwish,
respectively. The proposed approach achieved 97.7% validation accuracy, 97%
training accuracy, 7.6% validation loss, and 10% training loss. These results
indicate that our proposed approach gives promising results and can be utilized
for automatic real-time paddy grain maturity detection. |
Keywords: |
Precision Farming, Deep Learning, Maturity Detection, Paddy Grain, YOLOResNet70 |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
FORECASTING SOLAR PHOTOVOLTAIC ENERGY PRODUCTION USING LINEAR REGRESSION-BASED
TECHNIQUES |
Author: |
SHEREEN A. EL-AAL, MAHA ABDULKARIM ALQABLI, AMANY A. NAIM |
Abstract: |
Photovoltaic (PV) systems are now viewed as being crucial to the advancement of
renewable energy. In order to maximize the performance of the power grid in
accordance with market demands and prevent issues with solar generation due to
instability, smooth solar energy generation requires precise and trustworthy
forecasts. Machine learning regression models were used to estimate the solar
system's output power and assess its performance.
For the solar generation dataset, machine learning models based on linear
regression, such as Multiple, Ridge, Lasso, Decision Tree, and Polynomial
regression, have been applied. Performance measures were used to quantify how
well the behavior of the suggested technique fit the dataset. The study's
findings demonstrate a 93.7% accuracy rate for the polynomial regression model
with a lower value of Mean Square Error (MSE). |
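As an illustration of the polynomial-regression setup (not the authors' dataset or code), the following scikit-learn sketch fits a degree-2 polynomial model to a toy irradiance-to-power relationship and reports MSE and R^2.

```python
# Sketch of polynomial-regression forecasting of PV output; the toy data below
# (irradiance vs. output power) only stands in for the real solar generation dataset.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
irradiance = rng.uniform(0, 1000, size=(500, 1))             # W/m^2 (synthetic)
power = 0.2 * irradiance[:, 0] - 5e-5 * irradiance[:, 0] ** 2 + rng.normal(0, 5, 500)

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(irradiance, power)

pred = model.predict(irradiance)
print("MSE :", mean_squared_error(power, pred))
print("R^2 :", r2_score(power, pred))   # analogous to the reported accuracy rate
```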
Keywords: |
Photovoltaic, Solar Energy, Machine Learning, Regression, Polynomial Regression. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
APPROACH TO ATTRIBUTED FEATURE MODELING FOR REQUIREMENTS ELICITATION IN SCRUM
AGILE DEVELOPMENT |
Author: |
KARAM IGNAIM, SULTAN M. AL KHATIB, KHALID ALKHARABSHEH, JOÃO M. FERNANDES |
Abstract: |
Requirements elicitation is a core activity of requirements engineering for the
product to be developed. The knowledge that has been gained during requirements
engineering about the product to be developed forms the basis for requirement
elicitation. The agile approach is becoming increasingly recognized as the most
widely used innovative process in the domain of requirements engineering. Requirements
elicitation in agile development faces several challenges. Requirements must be
gathered sufficiently to reflect stakeholders' needs. Furthermore, because of
the development process, requirements evolve, and they must be adequately
treated to keep up with the changing demands of the market and the passage of
time. Another challenge with agile implementation is handling non-functional
requirements in software development. Addressing non-functional requirements is
still a critical factor in the success of any product. Requirements
prioritization is also one of the most challenging tasks, and it is uncommon for
requirement engineers to be able to specify and document all the requirements at
once. This paper presents an approach for requirements elicitation in
scrum-based agile development. The approach operates with the feature modeling
technique, originally used in Software Product Lines (SPL). One of
the most important proposed extensions to Feature Models (FMs) is the
introduction of feature attributes. Our method uses attributed FMs to consider
both functional and non-functional requirements as well as requirement
prioritization. For evaluation purposes, we have demonstrated our approach
through two case studies in different domains of software product development.
The first case study is in the domain of education, and the second one is in the
domain of health care. The results reveal that our approach fits the
requirements elicitation process in scrum agile development. |
Keywords: |
Agile Methodology, Scrum, Attributed Feature Models, Requirements
Elicitation, Requirement Engineering, Story Cards. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
BLOCKCHAIN TECHNOLOGY IN DIGITALIZATION OF RECORDING ACCOUNTING TRANSACTIONS |
Author: |
MEIRYANI, MARCELINO, TOTO RUSMANTO, THERESIA LESMANA, MOHAMAD IKHSAN MODJO,
ADELIA YULMA BUDIARTO |
Abstract: |
In the Industry 4.0 era, the newest digital currency payments emerged, namely
cryptocurrency. Cryptocurrency uses blockchain technology to manage and record
transactions, so that the financial transaction system does not require a third
party as an intermediary and makes transactions more transparent. Blockchain has
an impact on the audit process carried out by the auditor. The purpose of this
study is to examine the impact of applying blockchain technology in digitizing
accounting records. The research method used is descriptive and qualitative,
drawing data from literature, articles, and reputable international journals.
This research shows that blockchain technology has the potential to be
implemented in the fields of accounting, auditing, and finance. The results of
this study indicate that the positive impacts of blockchain in digitizing
accounting records include free access to view key records so that auditors can
easily authorize transactions, easier validation of transactions, guaranteed
integrity and reliability of data, automatically executed verification, direct
access to data without third parties, easier preparation of financial report
assessments, big data analytical features that are guaranteed to be correct, and
a more extensive and real-time accounting process. |
Keywords: |
Blockchain, Corporate, Accounting, Digitalization, Transactions |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
CLASSIFICATION APPROACH TO PREDICT CUSTOMER DECISION BETWEEN PRODUCT BRANDS
BASED ON CUSTOMER PROFILE AND TRANSACTION |
Author: |
LAURA LAHINDAH, IVAN DIRYANA SUDIRMAN |
Abstract: |
Businesses need to be able to anticipate what products their customers will buy
so that they can better respond to changing market demands and consumer tastes.
The purpose of this study is to employ several machine learning models that can
reliably estimate the customer's likelihood of purchasing the product given a
customer's profile, transaction date, and other transaction information. This
was achieved by training and evaluating different machine learning techniques,
such as naive bayes, linear models, deep learning, and decision trees, on a
dataset consisting of actual transaction data from three months of sales at a
medium-scale grocery store in Bandung. Results indicated that naive Bayes
performed best as a prediction algorithm, showing that data mining can be
applied to grocery store transaction data. This research provides insights into
how machine learning can be used to improve businesses' ability to anticipate
consumer behavior and respond to changing market demands. We also found that
demographic factors like age and location, as well as contextual factors like
time of week, significantly influenced customers' propensity to buy. |
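A minimal scikit-learn sketch of this kind of brand-choice classification is given below; the column names and toy rows are invented, and a recent scikit-learn (1.2 or later) is assumed for the `sparse_output` argument.

```python
# Sketch: predict the purchased brand from simple customer-profile and transaction
# features using naive Bayes. The columns and rows below are invented placeholders.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

data = pd.DataFrame({
    "age":      [25, 41, 33, 52, 29, 47, 36, 58],
    "location": ["north", "south", "north", "east", "south", "east", "north", "south"],
    "weekday":  ["mon", "sat", "sun", "tue", "sat", "fri", "wed", "sun"],
    "brand":    ["A", "B", "A", "B", "A", "B", "A", "B"],   # target: brand bought
})

X, y = data.drop(columns="brand"), data["brand"]
pre = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore", sparse_output=False),
      ["location", "weekday"])],
    remainder="passthrough",          # numeric columns such as age pass through
)

clf = Pipeline([("prep", pre), ("nb", GaussianNB())])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=0)
clf.fit(X_tr, y_tr)
print("Hold-out accuracy:", clf.score(X_te, y_te))
```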
Keywords: |
Data Mining, Machine Learning, Classification, Naïve Bayes, Customer Product
Decision |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
AN OPTIMIZED APPROACH FOR THE DEVELOPMENT OF HIERARCHICAL ENERGY INTEGRATION
WITH MULTIPLE ENERGY RESOURCES AND EMPHASIZING THE HEURISTIC TECHNIQUES |
Author: |
V. SAI GEETHA LAKSHMI, M. DEVIKA RANI, SRAVANTHI KANTAMANENI, PUCHA-NUTHALA
SIVAKRISHNA, Dr. D. N. V. SATYANARAYANA, MUTHUKUMAR PARAMASIVAN |
Abstract: |
Traditionally, electricity has been produced at massive central power plants and
distributed to consumers via transmission and distribution systems. However,
dispersed generation is quickly becoming the norm. It is called distributed
generation when several individual generators are linked together at the
distribution level, near the end user. However, this raises new research
questions for engineers to consider. Protection and control, self-excitation,
and isolation are only a few examples of the difficulties that can arise. In
order to investigate these concerns thoroughly, engineers must first configure
the foundational simulation model to correctly reflect the distributed
generation system. Various renewable energy sources, including wind turbines
(Type 1 with soft-starter capability and Type 3 based on a synchronous
generator), solar panels (photovoltaic arrays), and small hydroelectric dams
(synchronous generators), are evaluated and rated for use in the construction of
a microgrid. Hierarchical methods with heuristic techniques such as the harmony
search algorithm and the simulated annealing algorithm are considered, and
controllers optimized with these algorithms are used for controlling the real
and reactive power of the various energy sources. This model was created
digitally using PSCAD/EMTDC. |
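To make the heuristic-tuning idea concrete, here is a generic simulated-annealing sketch for two controller gains; the quadratic cost function is a stand-in, since the real study evaluates candidate gains against the PSCAD/EMTDC microgrid model.

```python
# Generic simulated-annealing sketch for tuning two controller gains (Kp, Ki).
# The cost function below is a stand-in quadratic, not the PSCAD/EMTDC evaluation.
import math
import random

def cost(gains):
    kp, ki = gains
    # Hypothetical cost: distance from a well-tuned operating point.
    return (kp - 2.0) ** 2 + (ki - 0.5) ** 2

def simulated_annealing(start, t0=1.0, cooling=0.995, steps=5000):
    current, best = list(start), list(start)
    t = t0
    for _ in range(steps):
        candidate = [g + random.gauss(0, 0.1) for g in current]   # random neighbour
        delta = cost(candidate) - cost(current)
        # Accept better moves always; accept worse moves with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = candidate
            if cost(current) < cost(best):
                best = list(current)
        t *= cooling                                              # cool the temperature
    return best

print("Tuned gains (Kp, Ki):", simulated_annealing([0.0, 0.0]))
```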
Keywords: |
Hierarchical Methods, Multiple Energy Sources, Protection, Heuristic Techniques,
Optimized Controllers, Simulation. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
ANALYTIC APPROACH OF PREDICTING EMPLOYEE ATTRITION USING DATA SCIENCE TECHNIQUES |
Author: |
R. VINSTON RAJA, A. DEEPAK KUMAR, DR. I. THAMARAI, S. NOOR MOHAMMED, R. RAJESH
KANNA |
Abstract: |
Employee turnover has become a major challenge for information technology
companies. The departure of key software developers can cause large losses for
an IT company, since they also leave with essential business knowledge and
critical technical skills. It is therefore important for IT companies to
understand developer turnover in order to retain qualified developers and reduce
the damage caused by developer exits. In this research, monthly self-reports of
software developers, covering their activities, working hours, the number of
projects they have been assigned, and so on, are taken into account for
prediction with data science algorithms. Prediction models based on the NB, KNN,
and SVM algorithms are compared experimentally to determine which algorithm
performs better. The best model is then given to HR managers to predict whether
or not an employee will leave the company. |
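A minimal sketch of the NB / KNN / SVM comparison is shown below; synthetic features stand in for the monthly self-report data, which is not public.

```python
# Sketch of the NB / KNN / SVM comparison on monthly developer self-report features.
# The feature values are synthetic; the real study uses collected self-reports.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Stand-in for [working_hours, projects_assigned, activity_score, ...] per developer;
# y encodes whether the developer left or stayed.
X, y = make_classification(n_samples=300, n_features=6, random_state=1)

for name, clf in [("NB", GaussianNB()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf"))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```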
Keywords: |
Employee Attrition, Naïve Bayes, K-Nearest Neighbor, Support Vector Machine,
Turnover Rate |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
STUDY OF E-FILING TAX APPLICATION ACCEPTANCE IN YOGYAKARTA DURING THE COVID-19
PANDEMIC |
Author: |
DIANA AIRAWATY, WAHYU WIDARJO, RAHMAWATI RAHMAWATI, ARI KUNCARA |
Abstract: |
Purpose - The purpose of this exploratory research is to determine taxpayers'
acceptance of the e-filing tax system during the Covid-19 pandemic in Indonesia.
Design/methodology/approach - The research was conducted on individual taxpayers
who live in Yogyakarta and use the e-filing tax application for their personal
tax reporting, from December 2020 to January 2021, using a Google Form. The
analytical tool used is SmartPLS, with the Unified Theory of Acceptance and Use
of Technology as the research model. Findings - Performance Expectancy has a
significant effect on Behavioral Intention, whereas Effort Expectancy and Social
Influence have no effect on the intention to use e-filing for individual
taxpayers in Yogyakarta. Surprisingly, Facilitating Condition has a positive and
significant effect on the intention to use e-filing for personal tax reporting,
as well as a positive and significant effect on Performance Expectancy and
Effort Expectancy, while Social Influence has no effect on Effort Expectancy.
Research limitations/implications - This study uses data processed from
questionnaires; differences in respondents' perceptions may occur when answering
the questionnaire. Originality - This study develops the UTAUT theory and
provides evidence through empirical research to support the findings. |
Keywords: |
Tax E-filing, Personal Income Tax Report, UTAUT |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
ENERGY EFFICIENT HIERARCHICAL AGGLOMERATIVE DATA AGGREGATION IN WIRELESS SENSOR
NETWORK |
Author: |
A. VIMALATHITHAN, Dr. A. SURESH |
Abstract: |
In wireless sensor networks (WSNs), hierarchical organization structures have
the benefit of providing flexible and resource-efficient arrangements. To find
an efficient method for creating clusters, this paper proposes an Energy
Efficient Hierarchical Agglomerative Data Aggregation (EEHADA) model based on a
Hierarchical Agglomerative Clustering algorithm. EEHADA is an energy-efficient
clustering algorithm that utilizes relay nodes, variable transmission power, and
a single message transmission per node for setting up each cluster. The problem
arises when sensors transmit data: they use energy in transmission, and
uploading the data directly to the sink may require long communication ranges
and so deplete the energy of the sensors. With indirect communication, in
contrast, the data is uploaded to the sink via multiple intermediate sensors,
which results in short communication ranges and preserves the energy efficiency
of the sensors. The proposed model is compared with the existing distributed
clustering algorithms Drain and HAAS. Simulation results show a significant
improvement in the average performance of the proposed model. The proposed model
can be implemented across multiple domains on any WSN platform. |
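The clustering step can be pictured with a short scikit-learn sketch that groups sensor nodes by their coordinates and picks a head per cluster; the field size, node count, cluster count, and head-selection rule are assumptions, not the EEHADA specification.

```python
# Sketch: form sensor clusters by hierarchical agglomerative clustering on node
# coordinates; field size, node count, and cluster count are assumed for illustration.
import numpy as np
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(42)
positions = rng.uniform(0, 100, size=(60, 2))        # 60 nodes in a 100 m x 100 m field

clusterer = AgglomerativeClustering(n_clusters=6, linkage="ward")
labels = clusterer.fit_predict(positions)

# Pick, for each cluster, the node closest to the cluster centroid as a relay / cluster head.
for c in range(6):
    members = np.where(labels == c)[0]
    centroid = positions[members].mean(axis=0)
    head = members[np.argmin(np.linalg.norm(positions[members] - centroid, axis=1))]
    print(f"cluster {c}: {len(members)} nodes, head = node {head}")
```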
Keywords: |
Wireless Sensor Network, Energy Efficient Hierarchical Agglomerative Data
Aggregation, Clustering Algorithm. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
ENSEMBLE FEATURE SELECTION AND ENSEMBLE DEEP LEARNING (EDL) CLASSIFIER FOR
PARKINSON'S DISEASE |
Author: |
B. SABEENA, S. SIVAKUMARI |
Abstract: |
PDs (Parkinson's diseases) are chronic neurodegenerative conditions that impact
humans in their day-to-day lives. Diagnosing and monitoring these conditions
based on limited physical symptoms is a painstaking evaluation for medical
professionals and clinicians may miss early prodromal phases. Though many DMTs
(data mining techniques) for automated assessments of PDs have recently been
presented, their performance is reduced by irrelevant features in the datasets.
EFSs (Ensemble Feature Selections) have more benefits than single FSAs (Feature
Selection algorithms) as they address drawbacks by mixing different models and
improve outcomes of MLTs (machine learning techniques). This work uses OBEFSs
(Optimization Based Ensemble Feature Selections) including FMBOAS (Fuzzy Monarch
Butterfly Optimization Algorithms), LFCSAs (Levy Flight Cuckoo Search
Algorithms), and AFAs (Adaptive Firefly Algorithms) for selection of features
based on their correlations. On selection of features, EDL (Ensemble Deep
Learning) classifiers classify PD datasets. EDLs include FCBi-LSTMs (Fuzzy
Convolution Bi-Directional Long Short-Term Memories), CAEs (Contractive
Auto-encoders), and SAEs (Sparse Auto-encoders). CAEs are a robust variant of
standard autoencoders that learn representations with reduced sensitivity to
small variations in the data. SAEs are used to train classifiers
using NNs (neural networks) for identifying PDs from datasets. Stacked
generalization is used to combine the results of DL classifiers. When compared
to a single model, EDL techniques offer improved predictive performances. The
datasets used for this study were obtained from the machine learning repository
of UCI (University of California, Irvine). The performance of the classifiers
was measured using accuracies, F-measures, MCCs (Matthews Correlation
Coefficients), and errors. |
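The stacked-generalization idea, base learners whose predictions are combined by a meta-learner, can be sketched as follows; simple scikit-learn models stand in for the paper's deep learners (FCBi-LSTM, CAE, SAE).

```python
# Sketch of stacked generalization: base learners' predictions feed a meta-learner.
# Simple scikit-learn models stand in for the paper's deep learning components.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=756, n_features=40, random_state=0)  # PD-sized toy data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stack = StackingClassifier(
    estimators=[("mlp", MLPClassifier(max_iter=500, random_state=0)),
                ("rf", RandomForestClassifier(random_state=0))],
    final_estimator=LogisticRegression(),   # meta-learner combining base predictions
    cv=5,
)
stack.fit(X_tr, y_tr)
print("Stacked ensemble accuracy:", stack.score(X_te, y_te))
```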
Keywords: |
Parkinson's Disease (PDs), Optimization Based Ensemble Feature Selection (OBEFS),
Levy Flight Cuckoo Search Algorithm (LFCSA), and Adaptive Firefly Algorithm
(AFA), Ensemble Deep Learning (EDL) Classifier, Fuzzy Convolution Bi-Directional
Long Short-Term Memory (FCBi-LSTM), Contractive Autoencoder (CAE), and Sparse
Autoencoder (SAE). |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
WALL PATTERN DETECTION WITH PRIM’S ALGORITHM TO CREATE PERFECT RANDOM MAZE |
Author: |
AGNESIA, WIRAWAN ISTIONO |
Abstract: |
Replayability is one of the factors that determines whether players will play a
video game again, for example by offering new and unique content in the game.
One method that can be used is to implement map generation into the game level
via Procedural Content Generation (PCG). With Prim's Algorithm as the basis of
the PCG, this research designs and develops a game with mazes as its map levels.
Focusing on the maze generation, this research also detects and displays data on
the mazes generated, while trying to determine player satisfaction with the
developed game via the Game User Experience Satisfaction Scale (GUESS). A Detect
Wall Pattern method is developed to detect the pattern of each maze's grid, and
the data are documented and then processed to determine the results of 250 maze
generations for mazes of size 2x2, 3x3, and 4x4. Based on the research, MazeGame
was successfully developed with a PCG feature based on Prim's Algorithm. The
Detect Wall Pattern method was also developed successfully; it detected 4 unique
patterns for 2x2 mazes, 79 patterns for 3x3 mazes, and 243 patterns for 4x4
mazes from 250 maze generations of each size. |
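For reference, a compact sketch of perfect-maze generation with randomized Prim's algorithm is shown below; it reproduces only the generation step, not the game's Detect Wall Pattern analysis.

```python
# Sketch of perfect-maze generation with randomized Prim's algorithm on an n x n grid.
import random

def neighbours(cell, n):
    r, c = cell
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= r + dr < n and 0 <= c + dc < n:
            yield (r + dr, c + dc)

def prim_maze(n):
    """Return a set of carved passages {(cell_a, cell_b)} forming a perfect n x n maze."""
    start = (0, 0)
    in_maze = {start}
    passages = set()
    # Frontier walls: (cell_inside, cell_outside) pairs.
    walls = [(start, nb) for nb in neighbours(start, n)]
    while walls:
        a, b = walls.pop(random.randrange(len(walls)))   # pick a random frontier wall
        if b not in in_maze:
            passages.add((a, b))                         # carve the wall between a and b
            in_maze.add(b)
            walls.extend((b, nb) for nb in neighbours(b, n) if nb not in in_maze)
    return passages

maze = prim_maze(4)
print(len(maze), "passages carved")   # a perfect maze on 16 cells has 15 passages
```

Because the carved passages form a spanning tree of the grid, an n x n maze always has exactly n*n - 1 passages and a unique path between any two cells, which is what makes it a perfect maze.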
Keywords: |
Replayability, Procedural Content Generator, Prim’s Algorithm, MazeGame, GUESS,
Detect Wall Pattern |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
VISUAL SEMIOTICS ANALYSIS OF OMICRON VARIANT ON INSTAGRAM SOCIAL MEDIA MANAGED
BY THE MINISTRY OF HEALTH OF THE REPUBLIC OF INDONESIA |
Author: |
FARCHAN NOOR RACHMAN, MUHAMMAD ARAS, VIRIENIA PUSPITA |
Abstract: |
Instagram has been used by the Ministry of Health of the Republic of Indonesia
to provide information regarding the Omicron variant of the Covid-19 virus. This
is done based on the principle of risk-based communication. It aims to reduce
the spread of the Omicron variant of the Covid-19 virus and increase public
awareness. The Instagram contents contain visual and verbal messages that focus
mainly on spreading information about the new Omicron variant by presenting
visual symbols to the public. Through a study using the structuralism method by
Ferdinand de Saussure, the visual message is interpreted to give and capture
signs in the visual as a representation to get the complete information from the
visual message. The structuralism method in analyzing Instagram content of the
Ministry of Health of the Republic of Indonesia uses analysis of signifier and
signified in each visual and verbal content. The results of this structuralism
method show that semiotically, the delivery of messages from the Ministry of
Health of the Republic of Indonesia has a key message in the form of increasing
public awareness of the spread of the Omicron variant. Furthermore, socializing
messages through Instagram content semiotically also includes symbols of
Indonesian local wisdom so that the messages given are relevant to Indonesian
people. |
Keywords: |
Social Media, Instagram, Omicron, Structuralism, Semiotics |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
THE EFFECT OF SYSTEM ANALYSIS AND PRODUCT IN DATA PROCESSING ON INFORMATION
QUALITY OF FINANCIAL STATEMENTS |
Author: |
MEIRYANI, RAINA RAHMADANI, HOLLY DEVIARTI, EVI STEELYANA W, FAHRY PRIANDHANA,
HUGO PRASETYO, ENGELWATI GANI |
Abstract: |
Quality accounting information is needed in a company. Producing accounting
information that is accountable and open, especially financial information
provided to external and internal parties, builds stakeholders' trust in the
company. The company's management needs information that can support decision
making, and one class of such decisions concerns accounting information.
Accounting information systems play a very important role in controlling and
securing company assets. This study aims to determine the effect of System
Analysis and Product in Data Processing (SAP) on the information quality of
financial statements. The variables tested were SAP and the information quality
of financial statements. The data collected are secondary data gathered through
documentation, and the method used in this research is descriptive. The results
of this study show that SAP affects the relevance of accounting information. The
benefit of integrating SAP system modules is that they can save time in every
cross-functional business process, and this time saving enables companies to
report their financial statements in a timely manner. |
Keywords: |
System, Data Processing, Information Quality, Financial Quality, Financial
Statements |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
ANALYSIS OF BEST FIRST SEARCH AND FORWARD CHAINING ALGORITHM ON DIAGNOSIS SYSTEM
OF BROILER CHICKEN DISEASES |
Author: |
HANDRIZAL, FAUZAN NURAHMADI, IGO FERDINAND SIHOTANG |
Abstract: |
Broiler chicken farming is among the most profitable business in Indonesia.
Gotland Simatupang Farm is one of the broiler farms in Indonesia. The mortality
rate of chickens on this farm is still relatively high. In the last three
months, the average mortality proportion was 109.30% whereas in January the
total of chicken deaths was 43 chickens and in March the total chicken deaths
soared to 90 chickens. After observing the death of chickens, it was caused by
several factors. The main cause was disease. Based on the results of discussions
with the management of the farm, it was found that there was a lack of
veterinarians on the farm. Therefore, a system was built in this study that can
help farmers diagnose broiler disease using the Best First Search (BFS) method
and the Forward Chaining method. The types of diseases that can be diagnosed in
this system only consist of 5 diseases, namely Newcastle Disease, Infectious
Bursal Disease, Pullorum, Avian Influenza, and Snot. The system output is the
dis disease-affected broiler chickens. System testing is done by comparing the
test results with existing sample data. In testing on 20 samples, the actual
score % of the accuracy of diagnostic results in the system using the Best First
Search method by 70% while using the Forward Chaining method by 90%. Therefore,
it can be said that the Forward Chaining method is more accurate than Best First
Search. |
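The forward-chaining part of such a system can be sketched in a few lines; the rule base below is illustrative only and does not reproduce the expert rules actually used for the five diseases on the farm.

```python
# Sketch of forward-chaining diagnosis: fire rules whose symptom premises are all
# observed until a disease fact is derived. The rule base here is a toy placeholder.
RULES = [
    ({"greenish droppings", "twisted neck"}, "Newcastle Disease"),
    ({"swollen face", "nasal discharge"}, "Snot"),
    ({"white droppings", "huddling"}, "Pullorum"),
]

def forward_chain(observed):
    """Derive diagnoses by repeatedly firing rules whose premises are satisfied."""
    facts = set(observed)
    derived = set()
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain({"swollen face", "nasal discharge", "huddling"}))  # {'Snot'}
```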
Keywords: |
Best First Search, Forward Chaining, Broiler, Disease |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
ESTIMATION AND MITIGATION OF DIFFERENTIAL MODE NOISE USING EMI FILTERS FOR POWER
CONVERTERS |
Author: |
PATHALA VENKATA SAI CHARISHMA, PAPPU V. Y. JAYASREE |
Abstract: |
The Electro Magnetic Interference [EMI] problem occurs in electronic devices due
to changes in the switching speed of electronic components. The interference
produces noise inside the circuit; the noise decreases the performance of the
system and can also destroy components inside the circuit. The noise generated
by equipment must stay within the specific limits defined by the CISPR standard,
so the filter has to be designed to reduce the noise to within those limits. In
this paper, a Single Ended Primary Inductor Converter [SEPIC], a DC-DC power
converter used in many digital applications, acts as the noise source. An EMI
filter is designed to eliminate the differential mode noise generated by the
power converter, using MULTISIM software and MATLAB. The filter design is made
up of passive components, namely inductors and capacitors. The total setup for
estimating the noise from the equipment and mitigating it with the filter
consists of a Line Impedance Stabilization Network [LISN], the equipment
[SEPIC], a noise separator [active and passive noise separators], a power line
filter, and a PI filter. The LISN is used to provide a constant output impedance
in the circuit. The noise generated by the power converter is measured by the
noise separators in the measurement setup. The noise obtained at the outputs of
the active and passive noise separators is the same, with a value of 119 dBµV.
The filter is therefore designed to reduce the noise generated inside the
circuit: the noise at the output of the power line filter is 51.5 dBµV, and with
the PI filter it is 52.2 dBµV. The comparison between the filters and the
calculation of the noise from the equipment using the two types of noise
separators are shown in this paper. |
Keywords: |
Electro Magnetic Interference [EMI], Single Ended Primary Inductor Converter
[SEPIC], Power Converters, Line Impedance Stabilization Network [LISN], Noise
Separator, Power Line Filter, PI Filter. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
A STUDY ON AWARENESS AND USAGE OF ICT AND OPEN EDUCATIONAL RESOURCES (OERS)
AMONG COLLEGE STUDENTS IN RURAL PARTS OF INDIA |
Author: |
NAGAIAH M, Dr. THANUSKODI S |
Abstract: |
This study aims to investigate college students' awareness, availability, and
usage of ICT devices and open educational resources (OERs) for academic
purposes. The researchers collected data from a stratified random sample of 600
students from rural areas in the Pudukottai district of Tamil Nadu, India, using
a questionnaire. The survey found that most students were indirectly using OER
in various forms but were not aware of it. The data was analysed using
descriptive statistics, and it was found that while students had access to
electronic devices and the internet, a lack of awareness about open educational
resources was a significant issue. They reported being aware of OERs such as
digital learning objects, open-access journals, streaming videos, and learning
materials and frequently using them to get relevant learning material, stay up
to date on their subject, know the trends in technical fields, watch teaching
and learning sessions on online videos, and assist their personal learning.
However, they faced challenges such as high internet costs, a lack of technical
knowledge, poor network connections, the inadequate availability of hardware and
software, and a lack of OERs in their subjects. The report suggests that, while
they were aware of some OERs, they were not utilising them to their full
potential. The study also found that guidance from staff, social media, and help
from friends were the main ways students learned about OERs, and that teachers
can greatly help by spreading more information about open educational resources
to their students. |
Keywords: |
Information and Communication Technology, Open Educational Resources, Open
Access, Open Education, 5R, Sustainability of OER. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
A 120 GHZ DOWN CONVERSION MIXER DESIGN FOR IMPROVED LINEARITY, HIGH
CONVERSION-GAIN AND LOW NOISE-FIGURE IN 130 NM CMOS TECHNOLOGY |
Author: |
ZOHAIB HASAN KHAN, SHAILENDRA KUMAR, DEEPAK BALODI, PIYUSH CHARAN |
Abstract: |
Future mobile networks, like 6G, will use frequencies above 100 GHz. Mobile
networks target higher radio frequencies because they provide more bandwidth and
capacity. Higher frequencies increase power consumption. This could be a problem
for 6G battery-powered phones. High linearity, high conversion gain, and low NF
are also challenges, and gain, power, linearity, and noise figure involve
trade-offs. More attention should be paid to the linearity of mixers in the
design of wireless receivers because their input signal, above 100 GHz, is much
higher than that of any other amplifier in the receiver. This study proposes a
Gilbert cell mixer using baluns to improve linearity, CG, and NF. The mixer's
input frequency is 120 GHz, and the mixer was simulated in ADS using 130 nm
RF-CMOS. The RF and LO baluns in this Gilbert cell mixer convert 120 GHz to 1
GHz, and the design obtains 68.664 dB of RF-LO isolation, a CG of 8.882, and an
NF of 8.4 dB. The suggested design has a 1-dB compression point of -12.2 dBm in
this RF bandwidth. At a 3 V bias, this double-balanced Gilbert cell mixer
dissipates 66.8 mW. |
Keywords: |
6G, Gilbert Cell Mixer, Conversion Gain (CG), Linearity, 1dB compression point,
Noise Figure (NF), BALUN |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
FACTORS AFFECTING INTENTION TO USE OF ONLINE INVESTMENT PLATFORM DURING GLOBAL
RECESSION |
Author: |
NI MADE AULIA SINTA DEWI, TANTY OKTAVIA |
Abstract: |
A country may face a major downturn in economic activity due to an anticipated
recession in 2023, which could result in higher unemployment and lower incomes,
as well as losses in the investment sector. However, this does not dampen
investors' interest in buying and selling investment products online. The
purpose of this study is to analyze which factors can influence investors'
interest in investing online before a recession occurs. Five variables are
studied: one dependent (endogenous) variable, investment interest, and four
independent (exogenous) variables, namely investment knowledge, perceived
benefits, perceived ease of use, and perception of risk. The researchers
received 170 responses from people who use online investment media, with
original data derived directly from survey findings sent to the target
respondents. Data were processed with the SmartPLS program using a Structural
Equation Model (SEM). Based on the results and conclusions of this study, the
variables investment knowledge, perceived benefits, and perceived ease of use
have a significant positive effect on investment interest, while the perception
of risk variable has no significant effect on investment interest in individuals
before the recession. |
Keywords: |
Investment Interest, Investment Knowledge, Perception of Benefits, Perceived
Ease of Use, and Risk Perception, Recession |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
QUALITATIVE RECOMMENDER SYSTEM USING ENTROPY-WEIGHTED PEDAGOGICAL CRITERIA FOR
EFFECTIVE TRAINING IN E-LEARNING PLATFORMS |
Author: |
RAJAE ZRIAA, HICHAM SADIKI, MEROUANE ERTEL, SAID AMALI, NOUR-EDDINE EL FADDOULI |
Abstract: |
Recommender systems based on collaborative filtering have been widely used in
many online learning systems to help learners find appropriate learning
resources. However, these systems are based on classical similarity measures
that exploit only learners' ratings of learning objects, i.e., subjective
preferences, to form groups of learners with similar interests. This paper aims
to also exploit pedagogical criteria of the learning objects in order to improve
the classical similarity measures and generate qualitative recommendations. For
this reason, we adopt a Shannon Entropy approach, combining heuristic weights
with classical similarity measures, in order to produce recommendations
evaluated by their subjective quality and by their objective usefulness in
supporting learners in their learning process. |
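A small numerical sketch of the idea, entropy-derived weights for pedagogical criteria combined with a classical Pearson similarity, is given below; the weighting and combination formulas are one common entropy-weight scheme assumed for illustration, not necessarily the paper's exact formulation.

```python
# Sketch of entropy-weighted similarity: pedagogical criteria whose scores vary more
# across learning objects (higher information content) get larger weights, which then
# scale a classical Pearson similarity on learner ratings.
import numpy as np
from scipy.stats import pearsonr

# Rows = learning objects, columns = pedagogical criteria scores (illustrative values).
criteria = np.array([[0.9, 0.2, 0.5],
                     [0.8, 0.9, 0.4],
                     [0.1, 0.8, 0.6],
                     [0.4, 0.3, 0.7]])

def entropy_weights(matrix):
    """Classic entropy-weight method: w_j proportional to 1 - H_j (normalized entropy)."""
    p = matrix / matrix.sum(axis=0)                       # normalize each criterion column
    h = -(p * np.log(p + 1e-12)).sum(axis=0) / np.log(len(matrix))
    w = 1 - h
    return w / w.sum()

weights = entropy_weights(criteria)

# Two learners' ratings over the same learning objects (illustrative).
ratings_u = np.array([5, 3, 4, 2])
ratings_v = np.array([4, 3, 5, 1])
pearson, _ = pearsonr(ratings_u, ratings_v)

# Weighted pedagogical quality of each object, used to modulate the classical similarity.
quality = criteria @ weights
weighted_similarity = pearson * quality.mean()
print("weights:", weights.round(3), "| weighted similarity:", round(weighted_similarity, 3))
```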
Keywords: |
Recommendation System; Machine Learning; Hybrid Filtering; Shannon Entropy;
Pearson Correlation; E-Learning; Qualitative Recommender System. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
AN ENSEMBLE MACHINE LEARNING MODEL FOR CLASSIFICATION OF CREDIT CARD FRAUDULENT
TRANSACTIONS |
Author: |
DR TINA ELIZABETH MATHEW |
Abstract: |
Digital payment systems such as bank credit cards, debit cards, wallets and
more, allow users to make payments anywhere anytime without much hassle and at
their convenience. On the other hand, digital payments like Credit card
transactions, are vulnerable to many security issues including banking frauds. A
credit card user will always prefer a highly reliable system that can detect and
prevent banking frauds. Hence, techniques that provide better security during
transactions and the identification of genuine and fraudulent transactions are
crucial. Machine learning is a promising field of study that
can help deal with such critical problems of detection and classification of
fraudulent transactions. In this study, the suitability of various machine
learning classifiers is investigated for the detection of credit card fraud, and
an ensemble machine learning framework consisting of a majority voting
system implemented on selected classifiers is developed. The performance of the
model with feature selection – Pearson Correlation Coefficient and without
feature selection is also analyzed. To address the problem of heavy imbalance in
the dataset, two class balancing techniques, Random Under Sampling and Synthetic
Minority Oversampling, are also applied. The results
demonstrate the appropriateness of applying Machine Learning techniques in
credit card fraud detection and classification. |
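A hedged sketch of the balancing-plus-majority-voting pipeline is shown below on synthetic imbalanced data; it assumes the `imbalanced-learn` package for SMOTE and Random Under Sampling, and the chosen base classifiers are placeholders for the paper's selected models.

```python
# Sketch: undersample, then SMOTE-oversample, then majority-vote over several classifiers.
from imblearn.over_sampling import SMOTE
from imblearn.under_sampling import RandomUnderSampler
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic, heavily imbalanced stand-in for the credit card dataset.
X, y = make_classification(n_samples=20000, weights=[0.995], flip_y=0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Balance the training split only: undersample the majority, then oversample with SMOTE.
X_bal, y_bal = RandomUnderSampler(sampling_strategy=0.1, random_state=0).fit_resample(X_tr, y_tr)
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_bal, y_bal)

vote = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("dt", DecisionTreeClassifier(random_state=0)),
                ("rf", RandomForestClassifier(random_state=0))],
    voting="hard",                       # majority voting over the selected classifiers
)
vote.fit(X_bal, y_bal)
print(classification_report(y_te, vote.predict(X_te), digits=3))
```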
Keywords: |
Classification, Credit Card Fraud, Ensemble Learning, Machine Learning (ML),
Synthetic Minority Over Sampling Technique (SMOTE), Random Under Sampling (RUS). |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
A SECURED BLOCKCHAIN BASED APPROACH FOR DECENTRALIZED AGRI-INSURANCE FOR FOOD
CROPS SUPPLY CHAIN |
Author: |
DAYANA D.S., KALPANA G |
Abstract: |
Food crop agri-insurance is a critical tool for protecting farmers from the
financial risks associated with crop failure or damage caused by natural
disasters. By using trustworthy, secured transparent ledgers, smart contracts
and decentralized platforms, the efficiency, security, and accessibility of food
crop agri-insurance for farmers can be prominently enhanced. By tracking food
crop provenance, blockchain aids in the establishment of reliable crop insurance
and confidence between farmers and insurance company. Traditional crop insurance
schemes are complicated and often unprofitable. Because of a lack of confidence
in insurance companies and concerns over deferred or non-payment of claims,
farmers frequently express reluctance to have their crops insured. This paper
outlines the potential benefits of using
blockchain technology in food crop agri-insurance and proposes the
implementation that can automate the claim process, reduce the time and work
needed to process claims and make the process faster and more efficient for
farmers. It can also increase transparency and reduce the risk of fraud, which
improves the overall security of the system. Moreover, the proposed system makes
use of smart contracts, which offer a feasible, effective, and inexpensive food
crop insurance approach that guarantees farmers are protected and receive prompt
crop insurance benefits. Also, the decentralization and security of blockchain technology
ensure transparent transactions between stakeholders and safety in food crop
agri-insurance. According to our data, the length of time required for block
creation is directly correlated with the time of process. |
Keywords: |
Blockchain, Crop Insurance, Smart Contract, Distributed Ledger, Latency |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
DEEP LEARNING DETECTING FRAUD IN CREDIT CARD TRANSACTIONS |
Author: |
IMANE KARKABA, EL MEHDI ADNANI, MOHAMMED ERRITALI |
Abstract: |
Fraudulent acts in the financial sector cause dramatic losses to companies and
individuals. To tackle this conundrum, artificial intelligence approaches have
been developed for fraud detection systems. This paper addresses the fraudulent
credit card transaction problem by deploying ANN and CNN, two supervised deep
learning algorithms that have proved efficient. Yet two hurdles appear: the
constant emergence of new fraudulent patterns and a highly imbalanced dataset.
Sampling techniques are therefore required to balance the data, which affects
system performance. Thus, an Autoencoder, an unsupervised deep learning
algorithm, was added for comparison with the aforementioned algorithms. Three
models were trained on a dataset of 284,807 credit card transactions labeled as
fraudulent or legitimate. Various techniques were applied in the pre-processing
phase, such as normalization, data balancing, and feature selection. During the
model application stage, tuning and analysis were conducted on the model
parameters to improve the classification decision. Similarly, in the
post-application stage, a boosting technique was applied. The models were
compared not only in terms of accuracy, precision, recall, and AUC score but
also on the basis of the confusion matrix. Eventually, one model was chosen out
of the experimented models based on its robustness in detecting new fraud
patterns; it demonstrates optimal rates, achieving an F1-score of 93% when
classifying non-fraud transactions. |
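The autoencoder route can be sketched as follows: train a small reconstruction network on legitimate transactions only and flag inputs whose reconstruction error is unusually large; the feature dimensions, architecture, and threshold rule are assumptions, not the paper's exact setup.

```python
# Sketch of an autoencoder used as an unsupervised fraud detector: train on legitimate
# transactions only, then flag transactions with large reconstruction error.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
legit = rng.normal(0, 1, size=(5000, 29))    # stand-in for scaled transaction features
suspect = rng.normal(3, 1, size=(50, 29))    # stand-in for unusual transactions

autoencoder = tf.keras.Sequential([
    tf.keras.Input(shape=(29,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(8, activation="relu"),     # bottleneck
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(29, activation="linear"),
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(legit, legit, epochs=10, batch_size=64, verbose=0)

def reconstruction_error(x):
    return np.mean((x - autoencoder.predict(x, verbose=0)) ** 2, axis=1)

threshold = np.percentile(reconstruction_error(legit), 99)   # assumed cutoff rule
flags = reconstruction_error(suspect) > threshold
print(f"{flags.sum()} of {len(suspect)} suspect transactions flagged as fraud")
```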
Keywords: |
Unsupervised Deep Learning, ANN, CNN, Autoencoder, Imbalanced Dataset |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
A HYBRID APPROACH FOR ENHANCING THE CLASSIFICATION OF THE PARKINSON’S DISEASE
USING SWARM OPTIMIZATION |
Author: |
MANAL A. ABDEL-FATTAH, RAHMA HUSSEIN EID, AHMED ELSAYED YAKOUB |
Abstract: |
The second most common neurodegenerative ailment after Alzheimer's disease is
Parkinson's disease. It mainly affects adults aged sixty and older.
Parkinson's disease symptoms are unnoticed in the early stages, but the symptoms
become more apparent as the disease progresses. Recent studies have shown that
the disease symptoms can appear in the form of vocal disturbances in the earlier
stages and can be used to diagnose Parkinson’s disease. This paper proposes an
approach for detecting Parkinson's disease (PD) using speech signals. The
motivation of this approach is to improve early detection of Parkinson's disease
by determining the most effective speech examinations, rather than producing a
huge number of examinations, so that the disease can be addressed at an early
stage. As feature selection and swarm algorithms play a vital role during
classification, this paper has proposed a hybrid approach based on the Emperor
Penguin Colony (EPC) swarm algorithm with Correlation-based Feature Selection
(CFC), which is called CEPC. A Parkinson's disease classification dataset
consisting of 756 voice measures was used in this study. Before using the
proposed approach, five classification algorithms were used to compare accuracy
results. Also, the Ensemble classifier has been used in this paper. The CEPC
proposed approach provides an improvement in the accuracy of results. An
accuracy of 89.4% is obtained by the ensemble classifier, which is higher than
some recent work. |
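The correlation-based filtering component can be illustrated with a simple greedy sketch that keeps features correlated with the class label while discarding mutually redundant ones; the thresholds and greedy order are assumptions, and the paper additionally drives the search with the Emperor Penguin Colony swarm algorithm, which is not reproduced here.

```python
# Sketch of a simple correlation-based feature filter: keep features correlated with
# the label, drop features highly correlated with already-selected ones.
import numpy as np
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=756, n_features=50, n_informative=10, random_state=0)

def correlation_filter(X, y, redundancy_threshold=0.7):
    relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    order = np.argsort(relevance)[::-1]            # most label-correlated features first
    selected = []
    for j in order:
        redundant = any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > redundancy_threshold
                        for k in selected)
        if not redundant:
            selected.append(j)
    return selected

chosen = correlation_filter(X, y)
print(f"kept {len(chosen)} of {X.shape[1]} features, e.g. {chosen[:5]}")
```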
Keywords: |
Swarm Intelligence Algorithms; Feature Selection; Metaheuristic Algorithms;
Emperor Penguins Colony; Classification |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
DATA QUALITY ASSESSMENT USING TDQM FRAMEWORK: A CASE STUDY OF PT AID |
Author: |
TEDI WAHYUDI, SANI MUHAMMAD ISA |
Abstract: |
Data has a significant meaning for a business and is used to guide objectives
and decision-making. In order to produce high-quality data, a corporation must
be able to analyze and manage the data properly. This is a challenge for
companies, especially those with a wide variety of data sources, because of the
risk of increasing the inaccuracy of the data they have, which can result in
making inappropriate decisions. Data processing and acquisition activities are
carried out by AID, a company that specializes in "Data as A Service," spanning
twelve business units with different business lines that are managed by a
conglomerate group that mostly serves the financial services industry. The
current state of the organization presents many challenges: data sources still lack
standardization, control, and monitoring of data completeness and accuracy, and the
organization has never measured the quality of its existing data.
In order to obtain quality data, it is necessary to apply specific methods, processes
and techniques to measure the data. The approaches taken in this study to evaluate data
quality are Total Data Quality Management (TDQM) and
the six dimensions from the DAMA white paper. The results of this evaluation
procedure can be used to examine the company's existing data quality and to
provide recommendations for changes that need to be made internally. The results
showed that the quality of data owned by the company was at the threshold of a
very high-quality level. Additionally, it is envisaged that this data quality
assessment can be applied to all business units and conducted on a regular
basis. |
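As a rough illustration of how such dimension measurements can be automated, the
following minimal Python sketch scores two of the six DAMA dimensions (completeness and
uniqueness) on a small pandas DataFrame; the column names and records are invented, not
taken from the PT AID case study.

```python
# Minimal sketch of measuring two data quality dimensions on a pandas DataFrame.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4, None],
    "email": ["a@x.com", None, "b@x.com", "c@x.com", "d@x.com"],
})

def completeness(series: pd.Series) -> float:
    """Share of non-null values in a column."""
    return series.notna().mean()

def uniqueness(series: pd.Series) -> float:
    """Share of non-null values that are unique."""
    non_null = series.dropna()
    return non_null.nunique() / len(non_null) if len(non_null) else 1.0

report = pd.DataFrame({
    "completeness": df.apply(completeness),
    "uniqueness": df.apply(uniqueness),
})
print(report)  # per-column scores that can be compared against a quality threshold
```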
Keywords: |
Data Quality, Total Data Quality Management, TDQM, data quality dimension, DAMA |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
CNN-MOBILENETV2- DEEP LEARNING-BASED ALZHEIMER’S DISEASE PREDICTION AND
CLASSIFICATION |
Author: |
AMBILI. A.V, DR. A.V. SENTHIL KUMAR, DR. ROHAYA LATIP |
Abstract: |
Alzheimer's disease (AD) is a long-lasting brain disorder for which there is no
effective treatment, yet early detection can delay the progression of the disease. Due
to the varied nature of medical tests, manual comparison, visualization and
analysis of data can be time-consuming as well as demanding. As a result, an
effective method for categorization of Magnetic Resonance Imaging (MRI) images
is helpful but extremely difficult. In this paper, the stages of AD are
identified using a unique method that effectively classifies brain MRI images
using label propagation by involving a Deep Learning (DL)-based framework.
Decreased brain tissue volume in brain lobes, hippocampus area, and thalamus are
the primary features that aid in differentiating an AD from a normal MRI. The
features should be efficient in distinguishing the characteristics between an
AD-affected brain and a normal one. A Particle Swarm Optimization (PSO)-based
Speeded-Up Robust Features (SURF) framework is presented that embeds the extracted
feature vectors in a subspace to maximize their utilization. A
classification method is employed in the newly generated space to categorize
data into three classes namely, Normal Condition (NC), MCI, and AD using
Convolutional Neural Network (CNN)-MobileNetV2. The proposed scheme offers a
classification accuracy of 97%, yielding a 3% lower error rate compared to
the best available approaches. |
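A minimal sketch of a MobileNetV2 transfer-learning classifier for the three classes
named above (NC, MCI, AD) is shown below, assuming pre-processed MRI slices resized to
224x224 RGB; the PSO/SURF embedding stage described in the abstract is not reproduced.

```python
# Minimal sketch: MobileNetV2 backbone with a small classification head for NC/MCI/AD.
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import MobileNetV2

base = MobileNetV2(input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze ImageNet features; fine-tune later if needed

inputs = layers.Input(shape=(224, 224, 3))
x = base(inputs, training=False)
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dropout(0.3)(x)
outputs = layers.Dense(3, activation="softmax")(x)  # NC / MCI / AD

model = Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets are assumed
```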
Keywords: |
MCI, AD, CNN, MobileNetV2, PSO, SURF |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
ECG BASED MULTI MODAL FRAMEWORK FOR HEART DISEASE DETECTION |
Author: |
S.IRIN SHERLY, G.MATHIVANAN |
Abstract: |
Heart disease is among the most severe diseases afflicting the human race, so early
diagnosis and prediction of heart disease are necessary to save lives. Accurate
prediction is therefore performed with deep learning algorithms, which avoid the
drawback of machine learning approaches that require a separate algorithm for feature
selection and extraction. As a result, a Convolutional Neural Network (CNN) combined
with the Aquila Optimization Algorithm (AOA), forming a hybrid Deep Convolutional
Neural Network (DCNN), is employed to detect heart disease. The
AOA algorithm is used to select the weight parameters in the DCNN, which works
well on images. Electrocardiogram (ECG) images are used for predicting cardiovascular
disease. The concept driving this study is to combine ECG images with clinical data in
order to achieve high prediction performance. ECG images are pre-processed to reduce
their size, and then the CNN is applied for prediction. Different pre-processing methods
are applied, and the best one is identified in this work: mathematical methods such as
the Fourier transform, the DCT, or a combination of the two are applied to the ECG
images, and the best method is found. The proposed model is then implemented in MATLAB,
and its performance is
assessed by comparing it to other existing CNN models. |
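To illustrate the kind of transform-based pre-processing compared in the study, here is
a minimal Python sketch (the paper's implementation is in MATLAB) that keeps only
low-frequency 2-D DCT coefficients of an image; the image, its size, and the number of
retained coefficients are placeholders.

```python
# Minimal sketch of DCT-based image pre-processing: keep low-frequency coefficients.
import numpy as np
from scipy.fft import dctn, idctn

def dct_compress(image: np.ndarray, keep: int = 64) -> np.ndarray:
    """Keep only the top-left `keep` x `keep` block of 2-D DCT coefficients."""
    coeffs = dctn(image, norm="ortho")
    mask = np.zeros_like(coeffs)
    mask[:keep, :keep] = 1.0
    return idctn(coeffs * mask, norm="ortho")

# Stand-in for a grayscale ECG image (e.g. loaded with PIL or OpenCV).
ecg_image = np.random.rand(256, 256)
compressed = dct_compress(ecg_image, keep=64)
print(compressed.shape)  # same spatial size, high-frequency detail removed
```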
Keywords: |
Deep Convolutional Neural Network, Aquila Optimization Algorithm, Heart Disease,
Electrocardiogram, Pre-processing |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
PREDICTING SQL QUERY QUALITY USING MACHINE LEARNING TECHNIQUES |
Author: |
MOHAMMED RADI |
Abstract: |
To achieve high database performance (e.g., high throughput and low latency), a
database tuning technique is needed to make a database application run faster and
respond to end-users on time. Users rely heavily on Structured Query Language (SQL)
queries to manage and manipulate their data. The complexity of these queries can range
from very simple to very complex. Poorly constructed queries usually lead to
performance problems. However, the end-user does not know whether an SQL statement is
poorly written, resulting in poor database system performance. Evaluating SQL queries
is difficult because there are many syntactic structures for equivalent queries, and
manual evaluation is far too time-consuming given the number of queries involved.
Several papers have provided hints and tips on writing good SQL queries to achieve
better performance, but there is a lack of research on identifying poorly written SQL
queries. Therefore, new approaches are needed to automatically identify poorly written
SQL queries, which have to be rewritten for faster performance. In this paper, we
propose a classification framework to automatically identify well-written and poorly
written SQL queries. The proposed framework utilizes various machine learning
algorithms, including Decision Trees, k-Nearest Neighbours, Support Vector Machines,
and Naive Bayes. In addition, we identified the key features using two different
feature extraction techniques, namely TF-IDF and Count Vectorizer. To effectively
evaluate the proposed framework, we used the Delphi technique to manually label two
different datasets (Bombay and ERPNext). The experimental results demonstrate that the
four machine learning classifiers, capable of classifying SQL queries as well-written,
accepted, or poorly written, provide promising results in terms of recall, precision,
and F1-score. In both datasets, the Decision Trees classifier outperforms the other
classifiers, achieving 90% on the Bombay dataset and 84% on the ERPNext dataset in
terms of F1-measure. Furthermore, the Count Vectorizer outperforms TF-IDF in predicting
poorly written queries. Additionally, the proposed framework can serve as a useful tool
for database developers and SQL programmers for detecting poorly written queries and,
consequently, for optimizing SQL query performance. |
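As a rough sketch of the described setup, the following Python snippet vectorizes SQL
text with CountVectorizer and TF-IDF and trains a Decision Tree on toy labels; the
Bombay and ERPNext datasets and the Delphi labelling are not reproduced.

```python
# Minimal sketch: classify SQL query text as well- or poorly written.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline

queries = [
    "SELECT id FROM orders WHERE customer_id = 42",
    "SELECT * FROM orders",                        # unbounded scan
    "SELECT name FROM users WHERE email = 'a@x.com'",
    "SELECT * FROM users, orders",                 # implicit cross join
]
labels = ["well", "poor", "well", "poor"]          # toy labels

for name, vectorizer in [("count", CountVectorizer()), ("tfidf", TfidfVectorizer())]:
    clf = make_pipeline(vectorizer, DecisionTreeClassifier(random_state=0))
    clf.fit(queries, labels)
    pred = clf.predict(["SELECT * FROM payments"])
    print(name, "->", pred[0])
```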
Keywords: |
SQL Query, Machine Learning, Classification, Feature Selection, Database Systems |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
INTEGRATION OF THE MDA APPROACH IN DOCUMENT-ORIENTED NOSQL DATABASES, GENERATION
OF A PSM MODEL FROM A PIM MODEL |
Author: |
AZIZ SRAI, FATIMA GUEROUATE |
Abstract: |
The volume and diversity of data are so great today that many relational databases are
unable to handle this type and volume of
data. To respond to this problem, many NoSQL databases have emerged, such as
document-oriented, graph-oriented, key-value-oriented and column-oriented
databases. These databases, which revolve around Big Data, have shown considerable
power in managing big data. The scientific contribution of
the work presented in this article is the application of an MDA approach on a
document-oriented approach. We demonstrate the capacity and adaptability of the
model-oriented approach on NoSQL approaches, in particular the document-oriented
approach. For the presentation of this work we started by introducing the
context of our studies which is the MDA approach and the document-oriented
approach, then we defined the different metamodels of the sources and targets.
We then introduced the different possible model-to-model and model-to-Text
transformations using the QVTo transformation language. Finally, we presented as
result the document-oriented PSM model and the XMI model generated from the
Model to Text transformation with Acceleo. Our motivation for this contribution
comes down to the fact that only a minority of authors integrate the concept of
model-driven programming. |
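A minimal illustration of the PIM-to-PSM idea, written in Python rather than
QVTo/Acceleo and using an invented toy metamodel: a platform-independent class model is
mapped to a document-oriented schema (model-to-model) and then serialized
(model-to-text). The paper's actual metamodels are not reproduced.

```python
# Minimal sketch: map a toy PIM class model to a document-oriented PSM and emit it.
import json

# PIM: platform-independent class model (illustrative only).
pim = {
    "classes": [
        {"name": "Customer", "attributes": [("name", "String"), ("email", "String")],
         "contains": ["Order"]},
        {"name": "Order", "attributes": [("total", "Float")], "contains": []},
    ]
}

def pim_to_document_psm(model: dict) -> dict:
    """Model-to-model step: each class becomes a collection; contained classes
    become embedded sub-documents, a common document-oriented design choice."""
    by_name = {c["name"]: c for c in model["classes"]}

    def to_document(cls: dict) -> dict:
        doc = {attr: typ for attr, typ in cls["attributes"]}
        for child in cls["contains"]:
            doc[child.lower() + "s"] = [to_document(by_name[child])]
        return doc

    embedded = {child for c in model["classes"] for child in c["contains"]}
    return {c["name"].lower() + "s": to_document(c)
            for c in model["classes"] if c["name"] not in embedded}

# Model-to-text step: emit the PSM as JSON (standing in for XMI generation).
print(json.dumps(pim_to_document_psm(pim), indent=2))
```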
Keywords: |
Big Data, MDA approach, NoSQL, QVT, PIM model. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
AN INTELLIGENT SOFT COMPUTING FRAMEWORK FOR FORECASTING STOCK MOVEMENT DIRECTION
IN INTERNATIONAL STOCK MARKETS |
Author: |
K. VENKATESWARARAO AND B. VENKATA RAMANA REDDY |
Abstract: |
Every investment in the stock market aims to maximize profit and minimize risk,
which is the key to a thriving economy. In today's economy, stock market data
prediction and analysis play a crucial role. The non-linear nature of stock
market data makes it challenging to analyze. Several studies have been published
recently that propose using soft computing techniques to predict stock trades.
Predicting consistent outperformance in the stock market has always been a
challenge because of the interaction of many factors. In this study, we present
an intelligent soft computing framework for forecasting stock movement direction
based on efficient feature optimization and hybrid detection technique. First,
we introduce an improved Ebola optimization (IEO) algorithm for data
preprocessing which filters the noise and irregular patterns from the gathered
data. As part of the recursive feature selection, the selection of underlying
model hyperparameters is critical. A Chaotic Farmland Fertility (CFF) algorithm
is suggested for the feature optimization process which effectively improves the
convergence speed for high-dimensional optimization problems. Next, a hybrid
Spiking-Quantum Neural Network (hybrid SQNN) framework is presented to forecast
stock movement direction which ensures avoidance of false prediction rates.
Following that, we validate our framework using international benchmark
datasets like the Australian stock market, the U.S. stock market, and China's wind
economy. The simulations have demonstrated the effectiveness of our framework
over existing frameworks in terms of error and quality metrics. |
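For orientation only, the following minimal Python sketch shows the basic task framing:
deriving movement-direction labels from closing prices and fitting a baseline classifier
on lagged returns. The IEO pre-processing, CFF feature optimization, and hybrid SQNN of
the paper are not reproduced, and the prices are synthetic.

```python
# Minimal sketch: label next-day movement direction and fit a baseline classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
prices = 100 * np.cumprod(1 + rng.normal(0, 0.01, size=1000))  # synthetic closes
returns = np.diff(prices) / prices[:-1]

lags = 5
X = np.column_stack([returns[i:len(returns) - lags + i] for i in range(lags)])
y = (returns[lags:] > 0).astype(int)  # 1 = next-day up move, 0 = down

X_tr, X_te, y_tr, y_te = train_test_split(X, y, shuffle=False, test_size=0.2)
clf = LogisticRegression().fit(X_tr, y_tr)
print("direction accuracy:", clf.score(X_te, y_te))
```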
Keywords: |
Soft Computing, International Stock Markets, Data Preprocessing, Feature
Optimization, Stock Movement Forecasting |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
DESIGN ADVENTURE ROLE PLAYING GAME WITH PROCEDURAL CONTENT GENERATION USING
PERLIN NOISE ALGORITHM |
Author: |
RAUL ANDRIAN, ANGGA ADITYA PERMANA, ADHI KUSNADI |
Abstract: |
Producing quality video game content is not easy. Creating game content requires a lot
of resources, considerable time, and high
costs. This problem can be solved by applying the Procedural Content Generation
or PCG method to create game content using certain algorithms without having to
create it manually. In this study, the Perlin Noise algorithm was used to create
a map for the game that was built. Perlin Noise produces realistic visuals and
sound effects in games; besides that, it is very efficient and easy to use.
This research was made with the aim of designing and building an Adventure RPG
game using the Perlin Noise algorithm, as well as measuring the level of player
satisfaction with the game that has been made. The game engine used in the
process of making this game is the 2019 version of the Unity 3D engine with the C#
programming language on the Windows platform. After the game creation process is
complete, it will be continued by testing using the Game User Experience
Satisfaction Scales 18 (GUESS-18) method which provides 18 questions to measure
the level of player satisfaction with games built with PCG. Based on the results
of the tests that have been carried out, the final score of 81.85% indicates that the
game that was built falls into the good category. |
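A minimal sketch of 2-D Perlin-style gradient noise for procedural map generation
follows, written in Python rather than the Unity/C# used in the study; the grid size,
seed, and terrain thresholds are illustrative.

```python
# Minimal sketch: 2-D Perlin-style gradient noise turned into a tile map.
import numpy as np

def perlin(width: int, height: int, cell: int = 16, seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    gx, gy = width // cell + 2, height // cell + 2
    angles = rng.uniform(0, 2 * np.pi, size=(gy, gx))
    grads = np.stack([np.cos(angles), np.sin(angles)], axis=-1)  # unit gradients

    ys, xs = np.mgrid[0:height, 0:width] / cell
    x0, y0 = xs.astype(int), ys.astype(int)
    fx, fy = xs - x0, ys - y0
    fade = lambda t: 6 * t**5 - 15 * t**4 + 10 * t**3  # Perlin's smoothstep

    def dot_corner(ix, iy, dx, dy):
        g = grads[iy, ix]
        return g[..., 0] * dx + g[..., 1] * dy

    n00 = dot_corner(x0,     y0,     fx,     fy)
    n10 = dot_corner(x0 + 1, y0,     fx - 1, fy)
    n01 = dot_corner(x0,     y0 + 1, fx,     fy - 1)
    n11 = dot_corner(x0 + 1, y0 + 1, fx - 1, fy - 1)
    u, v = fade(fx), fade(fy)
    top = n00 + u * (n10 - n00)
    bot = n01 + u * (n11 - n01)
    return top + v * (bot - top)

# Map heights to simple tile types for an adventure-RPG style map.
height_map = perlin(128, 96, cell=16, seed=42)
tiles = np.digitize(height_map, bins=[-0.1, 0.1])  # 0=water, 1=grass, 2=mountain
print(tiles.shape, np.bincount(tiles.ravel()))
```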
Keywords: |
Role Playing Game, Procedural Content Generation, Perlin Noise, Game User
Experience Satisfaction Scales |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
AN ENHANCED DYNAMIC ALGORITHM FOR ASSOCIATION RULES MINING |
Author: |
ALA IBRAHIM, MOHAMMAD SHKOUKANI |
Abstract: |
The data mining techniques have been applied to various fields. Frequent pattern
mining is one of the active research themes in data mining. It has an important
role in all data mining tasks such as clustering, classification, prediction,
and association analysis. Frequent pattern mining is the most time-consuming process
due to the massive number of patterns generated. Frequent patterns are generated
by using association rule mining algorithms that use candidate generation and
association rules such as the Apriori algorithm. Researchers showed an
extraordinary enthusiasm for information mining. Likewise, industrial companies
call for different information mining methods to better comprehend client
behavior, enhance the service provided and expand business revenues. Due to the
huge size of the data that exists in databases and warehouses, and because these
data are big, dynamic, and change frequently, it is hard and costly to perform
mining for frequent patterns and association rules from scratch every time. This
paper explores three algorithms. The first one is the classic Apriori algorithm
which finds frequent patterns and association rules from scratch every time it
works; any update on data will enforce the algorithm to repeat the full process.
The second algorithm is the dynamic algorithm which finds frequent item sets by
using the accumulated knowledge stored in a database table. This method takes
less processing time and computations to find frequent patterns from the data
than the classic Apriori. The third algorithm is an enhanced algorithm that we
have developed. It includes an indexing technique that helps to discover
frequent patterns quickly. The three algorithms were tested on a dataset
containing 127936 transactions that contain the items of the market that have
been purchased with 207 different types of these market items. The results
initially showed that the classical Apriori algorithm was better than the
dynamic and the enhanced algorithms. However, our enhanced algorithm was
78.54% better than the classic Apriori and 27.49% better than the dynamic algorithm in
frequent runs. The results were 169.93, 49.53, and 37.18 seconds for the classic
Apriori, the dynamic, and the enhanced algorithms, respectively. |
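For reference, a minimal Python sketch of Apriori-style frequent-itemset mining, the
baseline the paper compares against, is shown below; the transactions and support
threshold are toy values, and the enhanced algorithm's indexing technique is not
reproduced.

```python
# Minimal sketch: level-wise frequent-itemset mining in the style of Apriori.
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
]
min_support = 2  # absolute support count (illustrative)

def apriori(transactions, min_support):
    items = {i for t in transactions for i in t}
    frequent = {}
    candidates = [frozenset([i]) for i in items]
    k = 1
    while candidates:
        # Count candidate supports against the whole transaction database.
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # Generate (k+1)-candidates by joining frequent k-itemsets.
        keys = list(level)
        candidates = list({a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1})
        k += 1
    return frequent

for itemset, support in sorted(apriori(transactions, min_support).items(),
                               key=lambda kv: -kv[1]):
    print(set(itemset), support)
```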
Keywords: |
Big data, Data mining, Association rules, Indexing |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
Title: |
DETECTING USERS INTENT BY COMMUNITY QUESTION ANSWERING FOR INFORMATION RETRIEVAL |
Author: |
MEHDI GAOU, HICHAM TRIBAK, SALAH KRIT, SALMA GAOU |
Abstract: |
Information retrieval systems (IRs) have seen new types of tools called Community
Question Answering (CQA) systems. It is the user's need for precise information that
motivated the emergence of such systems. A CQA system, such as WikiAnswers, Yahoo!
Answers, or a domain-specific forum like Stack Overflow, can be contrasted with an
Internet search engine like Google in certain specific respects. Although
the idea of receiving a direct and targeted response to an issue seems very
attractive, the quality of the question itself can have a significant effect on
the likelihood of obtaining useful responses. Such an information retrieval
paradigm is particularly appealing when the problem cannot be answered directly
by the search engines due to the unavailability of relevant online content. A
good understanding of the underlying purpose of an issue is essential to meet
the information needs of the user better. In this article, we analyze the intent of
each question in CQA; the research problem arising from the previously stated objective
consists in estimating the best answer given a question, all of its responses, and the
metadata attached to it. The task is reducible to a classification problem, with the
"best" answers as a positive class and the rest as a negative class. We obtain
significant improvements in classification over the state of the art in this field. In
addition to textual features, a variety of metadata features are used to model a
user's intent, which helps the CQA service better answer similar questions by
recommending the most relevant respondents. |
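As a rough illustration of the classification framing described above, the following
Python sketch combines textual features of an answer with simple metadata features to
predict whether it is the "best" answer; the rows, metadata fields, and labels are
invented.

```python
# Minimal sketch: text + metadata features for best-answer classification.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

data = pd.DataFrame({
    "answer_text": ["Use a hash map for O(1) lookups.",
                    "idk maybe try google",
                    "Index the column, then the join is fast.",
                    "same problem here"],
    "answerer_reputation": [1200, 5, 800, 3],
    "answer_length": [33, 21, 41, 17],
    "is_best": [1, 0, 1, 0],
})

features = ColumnTransformer([
    ("text", TfidfVectorizer(), "answer_text"),                        # textual features
    ("meta", "passthrough", ["answerer_reputation", "answer_length"]), # metadata features
])
model = Pipeline([("features", features), ("clf", LogisticRegression())])
model.fit(data, data["is_best"])
print(model.predict(data))  # per-answer "best" / "not best" predictions
```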
Keywords: |
Community Question Answering, Question Retrieval, User Intent, Modeling Entry
Point Prediction. |
Source: |
Journal of Theoretical and Applied Information Technology
15th May 2023 -- Vol. 101. No. 9-- 2023 |
Full
Text |
|
|
|