Journal of Theoretical and Applied Information Technology
November 2024 | Vol. 102 No. 22
Title: |
THE EFFECT OF APPLIED ZAKAT ON WEALTH DISTRIBUTION USING AGENT-BASED MODELING |
Author: |
HADDAD ABDERRAHIM, EL MOSAID FADMA |
Abstract: |
Zakat, the third pillar of Islam, is a mandatory alms and one of the most
important tools of Islamic finance: it ensures wealth redistribution and is used
to reduce poverty in Muslim societies. It is intended to provide for the
immediate needs of the eight categories of eligible recipients recognized in the
Holy Quran in Surat Tawba, verse 60. The capitalist system automatically
generates a serious disparity between the rich and the poor and tries to reduce
this gap indirectly through taxation, whereas Zakat directly reaches the
neediest and most deprived category. Indeed, the goal of Zakat is to transform
the poor into Zakat givers by pulling them out of poverty. This is examined
through simulation with agent-based modeling (ABM), implemented in the
simulation tool NetLogo version 6.4p by introducing the Islamic concept of Zakat
into the Simple Economy model previously developed by Wilensky and used by [1]
and [2] to study the impact of Zakat on social wealth distribution. Many cases
are simulated with and without Zakat, and indicators such as the Gini index and
the Lorenz curve allow us to conclude that the wealth distribution is more equal
and the gap between the rich and the poor less significant when Zakat is
applied. |
Keywords: |
Zakat, Agent Based, Wealth Distribution, Netlogo, Islamic Finance |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
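The Gini index used above as an inequality indicator is simple to compute. A
minimal pure-Python sketch (illustrative only; the paper's simulations are run
in NetLogo, not in Python):

```python
def gini(wealth):
    """Gini coefficient of a wealth distribution.

    0 means perfect equality; values near 1 mean extreme inequality.
    Uses the standard closed form over rank-ordered wealth."""
    w = sorted(wealth)
    n = len(w)
    total = sum(w)
    if total == 0:
        return 0.0
    # G = (2 * sum(rank * w_rank) / (n * total)) - (n + 1) / n, ranks 1-based.
    ranked = sum(rank * value for rank, value in enumerate(w, start=1))
    return 2.0 * ranked / (n * total) - (n + 1.0) / n

# Equal wealth gives 0; one agent owning everything approaches 1.
print(gini([10, 10, 10, 10]))   # 0.0
print(gini([0, 0, 0, 100]))     # 0.75
```

A Lorenz curve can be read off the same sorted cumulative sums.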
Title: |
DEEP CNN BASED EMPIRICAL INVESTIGATIONS TO SKIN GRAZE UNCOVERING AND CATALOGUING
USING HYBRID FEATURES SELECTION |
Author: |
DR. SURESH BABU CHANDOLU, DR. P. VENU MADHAV, DR. KANNEBOINA ASHOK,
ANJANEYULU NAIK R, VENKATA NARAYANA T, G. N. SOWJANYA, KURRA UPENDRA CHOWDARY |
Abstract: |
Proper disease diagnosis is one of the most important steps in medical
treatment, and in terms of diagnosis, dermatology is one of the most uncertain
and difficult professions. Dermatologists regularly see many patients, because
skin lesions, a potentially deadly condition, can affect people of all ages. For
intelligent systems to diagnose skin cancer early and more accurately, skin
lesion detection and categorization are essential. The term "multiclass skin
lesions" refers to a group of subtypes of skin lesions, including basal cell
carcinoma (BCC), melanocytic nevus (NV), melanoma (MEL), actinic keratosis (AK),
benign keratosis lesion (BKL), squamous cell carcinoma (SCC), dermatofibroma
(DF), and vascular lesion (VASC). Multi-class classification remains a difficult
task due to the wide range of skin lesions and their high similarity, and
manually identifying various skin lesions from dermoscopy images requires a
significant amount of time and expense. It is therefore crucial to develop
automated diagnostic techniques that can more accurately classify skin lesions
of multiple classes. Hence, this study presents Deep Convolutional Neural
Network (DCNN)-based hybrid feature selection for multiclass skin lesion
detection and classification. The performance of the proposed architecture is
evaluated in terms of sensitivity, accuracy, and specificity. |
Keywords: |
Dermatology, Skin cancer, Skin lesion, Hybrid feature selection, DCNN |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
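The sensitivity, specificity, and accuracy used to evaluate the architecture
all derive from confusion-matrix counts. A small sketch with made-up counts
(not the paper's results):

```python
def binary_metrics(tp, fp, fn, tn):
    """Sensitivity (recall), specificity, and accuracy from confusion counts."""
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, accuracy

# In the multiclass lesion setting these are computed one-vs-rest per class
# (e.g. MEL vs. all others) and then averaged; the counts here are invented.
sens, spec, acc = binary_metrics(tp=90, fp=5, fn=10, tn=95)
```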
Title: |
EMOTION ANALYSIS OF TEXTUAL CONTENTS USING NATURAL LANGUAGE PROCESSING AND TEXT
MINING |
Author: |
SUBHASH CHANDRA GUPTA, NOOPUR GOEL |
Abstract: |
Background: As internet-based e-commerce applications have expanded, a tsunami
of comments has arisen in the digital world. Most of these comments are textual
in nature and reflect the emotions of people in their day-to-day activities on
these applications. Analyzing these comments is not an easy task; a model based
on machine learning classifiers is required to perform it. Methods: A model's
performance depends on its classifiers: the higher the count of correct
predictions made by the classifier, the better the model performs. The proposed
NLP-based model uses seven classifiers: KNN, logistic regression, SVM,
multinomial Naive Bayes, decision tree, random forest, and gradient boosting.
The input dataset is multiclass; during preprocessing, noise was removed and the
dataset was converted into a binary-class dataset. The model uses term
frequency-inverse document frequency (TF-IDF) vectorization for feature
extraction before applying the classifiers. Result and Analysis: Finally, a
comparative study analyzes the results obtained by all classifiers used in the
model. The analysis finds that the SVM classifier is the best scorer for the
model, with accuracy, F1 score, precision, recall, and AUC of 96.42%, 96.38%,
96.42%, 96.15%, and 96.35%, respectively. |
Keywords: |
Sentiment Analysis, Imbalanced dataset, NLP, Text Mining, Text summarization. |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
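The TF-IDF weighting used above for feature extraction can be sketched in a few
lines of plain Python. This is the unsmoothed textbook form; real toolkits such
as scikit-learn use smoothed variants, so exact weights differ:

```python
import math

def tfidf(docs):
    """Tiny TF-IDF vectorizer over tokenized documents.

    tf = term count / document length; idf = log(N / document frequency)."""
    n = len(docs)
    vocab = sorted({t for d in docs for t in d})
    df = {t: sum(1 for d in docs if t in d) for t in vocab}
    vectors = []
    for d in docs:
        vec = {t: (d.count(t) / len(d)) * math.log(n / df[t]) for t in vocab}
        vectors.append(vec)
    return vectors

docs = [["good", "service"], ["bad", "service"]]
vecs = tfidf(docs)
# "service" appears in every document, so its idf (and weight) is 0.
```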
Title: |
OPTIMIZED ANN HYPERPARAMETERS TO IDENTIFY MALICIOUS TRAFFIC IN NETWORKS |
Author: |
HIND KHOULIMI, OTHMAN BENAMMAR |
Abstract: |
Intrusion detection systems have been a critical research area for over three
decades. With the growth of internet traffic, the number of attacks that violate
the confidentiality, integrity, and authenticity of important data has increased
significantly. The advent of Artificial Intelligence (AI), especially Deep
Learning (DL), makes it possible to create models that detect malicious traffic
automatically, without human intervention. In this context, we propose an
Intelligent Security Management System (ISMS) based on detection, analysis, and
action engines. To guarantee efficiency and accuracy, we use an Artificial
Neural Network (ANN) as the classification model, and to achieve better
accuracy, different optimization algorithms are applied to select the best
hyperparameters (weights and biases) for the model. In this paper, we base our
study on the KDD Cup 99, NSL-KDD, and UNSW-NB15 datasets to evaluate the best
combination of ANN and optimization algorithm, using the metrics accuracy, loss,
training time, precision, recall, and F1-score. |
Keywords: |
Artificial Neural Network, Harris Hawks Optimization, Particle Swarm
Optimization, Spider Monkey Optimization, Cat Swarm Optimization |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
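Of the optimization algorithms listed in the keywords, Particle Swarm
Optimization is the easiest to sketch. A minimal pure-Python PSO over a toy
two-parameter loss surface standing in for the validation loss of an ANN; the
gains, bounds, and loss function are illustrative, not the paper's setup:

```python
import random

def pso(loss, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal Particle Swarm Optimization over box-constrained parameters."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [loss(p) for p in pos]
    gbest = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # Inertia plus pulls toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            val = loss(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < loss(gbest):
                    gbest = pos[i][:]
    return gbest

# Toy stand-in for a loss surface with an optimum at (0.1, 32),
# e.g. (learning rate, hidden units); purely hypothetical.
best = pso(lambda p: (p[0] - 0.1) ** 2 + ((p[1] - 32) / 100) ** 2,
           bounds=[(0.0001, 1.0), (1, 256)])
```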
Title: |
A SYSTEM FOR ANALYSING CALL DROP DYNAMICS IN THE TELECOM INDUSTRY USING MACHINE
LEARNING AND FEATURE SELECTION |
Author: |
CHEEPU BALAKRISHNA, DR. CINDHE RAMESH, S. MEGHANA, DR. C. DASTAGIRAIAH |
Abstract: |
Quality of service in the telecom industry plays a vital role in the growth and
economy of a country. In India, several telecom operators provide services under
a regulatory authority known as the Telecom Regulatory Authority of India
(TRAI). In the telecom domain, call dropping is a problem that deteriorates the
industry's performance in rendering services. It also causes inconvenience and
wasted time for users and reduces user satisfaction. There is a need for
technology-driven analysis with the help of Artificial Intelligence (AI) to
analyze call drop dynamics and support well-informed decisions. Existing
research reveals that Machine Learning (ML) helps analyze call drop dynamics.
However, a framework combining machine learning techniques with optimizations is
needed to improve performance in analyzing call drop dynamics in the telecom
industry. In this paper, we propose an ML framework for automatic analysis of
call drops in the telecom industry across all operators. The framework also
supports optimizations like feature engineering and dimensionality reduction to
improve the performance of machine learning models. We propose an algorithm
called Learning based Call Drop Analytics (LbCDA), which exploits feature
selection and trains multiple classifiers for call drop analytics. With
benchmark dataset variants from the telecom industry, our empirical study
reveals that our framework's Random Forest (RF) model outperforms the other
models with the highest accuracy of 87.40%. |
Keywords: |
Telecom Industry, Call Drop Analysis, Random Forest, Machine Learning,
Artificial Intelligence |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Title: |
AN EXTENSIVE REVIEW OF SECURITY ISSUES AND CHALLENGES IN FOG COMPUTING
ENVIRONMENT |
Author: |
W. ALGESHARI, M. SHER RAMAZAN, F. ALOTAIBI, K. ALYOUBI |
Abstract: |
As a crucial addition to cloud computing, fog computing provides localized
processing resources and services near end devices at the network edge. The
drawbacks of traditional cloud computing, like excessive latency, immobility,
and poor location awareness, are addressed by this design, which is especially
advantageous for the Internet of Things (IoT). Despite its advantages, fog
computing poses serious security risks because of its decentralized
architecture, close proximity to end users, and confined processing power. These
security issues become more serious as IoT device counts rise, increasing the
possibility of cyberattacks. This review article offers an extensive overview of
the security issues and challenges unique to fog computing systems. It outlines
various attack vectors, examines the design, traits, and vulnerabilities of fog
nodes, and addresses mitigating these risks with countermeasures. This paper
attempts to identify important security issues in fog computing and suggest
future strategies for resolving these issues to ensure safe and reliable fog-IoT
ecosystems by assessing previous research. |
Keywords: |
IoT, Cloud Computing, Fog Computing, Fog Attacks |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Title: |
APPLICATION OF THE SIMPLE ADDITIVE WEIGHTING METHOD IN THE PERFORMANCE
ASSESSMENT OF ENERGY COMPANIES BASED ON ROA AND DAR |
Author: |
ENI DUWITA SIGALINGGING, ERLINA, ISKANDAR MUDA, DIO AGUNG HERUBAWA |
Abstract: |
Performance evaluation of energy companies in Indonesia is very important to
support efficient and sustainable energy sector management. One way to evaluate
company performance is the Simple Additive Weighting (SAW) method, which
combines several criteria in one comprehensive assessment system. This study
tests the use of the SAW method in assessing the performance of energy companies
in Indonesia, focusing on two main indicators: Return on Assets (ROA) and Debt
to Asset Ratio (DAR). Data were taken from the financial statements of energy
companies listed on the Indonesia Stock Exchange. The SAW method is applied by
giving weights to the criteria, normalizing the data, and calculating a ranking
based on the normalized decision matrix. The research is expected to support a
more efficient and accurate decision support system for assessing the
performance of energy companies, and to support government policies in managing
the energy sector sustainably. Using the SAW method is expected to make the
evaluation process faster, more efficient, and more accurate than the manual
method usually used today. |
Keywords: |
Debt to Asset Ratio, Performance assessment, Return on Assets, Simple Additive
Weighting. |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
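The SAW steps described above (weighting, normalization, ranking) translate
directly into code. A minimal sketch; the weights and financial figures below
are illustrative, not taken from the paper:

```python
def saw_rank(matrix, weights, benefit):
    """Simple Additive Weighting: normalize each criterion, then weighted sum.

    matrix[i][j]: value of criterion j for alternative i.
    benefit[j]: True for benefit criteria (e.g. ROA, higher is better),
    False for cost criteria (e.g. DAR, lower is better)."""
    cols = list(zip(*matrix))
    scores = []
    for row in matrix:
        s = 0.0
        for j, wj in enumerate(weights):
            # Benefit: x / max; cost: min / x (standard SAW normalization).
            norm = row[j] / max(cols[j]) if benefit[j] else min(cols[j]) / row[j]
            s += wj * norm
        scores.append(s)
    return scores

# Two hypothetical companies scored on ROA (benefit) and DAR (cost),
# weighted 0.6 / 0.4; all numbers are made up for illustration.
scores = saw_rank([[0.12, 0.40], [0.08, 0.25]], [0.6, 0.4], [True, False])
best = scores.index(max(scores))
```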
Title: |
OPTIMIZING CROP YIELD PREDICTION: A HYBRID APPROACH INTEGRATING CNN AND LSTM
NETWORKS |
Author: |
HUI HUI WANG, YIN CHAI WANG, BUI LIN WEE, JANE KHOO YAN, FARASHAZILLAH BINTI
YAHYA |
Abstract: |
Artificial Intelligence (AI) has proven successful in revolutionizing the
agricultural sector, facilitating advancements in prediction, decision-making,
and the monitoring and analysis of crops and soil. In this study, a hybrid model
is introduced with the capability to predict crop yield. The proposed learning
model combines the strengths of Convolutional Neural Network (CNN) with
Recurrent Neural Network (RNN) models. CNN, recognized for its superior
performance in feature extraction, is selected for its characteristic of
considering a smaller number of parameters in the network, thereby reducing the
risk of overfitting. Simultaneously, RNN serves as the prediction model,
capitalizing on its inherent learning nature, feedback network, and ability to
encode temporal sequence information. Addressing the short-term memory behaviour
of RNN, the network is enhanced with LSTM cells, enabling effective long-term
memory tasks. LSTM introduces memory blocks to resolve the exploding and
vanishing gradient problem, differentiating itself from conventional RNN units.
The best environmental parameters were identified using correlation analysis,
which reveals the parameters most significantly related to crop production. The
hybrid approach integrating CNN and LSTM networks achieved 74% accuracy in crop
yield prediction. |
Keywords: |
Agriculture, Convolutional Neural Network, Crop Yield Prediction, Machine
Learning, Recurrent Neural Network (RNN) |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
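The correlation-based parameter selection described above reduces to the
Pearson coefficient between each environmental series and yield. A sketch with
made-up rainfall and yield values (purely illustrative):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical readings: rainfall tracks yield closely here, so it would be
# retained as a model input; a near-zero parameter would be dropped.
rainfall = [80, 100, 120, 140, 160]
yield_t = [2.1, 2.5, 3.0, 3.4, 3.9]
r = pearson(rainfall, yield_t)
```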
Title: |
A SCALABLE AND SECURE BLOCKCHAIN-BASED HEALTHCARE SYSTEM: OPTIMIZING
PERFORMANCE, SECURITY, AND PRIVACY WITH ADAPTIVE TECHNOLOGIES |
Author: |
P. VINAYASREE, A. MALLIKARJUNA REDDY |
Abstract: |
This paper presents a scalable, secure blockchain-based healthcare system
architecture that efficiently manages extensive patient data while ensuring high
security. Adaptive Partitioned Filters (APFs) and Compact Patricia Tries (CPTs)
enable efficient data access and management, while Sharded Byzantine Optimized
Consensus (SBOC) and Go's concurrency model facilitate parallel transaction
processing. Security is provided through Bloom filters, Patricia tries extended
by Merkle trees, and an immutable blockchain ledger protected by Practical
Byzantine Fault Tolerance (PBFT). Verifiable Random Functions (VRFs) secure
participant selection for consensus, and zero-knowledge proofs (zk-SNARKs)
verify transactions without revealing sensitive information, aligning with
healthcare regulations. ChaCha20 encrypts sensitive data, and Role-Based Access
Control (RBAC) governs access rights. This architecture offers a comprehensive
solution for scalable, efficient, and secure healthcare data management in
blockchain environments. |
Keywords: |
Blockchain, Adaptive Partitioned Filters (APFs), Compact Patricia Tries (CPTs),
Sharded Byzantine Optimized Consensus (SBOC), Zero-Knowledge Proofs
(zk-SNARKs), ChaCha20 Encryption, Role-Based Access Control (RBAC), Verifiable
Random Functions (VRF) |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
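Of the components named above, the Bloom filter is the easiest to illustrate:
it answers membership queries with no false negatives and a tunable false
positive rate. A minimal sketch, not the authors' implementation; the record
identifiers are hypothetical:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash positions per item over a fixed bit array.

    A lookup layer like this can pre-filter record queries before touching
    slower storage; false positives are possible, false negatives are not."""
    def __init__(self, size=1024, hashes=3):
        self.size = size
        self.hashes = hashes
        self.bits = [False] * size

    def _positions(self, item):
        # Derive k positions by salting the item with the hash index.
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = True

    def might_contain(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("patient-record-42")
# Present items always answer True; absent items answer False
# with high probability at this size.
```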
Title: |
EVOLVING INTERESTS AND PREFERENCES USING ARIMA AND STL IN SOCIAL MEDIA SEARCH
BEHAVIORS |
Author: |
B. RAMESH BABU, S. RAMAKRISHNA |
Abstract: |
This study investigates the nuances of user search behavior on social media
platforms by employing sophisticated time series algorithms: the Autoregressive
Integrated Moving Average (ARIMA) and Seasonal Decomposition of Time Series
(STL). In an era where social media constitutes a vital conduit for information
dissemination and interaction, understanding the patterns of user searches and
engagement becomes crucial. This research aims to dissect the multifaceted
nature of search behavior, focusing on trends, user engagement, and the
popularity of content. Through the integration of ARIMA and STL models, we
meticulously analyze the temporal aspects of these behaviors, unveiling the
progression of user interests and preferences. The research methodology
encompasses a comprehensive approach to data collection and preprocessing,
followed by the application of these advanced algorithms to model and scrutinize
search patterns effectively. Additionally, the study examines the influence of
external factors on search behavior and evaluates algorithms for content
recommendation, aiming to optimize content delivery and bolster user engagement
on social media platforms. Notably, the proposed method of combining ARIMA with
STL has demonstrated a significant improvement in predictive accuracy,
surpassing traditional models. Specifically, the ARIMA+STL model showed
remarkable enhancements, reducing the Mean Absolute Error (MAE) by approximately
67% and the Root Mean Squared Error (RMSE) by nearly 81% compared to baseline
models. Moreover, the Akaike Information Criterion (AIC) was notably lower,
indicating a superior model fit with optimized complexity. These findings
underscore the effectiveness of our approach in capturing the intricate dynamics
of social media search behavior, offering valuable insights for developing more
refined digital engagement strategies. |
Keywords: |
Social Media Analytics, ARIMA Model, STL Decomposition, User Engagement, Search
Behavior Patterns |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
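The seasonal decomposition underlying STL can be illustrated with a naive
additive version: trend via a centered moving average (exact for odd periods)
and a per-phase seasonal mean of the detrended series. STL proper uses Loess
smoothing; the values below are synthetic, not social media data:

```python
def decompose(series, period):
    """Naive additive decomposition into trend and seasonal components.

    Trend is a centered moving average of width `period` (odd periods only);
    seasonal[p] is the mean detrended value at phase p within the period."""
    n = len(series)
    half = period // 2
    trend = [None] * n
    for i in range(half, n - half):
        window = series[i - half:i + half + 1]
        trend[i] = sum(window) / len(window)
    seasonal = []
    for phase in range(period):
        vals = [series[i] - trend[i]
                for i in range(phase, n, period) if trend[i] is not None]
        seasonal.append(sum(vals) / len(vals) if vals else 0.0)
    return trend, seasonal

# Synthetic search counts with a pure period-3 cycle on a flat level of 10.
series = [10, 14, 6] * 8
trend, seasonal = decompose(series, period=3)
# Interior trend is flat at 10; seasonal effects are 0, +4, -4.
```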
Title: |
THE DETERMINANTS OF SOFTWARE DEVELOPERS’ INTENTION TO ADOPT CHATGPT AS
PROGRAMMING INFORMATION MEDIA |
Author: |
IFVO DEKY WIRAWAN, TANTY OKTAVIA |
Abstract: |
In the digital era, technology is developing rapidly. One development in the
field of artificial intelligence is the chatbot: a computer program designed to
simulate conversation or interactive communication with (human) users via text,
voice, or visuals. One chatbot that is currently popular is ChatGPT. ChatGPT
models are trained to interact in a conversational manner, with a dialogue
format that allows ChatGPT to answer follow-up questions, admit mistakes,
challenge false premises, and reject inappropriate requests. Software developers
are among the users who can take advantage of ChatGPT. Developers often face
problems in the coding process, including code or algorithms that are too
complicated, making the code difficult for others to understand, and problems in
the code that are sometimes difficult to overcome. The purpose of this research
is to analyze which factors influence software developers' intention to adopt
ChatGPT as an information medium in programming. The research model is UTAUT.
This study surveyed 399 participants through distributed questionnaires; the
collected data were then processed and analyzed using the PLS-SEM method. The
findings reveal that Performance Expectancy, Effort Expectancy, Trust, Perceived
Risk, and Experience significantly influence Behavioral Intention, and that
Behavioral Intention has a significant impact on the Intention to Adopt.
Meanwhile, Social Influence and Facilitating Conditions do not have a
significant effect on Behavioral Intention. |
Keywords: |
Artificial Intelligence, Chatbot, ChatGPT, SMART PLS, UTAUT |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Title: |
THE ROLE OF CITIZEN ENGAGEMENT IN DEMOCRATIC GOVERNANCE ENHANCEMENT THROUGH
E-GOVERNANCE: A CASE STUDY OF LUSAKA CITY COUNCIL, ZAMBIA |
Author: |
LAMECK NSAMA, CHIMEKO KENNY WEBSTER, BIBHUTI BHUSAN DASH, NGULA WALUBITA, UTPAL
CHANDRA DE, SUDHIR KUMAR BEHERA, SATYENDR SINGH, SUDHANSU SHEKHAR PATRA |
Abstract: |
The study investigates how e-governance can foster citizen engagement in
democratic local governance, with an emphasis on the Lusaka City Council (LCC).
Democratic governance involves principles of openness, responsibility, and
involvement, and the introduction of e-governance brings new opportunities for
improving communication between the government and citizens. This paper assesses
the impact of e-governance projects executed by the Lusaka City Council on
democratic politics and outcomes. The study draws on diverse sources, combining
qualitative and quantitative data. Interviews with local inhabitants and
councilors were conducted to learn about their experiences with and perceptions
of e-governance. Furthermore, a content audit and an analysis of the documents
and e-governance applications existing in LCC were carried out to examine their
accessibility, utility, and effects. The research finds that e-governance has
enhanced citizens' engagement in local governance. The availability and use of
information through digital platforms have made governance more politically
responsive by raising standards of accountability. Internet-based channels such
as discussion forums and online feedback avenues have also allowed citizens to
voice more opinions on issues and decision-making processes. The study further
outlines barriers, including digital literacy and digital infrastructure, that
influence the success of e-governance programs in Zambia. Thus, the research
indicates that while e-governance can enhance democratic accountability at the
grassroots, its effectiveness remains contingent on overcoming these challenges
and enabling everyone to engage. The case of the Lusaka City Council offers
other municipalities valuable lessons on the prospects of e-governance for
citizens' engagement and the fortunes of democracy. |
Keywords: |
Lusaka, Citizen Engagement, E-governance, Public Service, Democratic
e-governance, Participatory governance |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Title: |
INSIGHTS INTO METADATA COMPONENTS: A SYSTEMATIC REVIEW OF ENTERPRISE DATA
CATALOGS |
Author: |
ERIC WIJAYA, TANTY OKTAVIA |
Abstract: |
In the contemporary data-centric business environment, proficient metadata and
data documentation processes are essential for organizations aiming to maximize
the value of their data assets. A data-driven architecture, coupled with
Business Intelligence (BI) tools, promotes data democratization, allowing
stakeholders throughout the business to utilize data. This research examines
metadata elements in the development of BI systems. A thorough literature search
and a preliminary analysis are conducted to understand the landscape of metadata
classification. The study delves into existing research on metadata and data
catalogue management in enterprises, using a systematic literature review (SLR)
to identify specific metadata components. The SLR results not only describe the
functions of each metadata component but also provide practical guidance on how
to adopt them, offering helpful insights for firms wishing to enhance their data
management. In addition, a bibliometric study investigates trends and
partnerships in metadata research, providing further information on efficient
metadata implementation. These findings have implications for firms looking to
improve their data processes and achieve competitive advantage, providing new
insights into management tactics and opening the way for future research on
metadata and data catalogue systems. |
Keywords: |
Metadata, Systematic Literature Review, Bibliometric, Catalogue System |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Title: |
AN INTELLIGENT MODEL FOR INTEGRATING ENTERPRISE DATA WAREHOUSE LAYERS TO MANAGE
SCHEMA EVOLUTION |
Author: |
WADEA GEORGE, MOHAMMED MARIE, AHMED YAKOUB |
Abstract: |
In contemporary organizations, data warehousing centralizes and manages data
from diverse sources, such as relational databases and semi-structured systems
like Temenos core banking (T24) XML systems, facilitating data analytics and
informed decision-making. However, the dynamic nature of data and evolving
source schemas pose challenges in adapting data warehouses efficiently, leading
to interruptions in data loading processes. This research proposes an
intelligent model to enhance data integration in data warehouses, aiming to
automate or semi-automate the development cycle and integrate various layers,
including business analysis, data modeling, ETL (Extract, Transform, Load), and
data quality enhancement through rejection handling. Leveraging metadata, data
dictionaries, data mining techniques, and Data Vault modeling, the model aims to
reduce development time and costs. The proposed model offers an efficient
solution to adapt to changes, significantly reducing adaptation costs compared
to prior approaches. By seamlessly integrating all layers of the data warehouse
(DWH), it streamlines development cycles through automation or semi-automation,
introducing additional features to expedite the process. This research
demonstrates the model's superiority through three implemented experiments,
showing significant time savings and cost reductions (75% decrease compared to
manual processes). It successfully integrates development layers, semi-automates
the development process, filters rejected data, and provides a clear vision of
schema storage during the deployment of new changes. |
Keywords: |
ETL; DWH; Data modeling; Schema evolution; Information Retrieval; Metadata. |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
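Column-level schema evolution of the kind the model must absorb can be detected
by diffing successive source schemas. A minimal sketch with hypothetical column
sets (not the paper's model or the T24 schema):

```python
def schema_changes(old, new):
    """Detect column-level schema evolution between two table versions.

    old/new map column name -> declared type. Returns added, removed, and
    retyped columns; a loader could use this to adapt the DWH and ETL
    mappings instead of failing mid-load."""
    added = {c: t for c, t in new.items() if c not in old}
    removed = {c: t for c, t in old.items() if c not in new}
    retyped = {c: (old[c], new[c]) for c in old if c in new and old[c] != new[c]}
    return added, removed, retyped

# Hypothetical customer table before and after a source schema change.
old = {"cust_id": "INT", "name": "VARCHAR(50)", "fax": "VARCHAR(20)"}
new = {"cust_id": "INT", "name": "VARCHAR(100)", "email": "VARCHAR(100)"}
added, removed, retyped = schema_changes(old, new)
# added: email; removed: fax; retyped: name VARCHAR(50) -> VARCHAR(100)
```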
Title: |
FREQUENCY STABILIZATION WITH SOLAR PV INTEGRATION IN TWO-AREA INTERCONNECTED
MICRO GRID SYSTEM |
Author: |
ENDOORI RANI, DR. K. NAGA SUJATHA |
Abstract: |
Microgrids (MGs) are becoming more reliant on Renewable Energy Sources (RESs) to
meet consumer demand, but the fluctuating output of RESs, combined with the
unpredictable behaviour of loads, can contribute to frequency instability. The
incorporation of additional renewable energy sources enhances the system's
inertia, contributing to frequency stabilization. This work examines frequency
regulation (FR) in a microgrid system with two-area control, incorporating a
variety of energy sources such as electric vehicles (EVs), fuel cells, wind
turbines, energy storage systems, conventional generators, and solar PV systems.
A solar photovoltaic (PV) system equipped with an inverter was integrated into
the existing microgrid, and the microgrid's operation was analysed with and
without the PV system, specifically examining deviations in frequency and power.
Classical controllers, including I, PI, and PID, are implemented in Area-1 and
Area-2 of the microgrid to enhance frequency and power stability. The magnitudes
of frequency and power fluctuations were tabulated to compare the performance of
Area-1 and Area-2 of the microgrid with and without the PV system. |
Keywords: |
Solar PV System, Frequency Regulation, Storage System, Distributed Energy
Systems, Renewable Energy Sources, Single Area System, Two-Area Interconnected
System |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
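The classical PI control mentioned above can be illustrated on a toy
single-area frequency loop. The first-order plant, damping, and gains below are
illustrative stand-ins, not the paper's microgrid model:

```python
def pi_frequency_control(disturbance, kp=0.5, ki=0.2, dt=0.1, damping=0.8):
    """Toy single-area frequency loop under a discrete PI controller.

    Each step the frequency deviation is driven by a load disturbance,
    natural damping, and the PI control action; the setpoint is zero
    deviation. Returns the deviation trajectory."""
    f_dev, integral, history = 0.0, 0.0, []
    for d in disturbance:
        error = -f_dev                      # deviation from the setpoint
        integral += error * dt
        u = kp * error + ki * integral      # PI control signal
        f_dev += dt * (d + u - damping * f_dev)   # Euler step of the plant
        history.append(f_dev)
    return history

# Step load disturbance of 1 p.u.: the deviation rises transiently, then the
# integral action drives it back toward zero.
hist = pi_frequency_control([1.0] * 400)
```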
Title: |
LSTM NEURAL NETWORKS WITH BAYESIAN OPTIMIZATION FOR REORDER SIMULATION IN RETAIL
INVENTORY MANAGEMENT |
Author: |
INTAN RAHMATILLAH, IMAN SUDIRMAN, ANTON MULYONO AZIS, IVAN DIRYANA SUDIRMAN |
Abstract: |
This research aims to improve the accuracy of demand forecasting in a grocery
store by optimizing LSTM neural networks with Bayesian Optimization. Research
Design, Data, and Methodology: This study forecasts demand using one year of
daily sales data for Brand AB instant noodles from a grocery store. Well tuned
by Bayesian optimization, the LSTM model can encode the intricate temporal
structure and long-term dependencies in sales sequences. Result: The optimized
LSTM model achieves a low mean squared error (MSE) of 0.0056, indicating good
predictability, and a simple reorder simulation incorporates the enhanced
forecast model to refine inventory management decisions. Through this
simulation, critical reorder points are established to ensure that the product
is always available, with no risk of stockouts or overstocked items. Conclusion:
The results demonstrate the great potential of sophisticated neural networks and
the Bayesian optimization approach for better inventory management in the retail
industry. |
Keywords: |
LSTM Neural Networks, Bayesian Optimization, Inventory Management, Reorder
Simulation |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
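The reorder-point logic of such a simulation can be sketched with the textbook
rule: expected lead-time demand plus safety stock. The daily forecasts and
service level below are hypothetical, and the rule is a simple stand-in for the
paper's simulation:

```python
def reorder_point(daily_forecast, lead_time_days, safety_factor=1.65):
    """Reorder point = expected lead-time demand + safety stock.

    daily_forecast: forecast demand per day (e.g. produced by an LSTM model).
    safety_factor: z-score for the target service level (1.65 ~ 95%)."""
    mean = sum(daily_forecast) / len(daily_forecast)
    var = sum((d - mean) ** 2 for d in daily_forecast) / len(daily_forecast)
    lead_demand = mean * lead_time_days
    safety_stock = safety_factor * (var * lead_time_days) ** 0.5
    return lead_demand + safety_stock

# Hypothetical daily demand forecasts for the next week; reorder when
# on-hand inventory drops below this level.
rop = reorder_point([20, 22, 19, 25, 21, 18, 24], lead_time_days=3)
```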
Title: |
OPTIMIZED EFFICIENTNET WITH GENETIC EXPRESS PROCESSING FOR ACCURATE LUNG DISEASE
CLASSIFICATION |
Author: |
YELLEPEDDI SAMBA SIVA KRISHNA ASSISH, KUPPUSAMY P |
Abstract: |
The ongoing COVID-19 pandemic underscores the urgency for rapid and precise
diagnostic tools. This study presents an innovative approach for classifying the
lung diseases COVID-19, asthma, and pneumothorax using Computed Tomography (CT)
lung images. EfficientNet B4 is proposed to classify lung diseases accurately
using compound scaling features, including depth, width, and resolution. By
integrating the EfficientNet model with a Genetic Express Processing (GEP)
algorithm for hyperparameter tuning, the proposed method focuses on optimizing
dropout, learning rate, and batch size. Fine-tuning the EfficientNet B4 model
through compound scaling and hyperparameter optimization led to a classification
accuracy of 96.5%. Visualizing lung-infected regions using Class Activation Maps
(CAMs) provides insights into classification decisions. This work incorporates
Generative Adversarial Networks (GANs) to generate synthetic images that enhance
data diversity and model generalization. The method combines Deep Learning (DL)
models with Genetic Algorithms (GAs) and GANs, demonstrating substantial
improvements in disease detection accuracy. The proposed approach offers medical
professionals efficient diagnostic tools for early and reliable disease
diagnosis. Code is available at
https://github.com/YellepeddiSambaSivaKrishnaAssish/Optimized-EfficientNet-using-GEP-for-Lung-diseases.git. |
Keywords: |
Computed Tomography, COVID-19, Genetic Express Processing, Artificial
Intelligence, Optimization, Deep Learning |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
INTERACTIVE LEARNING PLATFORMS TO STEAM (I-LPS) GAMIFICATION FOSTERING
COMPUTATIONAL INNOVATORS AND CREATIVE THINKING IN TEACHER EDUCATION |
Author: |
PONGSATON PALEE, JITTIMA PANYAPISIT, ADIREK YAOWONG, THIPWIMOL WANGKAEWHIRAN,
PANITA WANNAPIROON |
Abstract: |
This study aims to analyze, synthesize, design, and develop interactive
learning platforms with STEAM (I-LPS) gamification fostering computational
innovators and creative thinking in teacher education. The research instruments
included a manual for I-LPS, active learning lesson plans, an active
citizenship competencies test, and a satisfaction questionnaire. Statistics for
data analysis were percentage, mean, standard deviation, and the
dependent-sample t-test. The research applies the concept of Research and
Development (R&D) and organizes the work into 5 phases: Phase 1 (R1) study and
synthesis of the conceptual framework for the development of interactive
learning media with a simplified active learning model; Phase 2 (D1) design of
the conceptual framework for developing interactive learning materials with the
active learning model; Phase 3 (R2) evaluation of the conceptual framework of
the gamified digital learning ecosystem through expert opinions; Phase 4 (D2)
creation of interactive learning materials with the gamified active learning
model to promote innovative skills; and Phase 5 (R3) evaluation of achievement
and satisfaction. Overall student satisfaction with the interactive learning
materials was high (mean = 4.29, S.D. = 0.65), and the measurement before
studying and the learning achievement after studying were statistically
significantly different at the .01 level. |
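The dependent-sample t-test named among the study's statistics compares each student's pre- and post-scores pairwise. A minimal sketch of the statistic (the score lists below are made-up examples, not the study's data):

```python
import math
import statistics

def paired_t(before, after):
    """Dependent-samples t statistic: t = mean(d) / (sd(d) / sqrt(n)),
    where d are the per-student score differences (after - before)."""
    d = [a - b for a, b in zip(after, before)]
    n = len(d)
    return statistics.mean(d) / (statistics.stdev(d) / math.sqrt(n))

# Hypothetical pre/post scores for three students.
t_value = paired_t([1, 2, 3], [2, 3, 5])
```

The resulting t would then be compared against the critical value for n - 1 degrees of freedom at the chosen significance level (.01 in the study).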
Keywords: |
Interactive Learning Platform, STEAM, Computational Innovators, Creative
thinking |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
RECURRENT NEURAL NETWORK-BASED BRAIN TUMOR CLASSIFICATION USING 3D MAGNETIC
RESONANCE IMAGING |
Author: |
Dr. S. KOTHER MOHIDEEN, K. A. MOHAMED RIYAZUDEEN |
Abstract: |
Brain tumor identification and segmentation using magnetic resonance imaging
(MRI) represent difficult but critical tasks for a variety of applications in
the area of medical analysis, including cancer detection and treatment. A brain
tumor is a collection of aberrant cells in certain brain tissues; it may be
malignant or benign. Glioma and meningioma are the most frequent kinds of brain
tumors. Early identification of tumor cells plays an
important role in patient therapy and recovery. Brain tumor diagnosis is
typically subject to a highly complex and time-consuming procedure. MRI scans of
different patients may be utilized to identify cancers at various stages. There
are many kinds of functional extraction and classification techniques used to
identify brain tumors from 3D MRI brain images. Specifically, in this article,
we offer a preprocessing method that works just on a small portion of the image
rather than the whole image to produce a flexible and efficient brain tumor
segmentation system. This technique reduces the amount of computation time
required and eliminates the overfitting issues that may occur in a deep-learning
model. In the second phase, we suggest a simple and effective Recurrent Neural
Network (RNN) that operates on the smaller portion of the brain image in each
slice. The RNN algorithm combines local and global characteristics, and it does
so in two distinct ways. Additionally, it
enhances the accuracy of brain tumor segmentation when compared to current
state-of-the-art models. The image classification method for the RNN enables the
early detection of the tumor with great accuracy. We present a recurrent neural
network design for tumor cell identification that is about 99.92% accurate. An
RNN is a kind of artificial neural network in which connections between nodes
form a directed graph along a temporal sequence. |
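The defining property of an RNN — a hidden state carried along a temporal sequence — can be shown with a single-unit recurrence. This is a generic illustration of the mechanism, not the paper's network; all weights are arbitrary:

```python
import math

def rnn_step(h, x, w_h, w_x, b):
    """One recurrence step for a single hidden unit:
    h_t = tanh(w_h * h_{t-1} + w_x * x_t + b)."""
    return math.tanh(w_h * h + w_x * x + b)

def rnn_forward(xs, w_h=0.5, w_x=1.0, b=0.0):
    """Fold a sequence of inputs into a final hidden state."""
    h = 0.0
    for x in xs:
        h = rnn_step(h, x, w_h, w_x, b)
    return h
```

In the paper's setting the inputs would be features extracted from successive MRI slice regions; the same state-passing structure applies per slice.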
Keywords: |
Magnetic Resonance Image, Malignant, Glioma, Meningioma, Recurrent Neural
Network |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
EXPLORING MACHINE LEARNING METHODS FOR INTRUSION DETECTION SYSTEM: A DEEP DIVE
INTO TECHNIQUES, DATASETS, AND PERSISTENT CHALLENGES |
Author: |
SHAIK JOHNY BASHA, D. VEERAIAH, SUMALATHA LINGAMGUNTA |
Abstract: |
Intrusion Detection Systems (IDS) play a crucial role in safeguarding modern
digital infrastructures by identifying potential threats and anomalies in real
time. As cyberattacks become more sophisticated, leveraging Machine Learning
(ML) techniques in IDS has emerged as a promising approach to enhance detection
accuracy, adaptability, and resilience. This paper provides an in-depth
exploration of various ML methods applied to IDS, categorizing techniques such
as supervised, unsupervised, and reinforcement learning. Additionally, it delves
into the most widely used datasets for training and evaluating IDS models, highlighting
their characteristics, advantages, and limitations. Furthermore, the paper
addresses the persistent challenges in deploying ML-driven IDS, including issues
related to data imbalance, real-time performance, adversarial attacks, and model
generalization. Through a comprehensive analysis of current research and future
directions, this survey aims to offer insights into the evolving landscape of
ML-based IDS, paving the way for more robust and scalable solutions in the face
of ever-evolving cyber threats. |
Keywords: |
IDS Dataset, Cyber security, Machine Learning, Intrusion Detection System (IDS),
Network Security |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
MULTI-CLASS PLANT DISEASE CLASSIFICATION AND STAGE-WISE SEVERITY PREDICTION
BASED ON MULTI-CLASS LEAF LABELLING |
Author: |
D.LITA PANSY, M.MURALI |
Abstract: |
Plant Disease Severity Prediction (PDSP) aids in increasing the yield of a
plant. However, little research has been done on the severity prediction of
multi-class Plant Disease (PD). Hence, this paper proposes multi-class leaf
labeling and Exponential Pareto Fuzzy (EP-Fuzzy) based disease severity
prediction. Primarily, input images with complex backgrounds are taken and the
background is removed from those images. The obtained background-removed
images are utilized for increasing the dataset via augmentation and providing
labeled data for the leaf labeling model utilizing augmented images. The
unlabelled images are labelled in the Spearman-based Pseudo Labelling (S-PL)
model by utilizing the augmented images. The labelled images and saliency-mapped
images are fused to enhance the Segmentation Accuracy (SA) of Seam
Carving-Region Split and Merge (SC-RSM). Afterward, the Depth-wise ResNet-50
(DRN) classifies the leaf disease class using the segmented image and the
extracted features. By utilizing the EP-Fuzzy model, the severity stage is
predicted for the classified diseased leaf. The proposed technique’s
performance is experimentally assessed; it exhibits superior performance in
accuracy, training time, overall prediction rate, Dice Similarity Coefficient
(DSC), et cetera. |
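Fuzzy severity staging of the kind the abstract describes typically maps a lesion measure to overlapping stage membership functions and picks the stage with the highest membership. A generic sketch — the EP-Fuzzy model itself is not public, and the stage names and lesion-area-ratio ranges below are hypothetical:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical lesion-area-ratio ranges per severity stage.
STAGES = {
    "healthy":  (-0.1, 0.0, 0.1),
    "early":    (0.0, 0.15, 0.3),
    "moderate": (0.2, 0.4, 0.6),
    "severe":   (0.5, 0.8, 1.1),
}

def severity_stage(lesion_ratio):
    """Stage with the highest fuzzy membership for the given ratio."""
    return max(STAGES, key=lambda s: tri(lesion_ratio, *STAGES[s]))
```

Overlapping supports are what make the staging fuzzy: a ratio of 0.25, for example, has nonzero membership in both the early and moderate stages.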
Keywords: |
Plant Disease Severity Prediction (PDSP), Multi-class Leaf label, saliency
maping, Spearman-based Pseudo labelling (S-PL), Depth-wise ResNet-50 (DRN). |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
DATA-DRIVEN INSIGHTS INTO FISHING PATTERNS USING VMS AND MACHINE LEARNING |
Author: |
KANDA MAHENDRA, TANTY OKTAVIA |
Abstract: |
The fishing industry plays a crucial role in global food security, yet continues
to face significant monitoring and regulatory challenges. One of the most
pressing issues is accurately tracking fishing vessel behavior, especially given
the rising concerns of illegal, unreported, and unregulated (IUU) fishing that
threatens marine ecosystem sustainability. Current monitoring systems often
struggle to reliably distinguish between legitimate fishing operations and
suspicious activities. To address this challenge, our study introduces an
innovative approach combining Vessel Monitoring System (VMS) data with a Hidden
Markov Model (HMM) to track and analyze vessel movements. This method focuses on
vessel speed patterns to identify different fishing activities including
hauling, traveling, and active fishing. To strengthen the accuracy of our
analysis, we enhanced the HMM approach by incorporating three complementary
machine learning techniques: Naive Bayes, Support Vector Machine (SVM), and
Gradient Boosting Machine (GBM). This combined approach allows us to better
understand and classify various fishing activities by examining speed patterns
and movement transitions. Our results demonstrate significant improvements in
detecting and classifying fishing activities, particularly in distinguishing
between different phases of fishing operations and identifying unusual patterns
that might indicate illegal activities. The study concludes that this integrated
approach substantially improves our ability to monitor fishing activities, with
notably higher accuracy rates in classification. These findings offer promising
implications for fisheries management, providing a practical and effective way
to monitor fishing activities and promote sustainable practices. Our framework
offers a flexible and powerful tool for fisheries regulators, helping them
better protect marine resources through improved surveillance and monitoring
capabilities. |
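Decoding vessel activity from speed patterns with an HMM, as the abstract describes, amounts to Viterbi decoding over hidden activity states. A two-state sketch with discretized speed bins — all probabilities are illustrative, not estimated from VMS data, and the real model would include more states (e.g. hauling):

```python
import math

STATES = ["fishing", "traveling"]  # vessels move slowly while fishing
START = {"fishing": 0.5, "traveling": 0.5}
TRANS = {"fishing": {"fishing": 0.8, "traveling": 0.2},
         "traveling": {"fishing": 0.2, "traveling": 0.8}}
# Emission probabilities over discretized speed bins.
EMIT = {"fishing": {"slow": 0.9, "fast": 0.1},
        "traveling": {"slow": 0.2, "fast": 0.8}}

def viterbi(obs):
    """Most likely activity sequence for a sequence of speed bins."""
    v = [{s: math.log(START[s]) + math.log(EMIT[s][obs[0]]) for s in STATES}]
    back = []
    for o in obs[1:]:
        col, ptr = {}, {}
        for s in STATES:
            best = max(STATES, key=lambda p: v[-1][p] + math.log(TRANS[p][s]))
            col[s] = v[-1][best] + math.log(TRANS[best][s]) + math.log(EMIT[s][o])
            ptr[s] = best
        v.append(col)
        back.append(ptr)
    state = max(STATES, key=lambda s: v[-1][s])
    path = [state]
    for ptr in reversed(back):  # backtrack through the pointers
        state = ptr[state]
        path.append(state)
    return path[::-1]
```

The sticky transition probabilities (0.8 self-transition) smooth out isolated speed blips, which is what lets the decoder separate sustained fishing phases from transit.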
Keywords: |
Vessel Monitoring System (VMS), Hidden Markov Models (HMM), Naive Bayes,
Gradient Boosting Machine (GBM), Support Vector Machine (SVM), fishing trip
behavior, machine learning, fisheries management, environmental factors, marine
resources management. |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
ENHANCING PATIENT MONITORING IN WIRELESS BODY AREA NETWORK THROUGH
SMA-INTEGRATED CONVOLUTIONAL NEURAL NETWORK |
Author: |
MD. TASLIM AREFINN, MD ABUL KALAM AZAD |
Abstract: |
Health maintenance technologies are sophisticated surveillance systems that
keep tabs on the body's vital functions. The goal of this work was to design
and construct wireless body sensor networks for real-time performance
assessment. WBANs need to analyze massive volumes of data in order to make
practical judgments during emergencies. To overcome these problems, this study
presents a deep learning structure for evaluating health consequences trained
with the Slime Mould Algorithm (SMA). First, information from patients' medical
records is gathered by the WBAN in order to produce specific measurements for
the assessment. WBAN modules communicate with the destination node by sending
information based on the collected indicators. In this scenario, the optimal
cluster head is determined using the Fruit Fly technique. The Fruit Fly
procedure's results are then sent to the destination component, where the
Convolutional Neural Network (CNN) classifies the medical data in order to
assess risk. Here, the CNN is trained using the recommended SMA. With scores of
94.604%, 0.145, and 0.058 for accuracy, power, and productivity, respectively,
the suggested SMA outperforms the other methods. |
Keywords: |
Patient Monitoring System, Convolutional Neural Network, Slime-Mould Approach,
Wireless Body Area Network, Fruit-Fly |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
ADVANCING CROP YIELD PREDICTION THROUGH MACHINE AND DEEP LEARNING FOR NEXT-GEN
FARMING |
Author: |
UDAYA KUMAR ADDANKI , TEJASWI MADDINENI , VIJAY DHAWALE , M.L.M.PRASAD , DESIDI
NARSIMHA REDDY , JEEVAN JALA |
Abstract: |
Agriculture contributes 15-18% of India's GDP. However, Indian agriculture
faces persistent challenges threatening its
long-term stability, including soil degradation, pest management issues, and
fluctuating crop prices. These challenges create significant uncertainty in crop
yields. To address this, we propose data-driven solutions using machine learning
and deep learning models to improve the accuracy of crop yield predictions.
Machine learning models, such as decision trees, random forests, gradient
boosting, and ensemble techniques like XGBoost, along with deep learning models
like convolutional neural networks (CNNs) and long short-term memory (LSTM)
networks, provide reliable predictions and precise forecasting, enabling farmers
to achieve more stable and optimized yields. Beyond economic benefits, accurate
crop prediction also enhances food security and strengthens rural economies. By
advancing precision in agricultural forecasting, these methods can help tackle
longstanding agricultural issues, contributing to economic growth, increased
profits, and a stable food supply for India. Embracing data-driven approaches is
essential to addressing the evolving challenges of the nation's agricultural
sector. |
Keywords: |
Deep learning, Machine learning, Agriculture, LSTM, CNN, Random Forest, Decision
tree, Gradient Boosting, XGBoost |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
CUSTOMER SEGMENTATION IN THE ONLINE RETAIL INDUSTRY USING BIG DATA ANALYTICS |
Author: |
RONALD S. CORDOVA |
Abstract: |
Nowadays, customer data is abundant due to the growth of the online retail
industry. It allows effective customer segmentation using big data analytics.
This study examines how big data analytics segments online customers, using
segmentation algorithms and data processing to optimise marketing, sales, and
customer experiences. Different customer segments can be identified by traits
and behaviours. This research examines customer segmentation theory in the
online retail industry. Data collection and preprocessing methods are discussed
to ensure data quality and segmentation relevance. This study will also
demonstrate how customer segmentation strategies can be used to drive digital
marketing and sales campaigns and improve customer experiences using big data
analytics.
Furthermore, this research will demonstrate big data analysis
output using simulations. It will illustrate how big data analytics can segment
customers. This will help online retailers tailor their marketing and sales to
each cluster segment's preferences and behaviour. Results will emphasise data
quality and algorithm choice. This research will conclude with insights on how
online retailers can improve customer satisfaction and business performance
through customer segmentation and big data analytics. The study found better
segmentation methods that allow online retailers to use big data analytics to
segment their customers more precisely. Customer insights are better with
behavioral, demographic, and real-time data. This research shows that big data
analytics can transform online retail by allowing businesses to switch from
static, one-size-fits-all segmentation models to dynamic, data-driven
approaches that better meet digital consumers' needs. |
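Clustering algorithms of the kind the keywords name are usually k-means over per-customer feature vectors (e.g. recency/frequency/monetary values). A minimal sketch with fixed initial centroids so the run is deterministic — the points below are made-up two-feature customers, not data from the study:

```python
def kmeans(points, centroids, iters=20):
    """Plain k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its cluster."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[i])))
            clusters[i].append(p)
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical customers: (spend score, visit frequency).
customers = [(1.0, 1.0), (1.2, 0.8), (5.0, 5.0), (5.2, 4.8)]
cents, clusters = kmeans(customers, [(0.0, 0.0), (6.0, 6.0)])
```

Each resulting cluster would then receive its own marketing treatment, which is the segmentation-to-campaign step the abstract describes.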
Keywords: |
Customer Segmentation, Big Data Analytics, Online Retail Industry, Machine
Learning, Clustering Algorithms |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
MACHINE AND DEEP LEARNING MODELS FOR MULTI-CLASS SENTIMENT CLASSIFICATION |
Author: |
LAMIAA A. GAAFAR, ATEF Z. GHALWASH, ALIAA A. YOUSSIF, HAITHAM A. GHALWASH |
Abstract: |
Nowadays, Artificial Intelligence (AI) is renowned for embedding human-like
intelligence in computers, enabling them to mimic human behavior. A pivotal
domain within AI is recommendation frameworks, which aid users by suggesting
various choices, thereby facilitating optimal decision-making in contexts like
purchasing items, selecting healthcare services, movies, etc. This paper
introduces classification based on sentiment analysis, aimed at extracting
opinions from user reviews. The analysis employs eight models: five machine
learning models— Extreme Gradient Boosting (XGBoost), Naïve Bayes (NB), Logistic
Regression (LR), Support Vector Machine (SVM), and Random Forest (RF); two deep
learning models—Long Short-Term Memory (LSTM) and a distilled version of
Bidirectional Encoder Representations from Transformers (DistilBERT) transformer
model; and a proposed model integrating Convolutional Neural Networks (CNNs) and
Feedforward Neural Networks (FFNNs), alongside the Mamdani Fuzzy System.
Notably, the LSTM model demonstrates superior performance, especially attributed
to its efficacy in processing shorter sentences, typically ranging from 15 to 20
words as in the used data set, thus slightly outperforming the DistilBERT
transformer model in this context. A comparative analysis between 3-class
(positive, negative, and neutral) and 4-class (strongly positive, positive,
negative, and neutral) classifications reveals LSTM's predominance across all
models. Notably, the Long Short-Term Memory (LSTM) model excels in the 3-class
sentiment classification, achieving an accuracy of 0.99, precision of 0.99,
recall of 0.99, and an F1 score of 0.99 after oversampling. |
Keywords: |
Natural Language Processing, Sentiment, Classification, Deep Learning,
Oversampling |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
IMPROVING SECURITY IN INTELLIGENT SYSTEMS: HOW EFFECTIVE ARE MACHINE LEARNING
MODELS WITH TF-IDF VECTORIZATION FOR PASSWORD-BASED USER CLASSIFICATION |
Author: |
BOUMEDYEN SHANNAQ |
Abstract: |
This research assesses the practicability of machine learning models in
classifying consumers according to their passwords with the help of TF-IDF,
which depicts exclusive password features. The purpose of the study is to
address the weakness of the current EPSB algorithm in its synthesis of
electronic personal behavior. Our goal is to identify models that strengthen
the existing methods of password-based authentication. In the second step, we
converted a data set of anonymized passwords into statistical feature vectors
using TF-IDF and tested six machine learning models. The well-known algorithms
used in the course of the study were support vector machines (SVM), random
forests, Naïve Bayes, K-nearest neighbor (KNN), logistic regression, and
decision trees. Cross-validation showed that Naive Bayes outperformed all the
other models with the highest weighted average precision of 96.38%, ahead of
the SVM model at 91.64% and the logistic regression model at 91.52%. In terms
of accuracy, KNN achieved 79.48% and the Decision Tree 77.55%, while Random
Forest recorded the lowest value at 71.26%. These results
provide a deeper understanding for the development of an extended
password-based authentication scheme using an advanced machine learning
approach. |
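TF-IDF over passwords requires some tokenization choice; character n-grams are a common one for short strings. A self-contained sketch — the use of trigrams, and the sample passwords, are illustrative assumptions, since the abstract does not specify the tokenization:

```python
import math
from collections import Counter

def ngrams(password, n=3):
    """Character n-grams of a password string."""
    return [password[i:i + n] for i in range(len(password) - n + 1)]

def tf_idf(corpus, n=3):
    """TF-IDF vectors over character n-grams: term frequency within each
    password, weighted by log inverse document frequency over the corpus."""
    docs = [Counter(ngrams(p, n)) for p in corpus]
    df = Counter(g for d in docs for g in d)  # document frequency per n-gram
    N = len(corpus)
    vectors = []
    for d in docs:
        total = sum(d.values())
        vectors.append({g: (c / total) * math.log(N / df[g])
                        for g, c in d.items()})
    return vectors
```

N-grams shared by every password get zero weight (idf = log 1 = 0), so the vectors emphasize exactly the "exclusive password features" the abstract mentions; the resulting vectors would then feed the six classifiers.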
Keywords: |
Password Classification, Machine Learning, TF-IDF Vectorization, Support Vector
Machine, Random Forest, Naive Bayes, K-Nearest Neighbors, Logistic Regression,
Decision Tree |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
SIMPLE GENETIC ALGORITHM BASED RANDOM TESTING FOR REDUCING FAULTY TEST CASES |
Author: |
D. SREE LAKSHMI, P. BHARAT SIVA VARMA, K. ASHOK KUMAR, DR.Y. SUMANTH, JANJHYAM
VENKATA NAGA RAMESH, G. N. SOWJANYA |
Abstract: |
The goal of software testing is to identify software flaws; it is the process
of confirming that a program works as intended. In random testing, test inputs
are generated at random from the software's input space, which can repeatedly
produce test cases that are random yet largely similar to one another. To get
around these problems, we provide a method for minimizing errors based on the
best test cases produced by directed random testing. Using the random testing
model as a basis, we create effective random test cases in the suggested
approach. The Simple Genetic Algorithm (SGA) is used in this study to create
the best inputs, minimizing both equivalent and invalid inputs. SGA makes use
of test case coverage metrics to lessen fault proneness. By merging the old
input with the present one, our suggested approach reduces the input space and
improves scalability and efficacy in the software testing stage. |
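A simple genetic algorithm driven by a coverage metric, as the abstract describes, can be sketched against a toy program under test. Everything here is hypothetical — the branch structure, population sizes, and mutation scheme are illustrative, not the paper's configuration:

```python
import random

def branches_hit(x):
    """Toy program under test: returns the set of branch ids an input covers."""
    hit = set()
    if x < 0:
        hit.add("neg")
    else:
        hit.add("nonneg")
    if x % 2 == 0:
        hit.add("even")
    if x > 100:
        hit.add("big")
    return hit

def sga(pop_size=20, gens=30, seed=0):
    """Select-and-mutate SGA maximizing a branch-coverage fitness."""
    rng = random.Random(seed)
    pop = [rng.randint(-10, 10) for _ in range(pop_size)]
    fitness = lambda x: len(branches_hit(x))  # coverage-based fitness
    for _ in range(gens):
        parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        pop = parents + [p + rng.randint(-50, 50) for p in parents]  # mutate
    return max(pop, key=fitness)
```

Inputs that cover only already-covered branches score no higher than their parents, so selection implicitly discards the "equivalent" inputs the abstract wants to minimize.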
Keywords: |
SGA, Software testing, Coverage, RT |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
DO TECHNOLOGY-RELATED INVESTMENTS HAVE A REMARKABLE IMPACT ON THE SCALABILITY
AND PRODUCTIVITY OF THE FINANCIAL INDUSTRY? |
Author: |
ROCHANIA AYU YUNANDA, TOTO RUSMANTO, NURIL KUSUMAWARDANI SOEPRAPTO PUTRI,
MOHAMMAD ALI TAREQ |
Abstract: |
Digital transformation makes investing in information and communications
technology essential for all industries including the financial industry. In the
Indonesian banking sector as part of the financial industry, technology-driven
banking services were found to increase both banking transactions and customer
numbers. This study investigates whether technology-related investments
significantly enhance the scalability and productivity of commercial banks in
Indonesia. This quantitative research employs a panel data regression of 80
observations during the period 2019-2023. The study found that the amount spent
on technology-related investment has an impact on scalability and productivity;
specifically, technology-related investment increases revenue growth and
productivity (sales per employee). The research
findings have both academic and practical impacts. The results could encourage
the bankers and decision makers to determine appropriate technological
investment strategies. Investors can also consider highly digitalized banks as
a valuable investment. This research extends the current literature on the
relationship between digitalization and banking performance, with a focus on
commercial banks in Indonesia. This study offers two other proxies to assess
banking performance: scalability and productivity, as technological advances aim
to enhance company growth and productivity. |
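A panel data regression with entity fixed effects — one common specification for bank-year panels like this one — can be sketched with dummy variables per bank. This is a generic fixed-effects illustration under assumed data, not the paper's actual model or estimates:

```python
import numpy as np

def fixed_effects_ols(y, x, entity_ids):
    """OLS of y on x with one intercept dummy per entity (bank),
    absorbing time-invariant bank characteristics. Returns the slope on x."""
    entities = sorted(set(entity_ids))
    dummies = np.array([[1.0 if e == ent else 0.0 for ent in entities]
                        for e in entity_ids])
    X = np.column_stack([np.asarray(x, dtype=float), dummies])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta[0]

# Hypothetical bank-year panel: tech spend (x) vs. revenue growth (y).
ids = ["A", "A", "A", "B", "B", "B"]
x = [0, 2, 4, 0, 2, 4]
y = [1, 2, 3, 2, 3, 4]  # constructed with slope 0.5 and bank intercepts 1, 2
slope = fixed_effects_ols(y, x, ids)
```

Because each bank gets its own intercept, the recovered slope measures the within-bank association between technology spending and the outcome.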
Keywords: |
Technology, Software, Banking, Productivity, Scalability, Indonesia |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
AN INNOVATIVE MACHINE LEARNING FRAMEWORK FOR PHONOCARDIOGRAPHY (PCG) USING MFCC
AND DEEP EXTREME LEARNING MACHINE (DELM) |
Author: |
ABDULLAH ALTAF, HAIRULNIZAM MAHDIN, AWAIS MAHMOOD, ABDULREHMAN ALTAF |
Abstract: |
Cardiovascular Diseases (CVDs) are a significant global cause of mortality,
necessitating effective diagnostic techniques. Phonocardiography (PCG) is among
the fundamental methods used to analyze heart sounds to detect human
heart-related abnormalities. However, in an environment where state-of-the-art
PCG equipment is not available, a Machine Learning (ML) based solution can serve
as a reliable alternative. However, the main challenges faced by ML-based PCG
systems, are the unavailability of balanced and unbiased datasets, the vanishing
and exploding gradient a well-known Deep Learning (DL) issue, and inappropriate
feature extraction techniques, which often compromise the accuracy and
reliability of ML-based PCG systems. This study introduces a novel Deep Extreme
Learning Machine (DELM) and Mel-Frequency Cepstral Coefficients (MFCC) based PCG
framework for CVD diagnosis. The proposed framework uniquely addresses the
above-mentioned challenges. The proposed model achieves a remarkable training
accuracy of 98.46% and a test accuracy of 86.80% on the Heartbeat Sound dataset
with five classes; after class aggregation and dataset normalization, it
achieves a training accuracy of 99.52% and a test accuracy of 92.30%,
demonstrating its potential in PCG diagnostics. This framework represents a
significant advancement in ML-based PCG systems for automating heart sound
analysis and contributing to improved cardiac healthcare, especially in
resource-limited settings. |
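The Extreme Learning Machine at the core of a DELM has a distinctive training rule: hidden weights stay random and fixed, and only the output weights are solved in closed form by least squares. A single-hidden-layer sketch (the DELM stacks such layers, and the real inputs would be MFCC vectors of heart-sound frames, omitted here; dimensions and the toy target are assumptions):

```python
import numpy as np

class ELM:
    """Single-hidden-layer Extreme Learning Machine: random fixed hidden
    weights, output weights solved via the pseudo-inverse."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(n_in, n_hidden))  # never trained
        self.b = rng.normal(size=n_hidden)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, Y):
        # Closed-form least squares: beta = pinv(H) @ Y, no gradient descent.
        self.beta = np.linalg.pinv(self._hidden(X)) @ Y
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta
```

Because no gradients flow through the hidden layer, the vanishing/exploding-gradient issue the abstract highlights simply does not arise in this training scheme.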
Keywords: |
Cardiovascular Diseases (CVDs), MFCC, Machine Learning, Deep Extreme Learning
Machine (DELM), Heart Disease |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
PROGRESS TRANSFORMER ON ALZHEIMER’S DISEASE PROBABILITY FINDER FROM MILD
COGNITIVE IMPAIRMENT |
Author: |
SARANYA RATHINAM, KRISHNAN NALLAPERUMAL, KALIDASS SUBRAMANIAM |
Abstract: |
The prodromal phase of Alzheimer's disease (AD) is called mild cognitive
impairment (MCI). Effective treatments depend on identifying MCI patients who
have a high chance of converting to AD. This paper proposes a temporal magnetic
resonance imaging (MRI) slice feature analysis using transformers to predict
the chance of AD. The proposed Progress Transformer (ProgTransAD) model finds
the relative changes in the MRI slices by encoding the convolutional backbone
feature maps and their corresponding cosine similarity. The proposed deep
learning approach forecasts whether someone will develop AD after receiving a
diagnosis of MCI, based on three years of analysis.
The performance of this unique deep learning network which can accurately
diagnose AD progression is analyzed using the Alzheimer's Disease Neuroimaging
Initiative (ADNI-1) dataset and this ProgTransAD achieves 94% accuracy one year
ahead. |
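The cosine similarity between backbone features of adjacent MRI slices, which the abstract uses to encode relative change, is straightforward to sketch. The feature vectors below are tiny placeholders; real inputs would be flattened convolutional feature maps:

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def slice_change_profile(feature_maps):
    """Cosine similarity between consecutive slice features; lower values
    mark slices where the encoded anatomy changes more."""
    return [cosine(feature_maps[i], feature_maps[i + 1])
            for i in range(len(feature_maps) - 1)]
```

In the ProgTransAD pipeline a profile like this would accompany the encoded feature maps into the transformer, letting it attend to where inter-slice change concentrates.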
Keywords: |
Alzheimer's Disease, Mild Cognitive Impairment, Neuroimaging, Magnetic
Resonance Imaging, ProgTransAD |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|
Title: |
UNPRECEDENTED HARMONY SEARCH OPTIMIZATION-BASED LEACH ROUTING PROTOCOL
(UHSO-LRP) FOR ENHANCING WIRELESS BODY AREA NETWORKING (WBAN) LIFETIME |
Author: |
S.VEERARATHINAKUMAR , Dr. B.DEVANATHAN |
Abstract: |
Wireless Body Area Networks (WBANs) play a crucial role in healthcare
applications by enabling continuous and non-invasive monitoring of patients'
vital signs and health data through wearable devices. These networks transmit
real-time data from wearable sensors to a central monitoring system, enabling
remote patient monitoring and timely medical intervention. Routing in WBANs is a
critical aspect as it involves the selection of optimal paths for data
transmission among various wearable devices and the central node. Traditional
routing protocols face challenges in WBANs due to the characteristics of the
human body, such as dynamic channel conditions, varying distances between
devices, and energy constraints of wearable devices. Conventional routing
protocols may not efficiently handle these issues, leading to suboptimal
performance, increased energy consumption, and limited network lifetime. The
proposed work, Unprecedented Harmony Search Optimization-Based LEACH Routing
Protocol (UHSO-LRP), aims to address the issues faced in routing within WBANs.
It introduces a hybrid approach that combines the Harmony Search Optimization
(HSO) algorithm with the Low-Energy Adaptive Clustering Hierarchy (LEACH)
routing protocol. The working mechanism of UHSO-LRP involves the utilization of
the HSO algorithm to optimize the selection of cluster heads and the routing of
data packets. The HSO algorithm introduces self-optimizing capabilities,
allowing the network to dynamically adjust its configuration to changing
conditions. UHSO-LRP effectively manages energy consumption and prolongs the
network lifetime by optimizing the cluster head selection and data routing.
Simulations are conducted to evaluate the performance of UHSO-LRP compared to
conventional routing protocols. The simulations analyze key performance metrics,
such as network lifetime, energy efficiency, packet delivery ratio, and latency.
The results demonstrate that UHSO-LRP outperforms traditional routing protocols,
showcasing significant network stability and energy utilization improvements. |
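Harmony Search, the optimizer UHSO-LRP combines with LEACH, improvises new candidate solutions from a harmony memory under two rates: HMCR (memory consideration) and PAR (pitch adjustment). A generic minimization sketch — the objective, bounds, and rates are illustrative; in UHSO-LRP the objective would score cluster-head choices by energy and latency:

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, iters=200, seed=0):
    """Minimize f over box bounds. Each new harmony draws every dimension
    either from harmony memory (prob. HMCR, with pitch-adjustment prob. PAR)
    or uniformly at random; it replaces the worst member if it is better."""
    rng = random.Random(seed)
    dim = len(bounds)
    hm = [[rng.uniform(*bounds[d]) for d in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            lo, hi = bounds[d]
            if rng.random() < hmcr:
                x = rng.choice(hm)[d]            # memory consideration
                if rng.random() < par:
                    x += rng.uniform(-1, 1) * 0.05 * (hi - lo)  # pitch adjust
            else:
                x = rng.uniform(lo, hi)          # fresh random note
            new.append(min(hi, max(lo, x)))
        worst = max(range(hms), key=lambda i: f(hm[i]))
        if f(new) < f(hm[worst]):
            hm[worst] = new
    return min(hm, key=f)
```

The worst-replacement rule guarantees the best solution in memory never degrades, which is the monotone-improvement property the routing protocol relies on when reselecting cluster heads.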
Keywords: |
Wireless Body Area Networks (WBANs), Routing Protocol, Harmony Search
Optimization (HSO), Low Energy Adaptive Clustering Hierarchy (LEACH), Wireless
Sensor Networks (WSN), Healthcare Monitoring |
Source: |
Journal of Theoretical and Applied Information Technology
30th November 2024 -- Vol. 102. No. 22-- 2024 |
Full
Text |
|