|
Submit Paper / Call for Papers
The journal receives papers in continuous flow and considers articles
from a wide range of Information Technology disciplines, from the most
basic research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF, or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please include your personally identifiable information in the manuscript when
submitting it for review, and we will remove the necessary information on our
side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
|
|
|
Journal of
Theoretical and Applied Information Technology
December 2024 | Vol. 102 No. 24 |
Title: |
DEVELOPMENT OF ONTOLOGY FOR RUBRIC ASSESSMENT USING METHONTOLOGY |
Author: |
NUR FADILA AKMA MAMAT, NOOR MAIZURA MOHAMAD NOOR, ROSMAYATI MOHEMAD, NOOR AZLIZA
CHE MAT, DADABAEV SARDORBEK, FOZILJANOVA MARXABO, KANEEKA VIDANAGE |
Abstract: |
In a variety of fields, including management, medicine, business, education, and
others, sophisticated technology that facilitates access to relevant data is
essential for supporting decision-making and resolving challenging issues. The
absence of explicit knowledge representation and data modelling through
standards such as RDF and OWL continues to plague the rubric evaluation sector and
hinders effective sharing between expert and general users. To properly
characterize the important aspects and norms, covering the characteristics of
rubrics and knowledge of psychomotor skill levels, this article presents an
ontology called the Psychomotor Learning Domain (PLD) ontology. The goal is to
create and develop an ontology model for rubric assessment. There are
several ways to construct ontologies; a well-defined and organized methodology
can shorten the time needed to construct one, increasing the likelihood
that the project will succeed. METHONTOLOGY is used in the specification,
conceptualization, formalization, implementation, and maintenance phases of the
ontology building process. Adherence to relevant norms and laws was established
throughout ontology creation. This paper addresses only conceptualization, the
process of organizing knowledge for ontology implementation. The methodology
includes the conceptualization of the tasks in the task set for knowledge
structuring, which makes ontologies buildable at the knowledge level. The
developed ontology is meant to serve as a domain knowledge base for further
programs, including expert systems. |
Keywords: |
Rubric Assessment, Ontology, Higher Education, Psychomotor Learning Domain |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
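As a rough illustration of the triple-based modelling the abstract above describes, the sketch below represents a fragment of a rubric-assessment ontology as RDF-style (subject, predicate, object) triples in plain Python. The class and property names are invented for illustration and are not taken from the PLD ontology itself.

```python
# Hypothetical fragment of a rubric-assessment ontology expressed as
# RDF-style triples; names are illustrative, not the paper's schema.
triples = {
    ("Rubric", "rdf:type", "owl:Class"),
    ("Criterion", "rdf:type", "owl:Class"),
    ("PsychomotorLevel", "rdf:type", "owl:Class"),
    ("hasCriterion", "rdfs:domain", "Rubric"),
    ("hasCriterion", "rdfs:range", "Criterion"),
    ("Imitation", "rdf:type", "PsychomotorLevel"),
    ("Precision", "rdf:type", "PsychomotorLevel"),
}

def instances_of(cls):
    """Return all subjects declared as instances of the given class."""
    return sorted(s for (s, p, o) in triples
                  if p == "rdf:type" and o == cls)

print(instances_of("PsychomotorLevel"))  # ['Imitation', 'Precision']
```

In a real implementation the same triples would be serialized in RDF/OWL so that generic reasoners and expert systems can query them.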
|
Title: |
MAKING THE VISUALLY IMPAIRED FEEL IMAGES USING DLEVIS (DEEP LEARNING EYE FOR
VISUALLY IMPAIRMENTS) |
Author: |
S. MURUGESAN, DR. N. BALAJIRAJA |
Abstract: |
Worldwide, 25% of the 285 million people who suffer from visual impairments
(VIs) are totally blind. People with VIs are also deprived of the experience of
sight: even through visuals, they are unable to appreciate nature, green spaces,
or their surroundings. Furthermore, social media and recent technological
advancements have become the primary means of bringing people together on a
worldwide scale. Mobile social media affects the lives of ordinary people, and
their usage patterns are the subject of in-depth research. Users with VIs,
however, are a special user group that is frequently disregarded, and few
studies have investigated how they interact with popular mobile social media
today. By proposing a method that interprets images and conveys the information
to individuals with VIs, this study aims to close these gaps and enable their
inclusion as a unique user group on social media. The proposed method, which is
based on deep learning techniques (DLTs), is dubbed DLEVIS (Deep Learning Eye
for Visually Impairments). The accuracy of the scheme is also assessed in the
study, and the findings show relatively low classification error. |
Keywords: |
Deep Learning, Visually Impaired, Mobile Social media, Captioning Images,
Social Interactions, Social Networking Sites |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
THE USE OF BIG DATA IN THE DETECTION OF ECONOMIC CRIMES IN PUBLIC PROCUREMENTS |
Author: |
HASSAN ALI AL-ABABNEH, YAROSLAV FEDORCHUK, ANDRIY TYMCHYSHYN, GENNADY
PISHCHENKO, SERHII HRYTSAI |
Abstract: |
The article deals with the role of digital transparency tools in the fight
against corruption, especially in public administration. The research aimed to
assess how such technologies as blockchain, open government data, e-procurement,
and e-government services increase transparency and accountability while
reducing corruption risks. The study fills a gap in the literature by proposing
a new approach to combating corruption through digital solutions. The study
covers 10,000 procurement contracts from the European Union (EU), using modern
methods such as Big Data (BD) analytics, statistical analysis, and simulation modelling.
These approaches have significantly improved the detection of fraud-related
anomalies compared to traditional methods based on manual checks and limited
data. The study compares the effectiveness of digital transparency initiatives
in several countries, revealing the conditions under which these tools are most
effective. In addition, challenges such as insufficient digital literacy and
barriers to implementation are discussed. The results showed that digital
transparency helps to reduce corruption by improving access to information and
simplifying processes, but its success depends on political and economic
infrastructure. The article provides recommendations for policymakers on the
effective use of digital tools in different governance systems, significantly
contributing to understanding the interaction of technology and anti-corruption
reforms. |
Keywords: |
Economic Crimes, Public Procurement, Fraud, Big Data, Network Analysis |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
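The abstract above does not specify the anomaly-detection rule, but a minimal statistical check of the kind such analytics pipelines build on can be sketched: flag contracts whose price deviates from the mean by more than k standard deviations. The contract values and threshold here are invented for illustration.

```python
import statistics

def flag_anomalies(values, k=2.0):
    """Return indices of values more than k population std devs from the mean."""
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    return [i for i, v in enumerate(values) if abs(v - mean) > k * sd]

contracts = [100, 102, 98, 101, 99, 300]  # one inflated contract
print(flag_anomalies(contracts))
```

Real Big Data pipelines apply far richer features (supplier networks, bid timing), but the flagging logic is the same idea at scale.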
|
Title: |
IMPROVED POWER OPTIMIZATION TECHNIQUES IN MOBILE AD HOC NETWORK |
Author: |
S. HEMALATHA, TAVANAM VENKATA RAO, G.S. N MURTHY, KHADRI SYED FAIZZ AHMAD, P.
SUPRIYA, DR. CHARANJEET SINGH |
Abstract: |
Mobile Ad hoc Networks (MANETs) are dynamic, self-organizing networks vital for
communication in scenarios without established infrastructure, such as disaster
recovery and military operations. However, MANETs face critical challenges,
including power management, collision issues, and routing inefficiencies caused
by hidden and exposed terminal problems. This study proposes an integrated
solution combining Beam Sector Directional Antennas with a Mutual Exclusion
Medium Access Control (ME-MAC) protocol. The proposed system optimizes power
consumption and improves network performance through precise beam-based packet
transmission and collision avoidance mechanisms. Simulations conducted using
NS2.32 reveal significant improvements over traditional Omni-Directional
Antennas. Specifically, signal power consumption is reduced by 50%, collision
rates by 25%, energy consumption by 30%, and end-to-end delay by 45-50%.
Furthermore, the PMBS-Antenna achieves 15% higher transmission speed, 10% better
efficiency, and a 40-50% increase in packet delivery ratio. These findings
demonstrate the potential of the proposed approach to address longstanding MANET
challenges, paving the way for more efficient and reliable communication
systems. |
Keywords: |
MANET, Antenna, Physical Layer, MAC Layer, Hidden and Exposed Terminal (HET)
Problem, ME-MAC Protocol |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
ENHANCING CLARITY IN CHINESE SOFTWARE REQUIREMENTS: A BOILERPLATE-BASED APPROACH
TO IMPROVE REQUIREMENT EXPRESSION |
Author: |
HE JIAYING, MOHD HAFEEZ OSMAN, SA'ADAH HASSAN, NG KENG YAP |
Abstract: |
During the rapid process of software development, requirements engineering plays
a crucial role in ensuring the success of software projects. Clear and accurate
requirements specification directly impacts the effectiveness of project
implementation, influences customer satisfaction, and ultimately determines the
overall success rate of the project. However, ambiguity and structural issues
are common in Chinese software requirements, which can hinder understanding and
implementation. These issues not only increase the complexity of project
execution but may also lead to extended development cycles and wasted resources.
To address these challenges, this paper aims to develop a set of Chinese-based
software requirement boilerplates to improve the clarity and accuracy of
requirements specification. The boilerplates are designed with consideration for
Chinese grammar conventions and international standards, offering a universal
and user-friendly solution. This paper reviews related literature, discusses the
limitations of current requirements specification methods, and details the
boilerplate design process and evaluation methodology. The effectiveness of the
boilerplate is demonstrated through the reconstruction of requirements
statements across various software industries, complemented by expert
assessments and feedback from industry professionals. This study provides a
practical foundation for Chinese software requirements specification, aiming to
enhance the quality and efficiency of requirements expression. |
Keywords: |
Software Requirements Engineering, Chinese Requirements Expression, Boilerplate
Design, Clarity, Requirements Normalization |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
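The boilerplate idea the abstract above describes can be sketched as a fixed syntactic template with typed slots that constrains how a requirement may be phrased. The template and slot names below are invented English examples, not the paper's Chinese boilerplate.

```python
# Illustrative requirement boilerplate: a template whose slots must all
# be filled before a requirement statement counts as complete.
TEMPLATE = "The {system} shall {action} {object} within {bound}."

def build_requirement(**slots):
    missing = [k for k in ("system", "action", "object", "bound")
               if k not in slots]
    if missing:  # an unfilled slot signals an incomplete requirement
        raise ValueError(f"unfilled slots: {missing}")
    return TEMPLATE.format(**slots)

req = build_requirement(system="payment service",
                        action="confirm", object="each order",
                        bound="2 seconds")
print(req)
```

Forcing authors through such slots is what removes the ambiguity that free-form requirement sentences allow.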
|
Title: |
DEEP LEARNING-BASED CONGESTION CONTROL IN VLSI FOR PLACEMENT AND ROUTING |
Author: |
SHAIK ASIF HUSSAIN, SHAIK KARIMULLAH, FAHIMUDDIN SHAIK, SYED JAVEED BASHA |
Abstract: |
This work presents the application of a recurrent neural network based on deep
learning to evaluate design constraints for the routing and placement flow, with
the assistance of an existing floorplan approach, to estimate congestion.
Effective area utilization is essential in very large-scale integration (VLSI)
circuit design, wherein congestion reduction is also associated with improving
floorplan, placement, and routing. This improvement significantly helps a
circuit’s compact design and performance. Congestion is a fundamental issue in
VLSI for estimating the density of the area underlying routing among various
computational blocks. We used the deep learning technique to simulate the
standard benchmark circuit’s planned area for better placement. Prior approaches
estimated congestion values for standard architectures, whereas this work
considered the floorplan and placement outputs of a standard MCNC benchmark
circuit using a recurrent neural network-based algorithm (RNN-based algorithm),
which yielded better results for placement and routing of VLSI circuits, and
simulated it to estimate the congestion for the circuit design. A recurrent
neural network is an artificial neural network that operates on sequential or
time-series data. Deep learning algorithms are frequently used to handle
problems that involve order or time, such as image captioning, language
translation, and speech recognition. Learning occurs through the training data
in RNNs, just as it does in feed-forward networks and convolutional neural
networks (CNNs). What sets RNNs apart is that information "remembered" from
earlier inputs can be used to modify the processing of the current input and
output. Conventional deep neural networks assume that inputs and outputs are
independent of one another; the output of a recurrent neural network, by
contrast, is influenced by the information the network has processed before it.
Although the occurrence of future events might help determine the outcome of a
particular sequence, unidirectional recurrent neural networks cannot consider
these occurrences when generating predictions. |
Keywords: |
Congestion, Placement, RNN-based algorithm, VLSI, Partial blockage technique. |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
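The recurrence the abstract above describes, where the hidden state carries information from earlier inputs forward, can be shown with a minimal single-unit Elman-style cell. The weights are illustrative constants, not a trained congestion model.

```python
import math

def rnn_step(x, h_prev, w_x=0.5, w_h=0.8, b=0.0):
    """One recurrence step: h_t = tanh(w_x*x_t + w_h*h_{t-1} + b)."""
    return math.tanh(w_x * x + w_h * h_prev + b)

h = 0.0
for x in [1.0, 0.0, 0.0]:  # only the first input is non-zero...
    h = rnn_step(x, h)
print(h)  # ...yet h is still non-zero: the state "remembers" the spike
```

A feed-forward network given the final zero input alone would output the same value regardless of history; here the earlier input still influences the output, which is the property the paper exploits for sequential placement data.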
|
Title: |
EMPIRICAL SOUNDINGS TO FACIAL COUNTENANCE RECOGNITION USING CNN |
Author: |
DR.S. SAI KUMAR, DR.SURYA PRASADA RAO BORRA, DR.YASWANTH KUMAR ALAPATI, P UDAYA
BHANU, DR.K. RAMESH CHANDRA, BAKKALA SANTHA KUMAR |
Abstract: |
There are many uses for these devices, and the number of uses is growing every
day. Machine perception will assist in carrying out a variety of activities,
including intricate ones: it enables machines to comprehend both their physical
surroundings and a conversation partner's intentions. In this study, we
classified images into categories such as happiness, sadness, anger, amazement,
dislike, and anxiety using deep learning techniques such as convolutional neural
networks. This method is used because CNNs produce better results than other
statistical techniques. Using a CNN requires feature learning, which is a
crucial task. Additionally, the network was assessed using two corpora: one was
used for network training and the other for defining the structure of the
network. The network that produced the best accuracy was then evaluated on the
second dataset. When tested on a distinct dataset of facial emotions, the
network reported favourable outcomes even though it had been trained on the
other corpus. Although the results showed that the network was not
state-of-the-art, the evidence suggests that deep learning is appropriate for
categorizing facial emotion expressions. As a result, deep learning can enhance
human-machine interaction, since its learning capabilities will enable machines
to perceive more. |
Keywords: |
CNN, Open CV, Facial Expression. |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
LSTM ADAPTIVE HYPERPARAMETER TUNING FOR FINANCIAL TIME SERIES FORECASTING USING
CUSTOM GRADIENT-BASED METHODS: A COMPARATIVE STUDY WITH BAYESIAN OPTIMIZATION |
Author: |
ADNANE EL OUARDI, BRAHIM ER-RAHA, KHALID TATANE |
Abstract: |
This paper presents an adaptive hyperparameter tuning system for Long Short-Term
Memory (LSTM) models, focusing on a gradient-based approach to achieve efficient
and precise optimization. The proposed system is designed to dynamically adjust
critical hyperparameters such as the learning rate, number of units, dropout
rate, and batch size during the training process. By leveraging gradient
information, the system iteratively refines the hyperparameter values, aiming to
minimize the Mean Squared Error (MSE) and enhance the model's predictive
accuracy. To evaluate the effectiveness of the gradient-based approach, the
study includes Bayesian optimization as a benchmark, a method widely recognized
for its probabilistic framework and established success in hyperparameter
tuning. Comparative analysis is conducted in terms of loss metrics, including
MSE and additional performance indicators, as well as execution time,
highlighting the trade-offs between optimization efficiency and computational
cost. The results demonstrate the gradient-based system's ability to adapt to
high-dimensional and complex hyperparameter spaces with reduced computational
overhead, while consistently achieving superior performance. The experiment is
applied to the challenging task of forecasting the S&P 500 index, a real-world
financial time series problem that demands robustness and precision. |
Keywords: |
Hyperparameter Optimization, Gradient-Based Tuning, LSTM Networks, Bayesian
Optimization, Time Series Forecasting |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
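The core idea of the gradient-based tuning the abstract above describes, estimating how the validation loss changes with a hyperparameter and stepping it downhill, can be sketched with a finite-difference update. The quadratic "loss" below stands in for the LSTM's validation MSE and is purely illustrative, not the paper's system.

```python
def val_loss(lr):
    """Toy surrogate for validation MSE as a function of the learning
    rate, minimized at lr = 0.01."""
    return (lr - 0.01) ** 2 + 0.001

def tune(lr=0.1, step=0.05, eps=1e-6, iters=200):
    """Finite-difference gradient descent on the hyperparameter itself."""
    for _ in range(iters):
        grad = (val_loss(lr + eps) - val_loss(lr - eps)) / (2 * eps)
        lr -= step * grad  # step the hyperparameter downhill
    return lr

print(round(tune(), 4))  # converges near the optimum 0.01
```

Bayesian optimization would instead fit a probabilistic model over `val_loss` and pick query points by expected improvement; the gradient route trades that global view for cheap local steps, which is the trade-off the paper measures.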
|
Title: |
CLASSIFICATION OF CORN LEAF DISEASES USING CNN: A DEEP LEARNING APPROACH |
Author: |
HANDRIZAL, FUZY YUSTIKA MANIK, VICTORY J SIANTURI |
Abstract: |
One of the major problems that needs to be addressed is the identification of
infection in corn leaves. The quality and production of crops can decrease due
to the presence of diseases on corn leaves. The process of diagnosing diseases
on corn leaves manually takes time and effort. Therefore, a more effective and
accurate technique is needed to detect the presence of diseases on corn leaves.
In this study, a Convolutional Neural Network (CNN) is used as a technique for
identifying diseases on corn leaves; a CNN is a machine learning model that can
identify characteristics in images. To train the CNN, a dataset of corn leaf
images labeled with the names of predetermined corn leaf diseases is used. The
accuracy rate that has been achieved is 95%, with a precision of around 95%. The
recall rate also reaches 95%, while the F1 score reaches 95%. This method of
identifying diseases on corn leaves using CNN can be used to improve the
efficiency and accuracy of disease identification on corn leaves. This method
can also be used to develop an early warning system for diseases on corn leaves. |
Keywords: |
Disease Identification, Corn Leaf, Convolutional Neural Network, Accuracy |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
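The metrics the abstract above reports (accuracy, precision, recall, F1) are computed from predicted versus true labels; a minimal sketch is below. The tiny label lists are made up for illustration and are not the paper's dataset.

```python
def scores(y_true, y_pred, positive):
    """Per-class precision, recall and F1 for one target class."""
    tp = sum(t == positive == p for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

y_true = ["blight", "rust", "healthy", "rust", "blight", "healthy"]
y_pred = ["blight", "rust", "healthy", "blight", "blight", "healthy"]
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(accuracy, scores(y_true, y_pred, "blight"))
```

Reporting all four, as the paper does, matters because accuracy alone hides which disease classes the model confuses.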
|
Title: |
AN ENERGY AND AREA EFFICIENT IOT ARCHITECTURE FOR BIO-MEDICAL APPLICATIONS |
Author: |
P V GOPIKUMAR, C RAVI SHANKAR REDDY, R MANIKANDAN, T V HYMALAKSHMI, G
VENKATESWARLU |
Abstract: |
Heart stroke and cardiac arrest are prominent diseases that threaten people's
lives. Abnormalities and myocardial infarctions lead to abrupt changes in the
functionality of the heart, so their meticulous detection is essential. The QRS
complex amplitude and R-wave amplitude are significant for detecting
abnormalities and sudden cardiac arrest. This paper focuses on developing an
architecture that meets the challenges of IoT-enabled wearable devices. An
absolute value curve length transform (A-CLT) is implemented to detect QRS
complex changes. The proposed methodology eliminates multipliers and performs
well with adders, shifters, and comparators, so the packing density (area) is
minimized. This improves the processing time and minimizes the dissipated
power. This paper addresses the complexity of early detection of strokes and
cardiac arrests by analyzing the QRS complex of the ECG signal. Baseline drift
and high-frequency interference (artifacts) affect signal acquisition; the
proposed methodology curtails those artifacts and improves the performance of
signal detection and interpretation. The area is miniaturized with the A-CLT
approach: area is reduced by 93.36 percent, power is minimized by 77.61 percent,
and computation delay is reduced by 79.64 percent. This paper also addresses the
sensitivity and predictivity of the QRS complex amplitude meticulously: 99.46
percent predictivity and 99.24 percent sensitivity are achieved with the
proposed methodology. The achieved results were validated with a physician; in
particular, lossless compression was achieved for the enhanced derivative of the
ECG signal with entropy encoding. The observed compression fraction is 2.05,
validated against the MIT-BIH database. The proposed methodology surpasses
existing methods, and the achieved results prove that this A-CLT-based
architecture is a good fit for wearable devices to guard against abrupt changes
in cardiac functionality and safeguard humans from sudden cardiac arrests. |
Keywords: |
Absolute Value Curve Length Transform (A-CLT), QRS Complex, Quadratic Spline
Wavelet Transform, Electrocardiography (ECG) |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
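One common form of the curve length transform the abstract above relies on sums |x[k] - x[k-1]| over a sliding window, so the steep slopes of a QRS complex produce large values using only additions and comparisons (no multipliers). The window length and toy signal below are illustrative assumptions, not the paper's parameters.

```python
def a_clt(signal, window=4):
    """Absolute-value curve length over a sliding window of samples."""
    diffs = [abs(b - a) for a, b in zip(signal, signal[1:])]
    return [sum(diffs[i:i + window]) for i in range(len(diffs) - window + 1)]

# Flat baseline with one sharp spike standing in for a QRS complex.
ecg = [0, 0, 0, 0, 5, -3, 0, 0, 0, 0]
clt = a_clt(ecg)
peak = max(range(len(clt)), key=clt.__getitem__)
print(clt, peak)  # the transform peaks where the spike occurs
```

Because the transform is built from subtractions and absolute values, a hardware version maps directly onto adders and comparators, which is what yields the area and power savings reported.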
|
Title: |
PROPOSAL OF ENHANCING WATER SAFETY- AN AUTONOMOUS ROBOT FOR DROWNING PREVENTION |
Author: |
S. HEMALATHA, DR. KIRAN MAYEE ADAVALA, S.N. CHANDRA SHEKHAR, PULLELA SVVSR
KUMAR, A.R. VENKATARAMANAN, D NAGA MALLESWARI |
Abstract: |
This paper presents the development of an advanced image recognition robot,
dubbed the "drowning robot," designed to identify individuals at risk of
drowning using state-of-the-art hardware and software. The primary aim of the
project is to enhance water safety. The system leverages powerful computing
platforms, such as the NVIDIA Jetson and Intel NUC, paired with high-resolution
cameras and specialized sensors to capture and process real-time video data. By
employing image processing technologies like OpenCV and deep learning models
such as YOLO (You Only Look Once), the robot can detect human figures and
unusual movements in aquatic environments. Operating autonomously, the robot
offers a reliable solution for emergency response scenarios and can connect to
cloud services for further verification. Key performance metrics, including
FLOPS, latency, and frames per second (FPS), are assessed to ensure optimal
processing speed for quick detection and action. This cutting-edge technology
represents a major advancement in both robotics and safety engineering, with the
potential to significantly improve rescue operations during drowning incidents
and provide timely alerts to enhance public safety. |
Keywords: |
Drowning Detection, Robotics, YOLO, OpenCV, Object Detection, Surveillance
Technology |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
FACTORS INFLUENCING CUSTOMER PURCHASE DECISIONS DURING LIVE STREAMING SHOPPING |
Author: |
MOH THAHA RIZIEQ HENTIHU, RIYANTO JAYADI |
Abstract: |
The purpose of this study is to identify the factors influencing customer
purchase decisions during live streaming shopping using a quantitative research
method. Data was collected through a questionnaire distributed to 441
respondents, of which 433 had previously shopped via live streaming. The results
indicated that 2 hypotheses were rejected while 10 were accepted. The rejection
of hypothesis H2 indicates that Interactivity does not have a significant impact
on Trust, and the rejection of hypothesis H4 shows that Visualization also does
not significantly impact Trust. Interactivity, Visualization,
Professionalization, System Quality, Information Quality, and Service Quality
influence the Social Presence of Live Streaming Shopping. Professionalization
impacts Trust. Both Social Presence of Live Streaming Shopping and Trust affect
Purchase Decision, which in turn influences Purchase Intention. |
Keywords: |
Live Streaming Shopping, Social Commerce, Purchase Decision, Consumer Behavior,
E-Commerce |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
CAN AI-DRIVEN FEATURES ALONE FOSTER CUSTOMERS’ TRUST IN INDONESIAN AND MALAYSIAN
BANKS? |
Author: |
ROCHANIA AYU YUNANDA, TOTO RUSMANTO, MOHAMMAD ALI TAREQ, NURIL KUSUMAWARDANI
SOEPRAPTO PUTRI |
Abstract: |
Artificial intelligence (AI) has rapidly transformed numerous industries,
including banking. Despite global economic challenges, the banking sectors in
Indonesia and Malaysia have shown resilience and growth, actively adopting
digital technologies. This study examines the effect of AI features on trust in
the banking industry of both countries. Besides AI features, the study explores
other predictive factors such as income, gender, experience in using banking
services, and frequency of dealing with banking services. Questionnaires were
distributed to banking customers in Indonesia and Malaysia, and multiple
regression analysis was performed using SmartPLS 4. The findings reveal that AI
features alone do not affect customers' trust; some demographic factors also
play important roles. The implications of this study are significant for both
the banking sector and policymakers. By highlighting the role of AI in enhancing
customer trust, the study underscores the importance of integrating advanced
technologies to improve operational efficiency and customer experience. Prior
studies focus on examining the determinants of technology adoption; this study
is one of a very limited number of studies scrutinizing the effect of adopted
technological features on customers' trust in the banking industry. |
Keywords: |
AI Features, Banking, Trust, Income, Banking Experience, Frequency, Indonesia,
Malaysia |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
DYNAMIC GRID-BASED CLUSTERING FOR NON-STATIONARY SPATIO-TEMPORAL EVENT
PREDICTION |
Author: |
M. VASAVI, A. MURUGAN, K. VENKATESH SHARMA |
Abstract: |
Spatio-temporal data is increasingly collected in domains such as urban
planning, traffic analysis, and environmental monitoring, in which events change
quickly over time as well as across different regions. Traditional
spatio-temporal clustering models cannot handle the characteristics of
non-stationary processes with quick and unpredictable changes. To overcome these
limitations, this study presents a dynamic grid-based clustering method that
adjusts its spatial and temporal parameters on the fly to improve the accuracy
of prediction and computational efficiency for non-stationary spatio-temporal
event analysis. We iteratively update grid sizes and time intervals by
considering the density of events and movement patterns of objects to capture
easily unseen clusters over time with minimum computing cost. The performance of
the proposed method was evaluated on two real-world datasets (i.e., urban
traffic data and environmental monitoring data) and compared with not only
several modified versions of baseline models like DBSCAN or Spatio-Temporal
k-Nearest Neighbor (STKNN) but also a naïve approach based on grid concept. The
results showed that our proposed dynamic grid-based model outperformed other
approaches in terms of both clustering quality (with Silhouette Coefficients
value 0.82) and computational time (1.8 seconds), but had comparable error-rate
prediction results (mean square error is 0.015). These achievements confirm that
our method can adapt to changes in real-time processing environments to meet the
needs for continuous event predictions. |
Keywords: |
Dynamic Grid-Based Clustering, Spatio-Temporal Data, Non-Stationary Events,
Real-Time Prediction, Traffic Monitoring, Environmental Monitoring. |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
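The grid idea at the heart of the abstract above can be sketched as bucketing events into cells and marking the cells whose event density crosses a threshold, which is where an adaptive method would then shrink the cell size. The cell size, threshold, and event coordinates below are illustrative assumptions, not the paper's parameters.

```python
from collections import Counter

def grid_counts(events, cell):
    """events: (x, y) pairs; returns Counter of grid cell -> event count."""
    return Counter((int(x // cell), int(y // cell)) for x, y in events)

def dense_cells(events, cell=10.0, threshold=3):
    """Cells dense enough to deserve finer spatial resolution."""
    return {c for c, n in grid_counts(events, cell).items() if n >= threshold}

events = [(1, 1), (2, 2), (3, 1), (2, 3), (25, 25), (41, 8)]
hot = dense_cells(events)
print(hot)  # the crowded cell near the origin
```

A dynamic variant would recompute these densities per time interval and subdivide only the hot cells, keeping cost low in sparse regions.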
|
Title: |
KNOWLEDGE DISCOVERY IN DATABASES FOR HOTEL SERVICE QUALITY IMPROVEMENT THROUGH
DATA-MINING APPROACH |
Author: |
YERIK AFRIANTO SINGGALEN, SIH YULIANA WAHYUNINGTYAS, YOHANES EKO WIDODO, MUHAMAD
NUR AGUS DASRA, RUBEN WILLIAM SETIAWAN |
Abstract: |
This study integrates knowledge discovery in databases (KDD) with sentiment
analysis techniques to evaluate customer feedback and improve service quality in
the hotel industry. The research emerges from the growing demand for data-driven
strategies in the highly competitive hospitality sector, where understanding
customer sentiment is crucial for enhancing guest satisfaction and retaining
market share. Machine learning models, including Support Vector Machine (SVM),
Decision Tree (DT), Naive Bayes Classifier (NBC), and k-nearest Neighbors
(k-NN), were employed to extract insights from unstructured text data,
identifying key factors that influence guest satisfaction. Results indicated
that the SVM model achieved the highest accuracy of 95.4%, with a precision of
93.22% and recall of 95.4%, showcasing its robustness in sentiment
classification. In contrast, NBC showed lower effectiveness with 79.09%
accuracy, while k-NN faced challenges in complex data scenarios, achieving
60.71% accuracy and an F-measure of 49.99%. The findings suggest that
integrating sentiment analysis into hotel management practices can boost
customer satisfaction by 20-30%, particularly in staff interaction and facility
maintenance. This research presents a novel approach by combining KDD
methodologies with qualitative sentiment analysis, moving beyond traditional
quantitative metrics to provide deeper insights into customer experiences.
However, reliance on online reviews as the primary data source may introduce
biases, potentially affecting the generalizability of results. Future research
should expand data collection to include structured surveys and direct
interviews, incorporate deeper semantic analysis, and utilize real-time
sentiment monitoring to enhance service management strategies. This approach
could lead to sustainable competitive advantage and continuous improvement in
the hospitality industry. |
Keywords: |
KDD, Hotel, Service Quality, Data Mining |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
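One of the classifiers the abstract above names, Naive Bayes, can be sketched for review sentiment in a few lines with Laplace-smoothed word counts. The tiny training snippets are invented for illustration, not the study's review corpus.

```python
import math
from collections import Counter, defaultdict

train = [
    ("staff friendly and room clean", "pos"),
    ("great pool and friendly service", "pos"),
    ("room dirty and staff rude", "neg"),
    ("noisy and dirty bathroom", "neg"),
]

counts = defaultdict(Counter)  # label -> word frequencies
labels = Counter()
for text, label in train:
    labels[label] += 1
    counts[label].update(text.split())
vocab = {w for c in counts.values() for w in c}

def classify(text):
    def logp(label):
        total = sum(counts[label].values())
        lp = math.log(labels[label] / sum(labels.values()))
        for w in text.split():  # Laplace-smoothed word likelihoods
            lp += math.log((counts[label][w] + 1) / (total + len(vocab)))
        return lp
    return max(labels, key=logp)

print(classify("friendly staff"), classify("dirty room"))
```

SVMs, the study's best performer, instead learn a separating hyperplane over (typically TF-IDF) features, which handles correlated words better than Naive Bayes's independence assumption.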
|
Title: |
APPLYING BLOCKCHAIN TO ENHANCE IN-GAME ECONOMIES IN MULTIPLAYER FPS GAMES |
Author: |
ANATHA PINDHIKA HERMAWAN, MARIA SERAPHINA ASTRIANI |
Abstract: |
The contribution of blockchain technology, in particular the Ethereum platform,
for enhancing the security, transparency, and efficiency of the transactions
happening in multiplayer FPS games, has been examined. In this paper, a
blockchain-driven real-time transaction recording system is proposed and
implemented based on the Ethereum PoS consensus algorithm to minimize fraudulent
activities and data tampering. Extensive testing demonstrated that the
integration of blockchain significantly strengthened security measures and
transaction accuracy, while maintaining high game performance and seamless
player experience. By addressing two of the most common issues found in online
gaming, cheating and the vulnerability of centralized data, this study shows how
blockchain can truly change the face of game economies. The results highlight
the feasibility of integrating blockchain technology to provide a decentralized
and secure transaction framework, which will further help in creating fairer and
more transparent gaming contexts. Further studies should address the scalability
of blockchain marketplaces and its impact on player engagement and satisfaction,
adding to the growing debate on the importance of blockchain in game
development. |
Keywords: |
Blockchain, Game, First Person Shooter, Ethereum, Game Development |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
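The core tamper-evidence property the abstract above attributes to blockchain can be shown with a toy hash-chained ledger: each recorded transaction embeds the hash of the previous block, so altering any past entry invalidates the chain. This is a minimal sketch, not the paper's Ethereum PoS implementation.

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's canonical JSON form."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append(chain, tx):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"tx": tx, "prev": prev})

def valid(chain):
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append(ledger, {"item": "skin_01", "from": "A", "to": "B", "price": 5})
append(ledger, {"item": "skin_02", "from": "B", "to": "C", "price": 9})
print(valid(ledger))          # True
ledger[0]["tx"]["price"] = 1  # tamper with history...
print(valid(ledger))          # ...and the chain no longer verifies
```

On a real chain, consensus (PoS in the paper's case) additionally decides whose version of the ledger is authoritative across untrusting players.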
|
Title: |
COGNITIVE MODELS OF PATENT TRANSLATION |
Author: |
LADA BERDNYK, TETIANA VYSOTSKA, SVITLANA KOROTKOVA, NATALIA MOSKALENKO, OLESIA
CHERKASHCHENKO |
Abstract: |
The aim is to assess the effectiveness of cognitive models of translation for
the accuracy, speed, and quality of patent translation. The research employed
such methods as cognitive modelling, error analysis and the method of contrast
analysis. Traditional metrics of translation quality, such as Bilingual
Evaluation Understudy and Recall-Oriented Understudy for Gisting Evaluation,
were used in the study. The chi-squared test was also used. The results of the
study showed that the GPT-3 model with BLEU 0.85 and ROUGE 0.88 showed the best
quality indicators of patent translation, providing the highest accuracy and
smoothness of translation. The BERT model also performed well with BLEU 0.82 and
ROUGE 0.85, preserving the structure and semantics of the original. In contrast,
the LSTM and GRU models had lower values — BLEU 0.65 and 0.68, respectively,
indicating difficulties with the accuracy of translating specific terms. The
study revealed that the GPT-3 model provides the highest translation accuracy of
patent texts, followed by BERT. LSTM and GRU models showed medium results,
indicating the need for their further optimization. The results confirm the
importance of choosing the appropriate model for specific translation tasks.
Further research may address issues of improving the quality of text translation
using AI tools. |
Keywords: |
Neural Networks, ChatGPT, Google Gemini, Translation Quality, Error Analysis,
Patent Texts. |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
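The BLEU scores reported in the patent-translation entry above are geometric means of clipped n-gram precisions with a brevity penalty. The sketch below is a simplified single-reference version with crude smoothing, not the implementation used in the study (which presumably relied on standard toolkits):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Geometric mean of clipped 1..max_n-gram precisions, times a brevity
    penalty; zero precisions are crudely smoothed to keep the log finite."""
    cand, ref = candidate.split(), reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(cand, n))
        ref_counts = Counter(ngrams(ref, n))
        clipped = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        log_prec += math.log(max(clipped, 1e-9) / total)
    geo_mean = math.exp(log_prec / max_n)
    # Brevity penalty discourages overly short candidate translations.
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / len(cand))
    return bp * geo_mean

score = bleu("the device comprises a sensor", "the device comprises a sensor")
# identical candidate and reference score 1.0
```

The example sentences are invented for illustration; production BLEU implementations add multi-reference support and more careful smoothing.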
Title: |
APPLICATION OF THE PIECES FRAMEWORK METHOD IN THE ANALYSIS OF USER SATISFACTION
LEVELS OF OASIS APPLICATION SERVICES |
Author: |
IKE KUSDYAH RACHMAWATI, THERESIA PRADIANI, AGUS RAHMAN ALAMSYAH, ABD HADI,
SYARIF HIDAYATULLAH |
Abstract: |
This study aims to analyze the level of user satisfaction of the OASIS (Online
School Information System Administration) application service using the Pieces
Framework method. This method includes six dimensions: Performance, Information,
Economics, Control, Efficiency, and Service, which allows for a thorough
evaluation of information systems (Hawkins, 2020). A quantitative approach and
survey design were used to collect data from 40 respondents consisting of
students, teachers, and education staff in Malang Regency. The specially
designed questionnaire uses the Likert scale to measure satisfaction levels. The
results of the analysis show that all domains in the Pieces Framework achieved
an average score above 4.38, with the Service domain recording the highest score
of 4.56. The average total user satisfaction reached 4.43, which is included in
the category of "VERY SATISFIED." Although most respondents are satisfied, there
is negative feedback that can be used to improve the quality of OASIS app
services. These findings provide valuable insights for developers to improve the
user experience in the future. |
Keywords: |
Satisfaction Level, Pieces Framework, OASIS Application, School |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
A CRITICAL REVIEW OF SOFTWARE QUALITY MODELS IN ACADEMIC INFORMATION SYSTEM |
Author: |
ASMAA JAMEEL AL-NAWAISEH, FADI BANI AHMAD, SABAH JAMIL AL-NAWAISEH |
Abstract: |
In the contemporary educational landscape, the quality of Academic Information
Systems (AIS) like E_learning System plays a pivotal role in enhancing the
efficiency and effectiveness of educational institutions. This study examines
the critical factors that influence the quality of AIS, including system
usability, data accuracy, user satisfaction, and the impact on academic
performance. Through a comprehensive analysis of existing literature and
empirical data, the research highlights the significance of robust AIS in
supporting administrative functions, academic processes, and decision-making.
The findings underscore the necessity for continuous evaluation and improvement
of these systems to ensure they meet the evolving needs of students, faculty,
and administrators. Furthermore, the study provides practical recommendations
for educational institutions to optimize their AIS, thereby fostering an
environment conducive to academic excellence and innovation. |
Keywords: |
AIS, Software quality, E_learning, Models, Information System, ISO 9126, CMMI |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
ASSESSING GAMIFICATION GAP IN SPORT APPLICATION FOR INCREASED USER MOTIVATION:
AN OCTALYSIS MODEL APPROACH |
Author: |
WILLIAM YEO, TANTY OKTAVIA |
Abstract: |
This study investigates the rapid decline in physical activities among
Indonesian populations and explores how such a trend contrasts with the overall
growth of electronic sports that utilizes gamification to attract user
participation and commitment over an extended period. There are numerous gamified
sport applications created to increase exercise participation; however, the
continued decline is representative of deficiencies in those applications'
motivational frameworks. This work, using the Octalysis model, delivers an
in-depth analysis of user motivation into eight core gamification drives, based
on the results of a survey presented to 101 users of sports applications. The
core drives related to intrinsic motivation—in particular, accomplishment and
empowerment—present high effectiveness; however, there are still important
deficiencies in social engagement, creative expression, and unpredictability
elements. These points have serious implications for an urgent need for targeted
refinements in gamification strategies to deliver a more engaging, adaptive,
socially interactive experience that is better at sustaining user motivation.
Indeed, such strategic adjustments have the potential not only to improve
retention rates and further enhance user experience but also to contribute
toward broader public health challenges through the fostering of sustainable
physical activity patterns. This study gives some very important recommendations
for app developers and policymakers seeking innovative solutions that are
digitally driven to combat sedentary lifestyle practices, therefore helping to
advance health and wellness objectives for a diverse user population across
Indonesia. |
Keywords: |
Sport Application, User Motivation, Gap Analysis, Gamification, Octalysis
Framework |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
DETECTION OF ABNORMAL FEATURES IN TRANSACTION DATA FOR MALICIOUS ATTACKS USING
HIERARCHICAL NETWORK FEATURE EXTRACTION |
Author: |
DAMODHARAN KUTTIYAPPAN, DR RAJASEKAR V |
Abstract: |
The growing number and complexity of financial transactions have made the
detection of fraudulent activity and cyberattacks a considerable issue for
organizations. Conventional rule-based systems and statistical techniques
frequently fail to detect complex assaults that masquerade as ordinary
transactions. This work presents a deep learning methodology for detecting
anomalous aspects in transaction data to efficiently identify potential threats.
A feature extraction technique has been used to find suitable features for
further processing of the dataset. Deep learning techniques are used to classify
transactions as normal or anomalous. This study introduces a deep learning
methodology for detecting anomalous transaction data, using advanced techniques
like CNNs, RNNs, and Autoencoders to identify potential threats. The model
outperforms conventional detection techniques in precision, recall, and
F1-score, providing insights into abnormal behavior, and aiding in attack
discovery and mitigation. The system evolves through ongoing learning, enhancing
its detection precision for changing assault patterns. |
Keywords: |
Deep Learning, CNN, RNN, Anomalies, Threats, Feature Extraction, Financial
Transaction |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
VIRTUAL REALITY AND INTERACTIVE TECHNOLOGIES IN CONTEMPORARY ART: AN ANALYSIS OF
CREATIVE OPPORTUNITIES |
Author: |
NATALIIA YUHAN, LARYSA KORNYTSKA, DENYS SUCHKOV, ZOYA ALFOROVA, VALERIIA BOIKO |
Abstract: |
This article aims to explore the transformative impact of virtual reality (VR),
artificial intelligence (AI), and interactive technologies on contemporary art.
This study provides a new perspective on how these technologies redefine art as
an immersive, responsive experience tailored to individual viewer interactions.
This research is conducted through a comprehensive literature review of existing
studies, articles, and documented cases in which VR, AI, and interactive
technologies play a role in artistic practices. Critical studies analyzed
include those emphasizing AI's capacity for generating personalized content and
VR's ability to create fully immersive environments. The originality of this
article lies in its combined focus on VR, AI, and interactivity in contemporary
art. By synthesizing these three areas, the article presents an innovative
theoretical framework for understanding immersive experiences and contributes to
rethinking the viewer's role as an active participant. The findings indicate
that these technologies are reshaping art by enhancing engagement and redefining
the creative process, offering new modes of collaboration between artists and
technologists. Ultimately, this research expands our understanding of
integrating modern technology in art, suggesting new approaches to artistic
experience and the shifting dynamics between artist, artwork, and viewer. |
Keywords: |
NFT art, Digital art, Interactive installations, Virtual expositions, Creative
technologies, Artistic innovations. |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
METHODICAL REVIEW OF MUTATION TESTING FOR SOFTWARE PROJECTS |
Author: |
MADHAVI KATAMANENI, DR C.S.S. ANUPAMA, CHETLA CHANDRA MOHAN, B. BALAJI BHANU, P
SWETHA NAGASRI, DR. J MANO RANJINI, PARUCHURI RAMYA, M BALA CHENNAIAH |
Abstract: |
A fault-based testing technique that has been extensively researched for more
than thirty years is mutation testing. Mutation testing is a rigorous, complex,
and expensive testing approach. It intentionally injects faulty lines of code to
produce programs that differ slightly from the correct original. It is a method
that assesses the quality of test data by examining whether the tests can
distinguish a set of mutant programs, each seeded with a specific type of
defect, from the program under test. Since mutation analysis is widely regarded
as an excellent testing strategy, it is commonly employed to evaluate test
criteria in terms of their mutation adequacy score. The literature on mutation
testing has contributed an array of methodologies, tools, optimizations, and
empirical results. This paper gives a comprehensive examination and review of
mutation testing. The survey provides evidence that mutation testing techniques
and tools are reaching a state of maturity and applicability, while the topic of
mutation testing itself is the subject of increasing interest. |
Keywords: |
Mutation Testing, Mutant, Mutant Adequacy Score, Syntax Errors, Cost, Mutant
Operators |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
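The mutation-testing idea reviewed above, seeding small syntactic faults and checking whether the test suite detects them, can be illustrated with a minimal sketch. The single mutation operator here (replacing `+` with `-`) is just one of many discussed in the literature, and the helper names are invented for this example (requires Python 3.9+ for `ast.unparse`):

```python
import ast

def make_mutants(src):
    """Create one mutant per '+' in the source by replacing it with '-'
    (a single classic mutation operator; real tools apply many more)."""
    n_adds = sum(isinstance(n, ast.BinOp) and isinstance(n.op, ast.Add)
                 for n in ast.walk(ast.parse(src)))
    mutants = []
    for target in range(n_adds):
        tree, seen = ast.parse(src), 0
        for node in ast.walk(tree):
            if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
                if seen == target:
                    node.op = ast.Sub()  # mutate exactly one operator
                seen += 1
        mutants.append(ast.unparse(tree))
    return mutants

def mutation_score(src, func_name, tests):
    """Fraction of mutants 'killed', i.e. detected by a failing test."""
    mutants = make_mutants(src)
    killed = 0
    for mutant_src in mutants:
        env = {}
        exec(mutant_src, env)  # compile and load the mutated function
        if not all(test(env[func_name]) for test in tests):
            killed += 1
    return killed / len(mutants) if mutants else 1.0

src = "def total(a, b, c):\n    return a + b + c\n"
tests = [lambda f: f(1, 2, 3) == 6]
score = mutation_score(src, "total", tests)  # both mutants die, score 1.0
```

A weak test suite (e.g. one that only checks `f(0, 0, 0) == 0`) kills neither mutant, which is exactly the inadequacy the mutation adequacy score exposes.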
Title: |
NEMAEP: A NOVEL ENSEMBLE MACHINE LEARNING FRAMEWORK FOR ACCURATE EFFORT
ESTIMATION IN SOFTWARE PROJECTS |
Author: |
PRATEEK SRIVASTAVA, NIDHI SRIVASTAVA, RASHI AGARWAL, PAWAN SINGH |
Abstract: |
The increasing complexity of software engineering projects has made accurate
effort estimation a formidable challenge. We introduce the NEMAEP framework to
address existing methods' shortcomings and enhance precision. This framework
integrates ensemble learning techniques, specifically the CatBoost gradient
boosting algorithm, with Grid Search cross-validation for optimizing
hyperparameters. Our robust predictive methodology substantially improves the
accuracy and reliability of software effort estimation. We evaluated NEMAEP
against well-established regression techniques, including support vector
regressor, decision tree, random forest, and multi-layer perceptron. Two
datasets were utilized: Cocomo81, comprising 63 projects, and a more extensive
China dataset, containing 499 projects. We assessed performance using MAE, RMSE,
and R². The NEMAEP methodology demonstrated superior predictive capabilities,
achieving accuracy rates of 97% for Cocomo81 and 99% for the China dataset. When
applied to the China dataset, the model produced MAE and RMSE scores of 0.0154
and 0.0213, respectively. In Cocomo81, the values were 0.013 and 0.0641. This
innovative strategy allows for the effective maximization of resource
utilization in software project management. This is achieved by providing
project managers with a data-driven tool to navigate modern software development
projects. |
Keywords: |
Software Project Management, Software Effort Estimation, Machine Learning,
Ensemble Learning, Grid Search |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
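The Grid Search hyperparameter optimization used in the NEMAEP entry above amounts to exhaustively scoring every parameter combination. This is an abstract sketch with a toy objective standing in for the cross-validated error of a CatBoost model; the grid values and parameter names are illustrative, not the paper's:

```python
from itertools import product

def grid_search(param_grid, evaluate):
    """Exhaustively score every hyperparameter combination and keep the
    one with the lowest value of `evaluate` (e.g. cross-validated error)."""
    names = sorted(param_grid)
    best_params, best_score = None, float("inf")
    for values in product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        score = evaluate(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy objective standing in for a model's cross-validated error.
grid = {"depth": [4, 6, 8], "learning_rate": [0.03, 0.1, 0.3]}
best, err = grid_search(grid,
                        lambda p: abs(p["depth"] - 6) + abs(p["learning_rate"] - 0.1))
# best is {"depth": 6, "learning_rate": 0.1}
```

In practice one would plug in a real estimator and k-fold evaluation (e.g. scikit-learn's `GridSearchCV` does both in one object); the exhaustive loop above is the core idea.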
Title: |
EVALUATING THE IMPACT OF 3D PRINTERS FOR CREATING INNOVATIVE POSTER DESIGNS |
Author: |
YEVHEN GULA, OKSANA MAZNICHENKO, SVIATOSLAV PODLEVSKYI, ALLA OSADCHA, YELYZAVETA
DEREVIANKO |
Abstract: |
The aim of the study is to determine the features of using 3D printing, while
focusing on the creation of innovative poster designs. The research methods at
the initial stage of the study were methods of
comparison, calculations of the correlation coefficient, analysis of variance,
t-test, calculating Pearson coefficient. It was determined that innovative
approaches to creating poster designs are abstract shapes (t=1.783), the use of
elements created by artificial intelligence (t=1.772). Illusionary design
elements (t=1.743), combinations of mismatched parts (t=1.756), and minimalism
(t=1.732) can also be taken into consideration. It was found that the 3D printer
from Mcor Technologies has the greatest efficiency in 3D printing. The process
is related to the realism of the created drawing (93%), the quality of the color
scheme (95%), and the scope of application (93%). With the help of the Anycubic
Mega Pro printer, it is possible to convey the quality of the composition
structure (92%) while providing multi-color printing. It was established that
when developing a layout, for high-quality printing it is imperative to take
into account color (t=1.783), design elements’ dimensions (t=1.770) as well as
the contour complexity (t=1.765). It was found that the obtained findings have a
positive practical impact on prospective designers, which is due to the
achievement of high results by students of Group 1 in the creation of design
projects for 3D printing (92 points). The research objectives are focused on
comparing the excellence of design projects when employing interactive versus
traditional methods in their creation. |
Keywords: |
Interactive Technologies, Design Project, Three-Dimensional Elements, Abstract
Shapes, Color Scheme. |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
REINFORCING QUERY-BASED SURVEILLANCE SYNOPSIS SECURITY THROUGH DISTRIBUTED
COMPUTING |
Author: |
HASSAN I. SAYED AHMED, RASHA SHOITAN, GHADA F. ELKABBANY, MONA M. MOUSSA,
MOHAMED S. ABDALLAH, YOUNG-IM CHO |
Abstract: |
Video synopsis facilitates the efficient analysis of surveillance footage by
condensing extended video content into shorter segments and reorganizing moving
objects temporally based on a predefined objective function. Various approaches
have been developed to address challenges in video synopsis, such as object
collisions, maintaining chronological order, reducing computational time, and
managing scene complexity by grouping activities based on user-defined queries.
However, many query-based detection techniques often overlook specific tasks
like detecting intrusions, such as unauthorized entry, identifying individuals
wearing unauthorized clothing, or detecting person substitution. This research
extends one of the conventional query-based approaches by characterizing each
object tube through its motion and visual features to enhance the detection of
such intrusions. Moreover, it implements query-based video synopsis generation
using a distributed architecture that utilizes multiple servers to boost
performance. While the distributed model effectively improves the query-based
computational performance, it introduces potential security risks,
including object tube substitution during cross-device transfers and the
possibility of tube loss. The proposed system contributes by building a
distributed system architecture that reduces computational complexity and
provides a comprehensive solution that balances efficiency, security,
and reliability in handling surveillance video synopsis. It offers a two-layer
security model; the first layer embeds a watermark within each object tube to
safeguard against substitution during processing. The second layer serializes
each object tube, ensuring no loss occurs during transmission, thus preserving
the integrity of the entire process. This novel approach integrates distributed
computing, watermarking, and serialization to enhance both the efficiency and
security of the video synopsis process, providing a robust solution for
large-scale surveillance video analysis. The simulation results indicate that
the security module effectively extracts the watermark with a reasonable PSNR
and identifies any tube loss. Additionally, the analysis reveals that utilizing
a distributed system greatly enhances performance, which is critical for
real-time applications. Specifically, the proposed distributed model reduces
computation time by approximately 94% compared to sequential execution when
implemented across twenty machines. |
Keywords: |
Video Synopsis; Video Surveillance; Distributed Computing; Object-Based
Watermark; Tube Serialization |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
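The first security layer described in the entry above embeds a watermark in each object tube so that substitution during cross-device transfer is detectable. A minimal least-significant-bit sketch conveys the idea; the paper's actual watermarking scheme is presumably more robust than this illustration:

```python
def embed_watermark(tube_bytes, mark_bits):
    """Write watermark bits into the least significant bit of each byte,
    a minimal stand-in for the paper's object-tube watermarking layer."""
    out = bytearray(tube_bytes)
    for i, bit in enumerate(mark_bits):
        out[i] = (out[i] & 0xFE) | bit
    return bytes(out)

def extract_watermark(tube_bytes, n_bits):
    """Read the watermark back from the low bits of the first n_bits bytes."""
    return [b & 1 for b in tube_bytes[:n_bits]]

tube = bytes([200, 201, 202, 203])   # stand-in pixel data for one object tube
mark = [1, 0, 1, 1]
stamped = embed_watermark(tube, mark)
assert extract_watermark(stamped, 4) == mark  # mark survives; pixels barely change
```

If a tube is substituted in transit, the extracted bits no longer match the expected watermark, which is the detection condition the layer relies on.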
Title: |
COMPARISON OF THE ENSEMBLE XGBOOST AND TRANSFORMER MODELS WITH MACHINE LEARNING
FOR CLASSIFICATION OF INDONESIAN MUSIC MOODS OF THE 70'S AND 80'S ERA |
Author: |
SARTIKA LINA MULANI SITIO, THOYYIBAH.T, MARYANI, MAULANA ARDHIANSYAH, NENY
ROSMAWARNI, YAN MITHA DJAKSANA, NUNIK DESTRIA ARIANTI |
Abstract: |
The aim of this research is to compare the Xgboost and transformer machine
learning models. In the field of psychology, research on musical mood aims to
find out why humans have an emotional response to music. In the field of Music
Information Retrieval (MIR), research on the emotion and mood of music aims to
create music metadata to make it easier to manage and retrieve music as an
entity. The method used in this research is the CRISP-DM method with the stages
of business understanding, data understanding, data preparation, modeling,
evaluation and deployment. This research has only reached the evaluation stage.
This research used 1,160 records of lyrics and audio data covering 31,923
distinct words. This research uses a dataset of Indonesian songs from the 70's
and 80's. The labels used for mood classification are happy, sad and neutral.
The lyric representation used is word embedding, while the audio representation
consists of Chromagram and Mel-Spectrogram features. This research uses the
Xgboost model and transformer machine learning. Across the analyses performed,
the transformer model achieved an accuracy of 98%. |
Keywords: |
Indonesian Music, Xgboost, Transformer, CRISP-DM |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
EMPIRICAL INVESTIGATIONS TO PREDICT STOCK PRICE USING REGRESSION TO IMPROVE THE
TRENDS OF INFORMATION TECHNOLOGY |
Author: |
G. RESHMA, SUDARSHAN TUMKUNTA, T. GAYATHRI, T.V. HYMA LAKSHMI, N. NEELIMA,
GARIGIPATI RAMA KRISHNA, BORRA BHAVITHA |
Abstract: |
Who does not wish to accumulate riches through astute investment? In actuality, numerous
trading, financial, and even technological firms have been actively researching
stock market movements and stock price prediction. Machine learning techniques
have been used to generate a number of algorithms for stock price prediction.
Here, we'll concentrate on mastering a number of well-known regression methods,
such as support vector regression, regression tree, regression forest, and
linear regression, and using them to solve this billion-dollar problem. The
prediction's primary goal is to lessen the ambiguity surrounding investment
decision-making. The random walk view of the stock market suggests that the
best forecast for tomorrow's value is today's value. Investors' beliefs are
impacted by the stock market indices' extreme volatility. Because of the
fundamental characteristics of the financial industry and, in part, the
combination of known and unknown factors, stock prices are thought to be highly
volatile and subject to abrupt fluctuations. There have been various attempts to
predict stock price utilizing Machine Learning. These research endeavours
differ from one another in three ways. Target price changes might be
short-term (tomorrow to a few days later), long-term (months later), or
near-term (less than a minute). Less than ten specific companies, stocks in a
specific industry, or almost all stocks can be included in the collection.
Worldwide news and economic trends, specific firm attributes, or just time
series data of the stock price can all be employed as predictors. |
Keywords: |
Stock, Regression, Trends, IT, Price. |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
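The random-walk observation in the stock-prediction entry above implies a naive baseline, forecast tomorrow's price as today's, that any regression model must outperform to be useful. A minimal sketch of that baseline (with invented prices for illustration):

```python
def naive_forecast_mae(prices):
    """Random-walk baseline: forecast tomorrow's price as today's price,
    then report the mean absolute error of those forecasts."""
    errors = [abs(prices[t] - prices[t - 1]) for t in range(1, len(prices))]
    return sum(errors) / len(errors)

mae = naive_forecast_mae([100.0, 101.0, 99.5, 100.5])  # (1.0 + 1.5 + 1.0) / 3
```

Reporting a regressor's error alongside this baseline makes clear whether the model has actually learned anything beyond the random walk.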
Title: |
REVOLUTIONIZING INFORMATION RETRIEVAL: UNVEILING A NEXT-GENERATION AI-POWERED
QUESTION-ANSWER SYSTEM FOR COMPREHENSIVE DOCUMENT ANALYSIS |
Author: |
ZAHER NAJWA, GHAZOUANI MOHAMED, CHAFIQ NADIA |
Abstract: |
Following the introduction of a new public procurement decree in Morocco, public
institutions and businesses have faced challenges in comprehending the updated
rules for awarding contracts. The complexity of the new decree has made it
difficult for these entities to adapt to the changes, underscoring the necessity
for detailed guidance and training. To address these issues, this article
suggests creating an advanced system to analyze documents related to the new
law, aiding employees by simplifying access to crucial information. Utilizing
state-of-the-art text analysis and natural language processing, the proposed
tool aims to enhance understanding and compliance with the decree by making
legal information more accessible and easier to navigate. Different techniques
were compared, namely, spaCy, Langchain, GloVe, BERT, and OpenAI Embeddings. In
conclusion, our experiment has demonstrated that Langchain and OpenAI Embeddings
surpassed the other techniques in terms of performance. Specifically,
Langchain's specialized approach to text splitting proved to be exceptionally
efficient in preprocessing documents for analysis, allowing for more nuanced
segmentation that facilitated deeper understanding in subsequent processing
stages. Similarly, OpenAI Embeddings offered superior capabilities in capturing
the semantic richness of text, enabling our system to achieve higher accuracy
and relevance in its responses. |
Keywords: |
Question-answer system, OpenAI, Langchain |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
MODEL OF UTAUT 2 FOR THE ADOPTION OF BLOCKCHAIN TECHNOLOGY IN IMPROVING QUALITY
OF FINANCIAL REPORTING |
Author: |
DEVYLYA METRINESYA, FRISHELLA AMANDA YOKO, ANG SWAT LIN LINDAWATI |
Abstract: |
This study explores the adoption of blockchain technology to enhance the quality
of financial reporting by utilizing the UTAUT2 model to assess the factors
influencing its acceptance among internal accountants. Blockchain technology is
essential to improve transparency, security, and accuracy in financial data.
Despite the potential benefits of blockchain technology, the complexity of its
application in financial records still requires further understanding. The study
applies a quantitative approach, with data collected using questionnaires
distributed to industry experts in the financial sector. The key findings reveal
that most of the UTAUT2 factors have a significant positive effect on blockchain
adoption, except for social influence, which has a significant negative effect,
while hedonic motivation and price value have no significant influence on
blockchain adoption. The conclusions of the study highlight the
importance of technology-driven infrastructure development in financial
reporting to present relevant, verifiable, comparable, understandable, and
reliable financial statements. The rapid dynamics of the development of
blockchain technology in various industries, including the financial sector, may
limit the relevance of the findings in this study over time. Therefore, this
study concludes by recommending exploring higher adoption by expanding the scope
of future research. |
Keywords: |
Blockchain, Technology, Accounting, Financial Reporting, Distributed
Ledger, UTAUT2 |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
SECURE AND EFFICIENT MEDICAL IMAGE ENCRYPTION FOR MEDICAL CYBER PHYSICAL SYSTEM
USING CHAOTIC WITH DNA SEQUENCE |
Author: |
KAILASAM SELVARAJ, KARTHEEBAN KAMATCHI |
Abstract: |
In contrast to the conventional healthcare industry, Medical Cyber-Physical
Systems increasingly utilize smart sensor technology and health apps to ensure
enhanced healthcare. Medical imaging has become essential in recent years for
disease diagnosis. These images, including X-rays, ultrasound, and brain scans,
contain private and sensitive information. There are many difficulties with the
secure sharing and storing of medical images. The protection of medical
information is increased through the use of encryption. The research recommends
secure and lightweight medical image encryption using chaotic maps with DNA
sequences for a medical cyber-physical system. First, the image is divided into
blocks, rotated based on local pixel intensity variance, and shuffled using
random permutation. The shuffled blocks are then reassembled, and a bitwise XOR
operation completes the encryption, with a key generated from chaotic logistic
maps and DNA symbols. Analytical evaluations show the proposed method
outperforms earlier techniques, offering stronger security for medical data. |
Keywords: |
MCPS, Medical Image, Encryption, Chaotic, DNA, XOR Operation, Image Blocks,
Image Rotation, Imaging, Measure and Integration |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
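The key-generation and XOR steps described in the entry above can be sketched with a keystream drawn from a chaotic logistic map. The parameter values below are illustrative only, and the paper's full scheme additionally includes block rotation, shuffling, and DNA coding:

```python
def logistic_keystream(x0, r, n):
    """Derive n keystream bytes from the chaotic logistic map x -> r*x*(1-x).
    The seed x0 and parameter r act as the secret key (values here are
    illustrative, not the paper's)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)  # quantize the chaotic state to a byte
    return bytes(out)

def xor_cipher(data, x0=0.54321, r=3.99):
    """XOR the data with the keystream; applying it twice decrypts."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

pixels = bytes(range(16))         # stand-in for one block of image pixels
enc = xor_cipher(pixels)
assert xor_cipher(enc) == pixels  # XOR with the same keystream inverts itself
```

Because XOR is its own inverse, encryption and decryption share one function; the security rests entirely on keeping the chaotic seed and parameter secret.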
Title: |
UTILITY LIST-BASED MINING AND RECURRENT NEURAL NETWORK FOR UTILITY ITEMSET
MINING BASED TRANSACTIONAL DATA |
Author: |
G.N. SOWJANYA, M. BABU REDDY |
Abstract: |
The study of Frequent Itemset Mining (FIM) and High-Utility Itemset Mining
(HUIM) is essential because it provides practical insights to improve company
outcomes and explains customer behavior. To detect high-utility itemsets,
including those with negative utilities, HUIM algorithms have been efficiently
developed. Utility Itemset Mining (UIM) has changed into an important area of
research aimed at identifying high-utility patterns in transactional databases,
where item utility is measured using criteria such as profit, frequency, or
relevance. This research introduces a new framework that combines Utility
List-Based Mining (ULBM) with Recurrent Neural Networks (RNNs) for effective and
scalable utility itemset mining in transactional data. The utility list
structure stores the relevant utility data for each itemset, enabling direct
and efficient calculation of utility values without the need for candidate
generation. This considerably reduces memory usage and processing costs.
To enhance prediction abilities and capture temporal relationships amongst
transactions, we use an RNN, specifically a Long Short-Term Memory (LSTM)
network, to detect sequential patterns in transactional data. This hybrid
technique leverages the utility list's efficiency in performing utility
computations while the RNN learns temporal correlations between itemsets and
predicts high-utility itemsets across transaction sequences. Our experiments
validate that the proposed method, RNN-UBLM, not only outperforms existing
utility mining algorithms in terms of accuracy and effectiveness but also excels
at capturing dynamic, time-varying utility patterns. This approach is
particularly well-suited for applications in e-commerce, retail analytics, and
other sectors where utility-based decision-making is critical. |
Keywords: |
Frequent Itemset Mining, High-Utility Itemset Mining, Utility List-Based Mining,
Transactional data, Recurrent Neural Networks. |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
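The notion of a high-utility itemset used in the entry above (utility as quantity times profit, summed over transactions) can be illustrated with a brute-force sketch over toy data. Utility-list algorithms such as the one proposed reach the same answer without this exhaustive candidate enumeration, which is what makes them scalable:

```python
from itertools import combinations

# Toy transactions: item -> (quantity, unit profit). Invented for illustration.
transactions = [
    {"bread": (2, 1.0), "milk": (1, 1.5)},
    {"bread": (1, 1.0), "milk": (2, 1.5), "cheese": (1, 4.0)},
    {"cheese": (2, 4.0)},
]

def utility(itemset, tx):
    """Utility of an itemset in one transaction (0 if any item is absent)."""
    if not all(item in tx for item in itemset):
        return 0.0
    return sum(tx[item][0] * tx[item][1] for item in itemset)

def high_utility_itemsets(txs, min_util):
    """Brute-force HUIM: sum each itemset's utility over all transactions
    and keep those meeting the minimum-utility threshold."""
    items = sorted({item for tx in txs for item in tx})
    result = {}
    for k in range(1, len(items) + 1):
        for combo in combinations(items, k):
            total = sum(utility(combo, tx) for tx in txs)
            if total >= min_util:
                result[combo] = total
    return result

hui = high_utility_itemsets(transactions, min_util=8.0)
# {("cheese",): 12.0, ("bread", "cheese", "milk"): 8.0}
```

Note how an itemset can be high-utility without being frequent: cheese appears in only two transactions, yet its profit makes it the top itemset, the behavior frequency-based mining misses.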
Title: |
INSURECHAIN: A BLOCKCHAIN-BASED SYSTEM FOR SECURE, EFFICIENT AND INTEROPERABLE
INSURTECH |
Author: |
EMAN YASER DARAGHMI, HAZIM HARB, YOUSEF AWWAD DARAGHMI |
Abstract: |
Insurance Technology (InsurTech) solutions aim to transform traditional
insurance models to more personalized services, streamlined claims processing
and faster delivery. However, records in InsurTech are fragmented and isolated,
rather than interoperable and cohesive. The need for multi-party access to
insurance records has raised interoperability challenges between clients and
providers, which pose additional barriers to effective data sharing.
Additionally, insurance data face increasing threats since several advanced
techniques have been developed to violate digital privacy and security.
Therefore, Blockchain, which is a distributed trusted immutable database
solution and ledger technology, can solve these problems. Although several
studies were proposed to employ the Blockchain for managing insurance records,
there is still a need for more research to better understand, characterize and
evaluate its utility in InsurTech systems. This paper proposes a
Blockchain-based InsurTech system called InsureChain that provides
interoperable, secure, and efficient access to the records by providers, clients
and third parties while maintaining privacy. The InsureChain employs smart
contracts for effectively managing the claims of clients, governing
transactions, and monitoring computations through the enforcement of acceptable
usage policies. For better efficiency, the InsureChain uses Proof of Authority
(PoA) and an incentive mechanism that leverages the degree of provider’s nodes
from the perspective of InsurTech systems by measuring their efforts regarding
maintaining records and creating new blocks. The system was evaluated by
reviewing privacy, integrity of communication channels, and security, and by
comparing consensus algorithms. The results show that the system achieves high
privacy, integrity, and secure communication. The PoA outperforms proof of work
and proof of stake. |
Keywords: |
Insurance, InsurTech, Privacy, Confidentiality, Cybercrime, Blockchain, smart
contracts |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
|
Title: |
PREDICTION OF POWER OUTPUT ON EXTERNAL COMBUSTION ENGINE USING REGRESSION MODELS |
Author: |
RIZQI FITRI NARYANTO, MERA KARTIKA DELIMAYANTI, RIZKY ADI, ABDURRAHMAN, PUTRI
KHOIRIN NASHIROH, IMAM SUKOCO, FIQRI FADILLAH FAHMI, DAMAI YUDHA AKBAR EFFENDI,
AFRILZA DAFFA NARYAPRAMONO |
Abstract: |
This study explores the application of machine learning regression models to
predict power output in External Combustion Engine on Combined Cycle Power
Plants (CCPPs) using a comprehensive dataset of 9,568 hourly observations from
2006 to 2011. Key ambient variables include temperature, pressure, humidity, and
vacuum. To prevent overfitting, a 5x2 fold cross-validation strategy is
employed, generating 10 unique training and testing sets. Several models are
assessed, including Random Forest, XGB Regressor, Extra Trees, Hist Gradient
Boosting, and LGBM Regressor. XGB Regressor demonstrates superior performance
with a Mean Absolute Error (MAE) of 2.41 and Root Mean Squared Error (RMSE) of
3.37, making it the most accurate model. Additionally, the performance of
ensemble models further highlights their reliability in predicting power output.
The study emphasizes the importance of advanced machine learning techniques in
optimizing power predictions, balancing computational efficiency, accuracy, and
interpretability for large-scale industrial applications. Boosting Regressor
provides a more equitable compromise between computational efficiency and
performance, rendering it well-suited for implementations on a large scale.
Furthermore, despite its marginally diminished accuracy, the Random Forest
Regressor offers significant insights via the feature importance analysis,
thereby augmenting interpretability. This study underscores the significance of
sophisticated machine learning models in enhancing the precision and
effectiveness of power output forecasts in CCPPs. It stresses balancing
interpretability, computational cost, and accuracy in real-world applications. |
Keywords: |
Cross-Validation, External Combustion Engine, Machine Learning Models, Power
Output, Performance Metrics |
Source: |
Journal of Theoretical and Applied Information Technology
31st December 2024 -- Vol. 102. No. 24 -- 2024 |
Full
Text |
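The 5x2 fold cross-validation protocol described in the entry above can be sketched as follows: five independent shuffles, each cut into two halves that serve as train/test and then swap roles, yielding the ten train/test sets mentioned in the abstract. The function name and seed are illustrative:

```python
import random

def five_by_two_splits(n_samples, seed=0):
    """5x2 cross-validation: 5 independent shuffles, each split into two
    halves that act as train/test and then swap roles (10 sets total)."""
    rng = random.Random(seed)
    idx = list(range(n_samples))
    splits = []
    for _ in range(5):
        rng.shuffle(idx)
        half = n_samples // 2
        a, b = idx[:half], idx[half:]   # slicing copies, so later shuffles
        splits.append((a, b))           # do not disturb earlier splits
        splits.append((b, a))           # same halves with roles swapped
    return splits

splits = five_by_two_splits(9568)  # dataset size from the abstract
```

Each regressor would then be fitted on every train index set and scored (MAE, RMSE) on the paired test set, averaging over the ten runs to guard against overfitting to any single split.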
|
|
|