Submit Paper / Call for Papers
The journal receives papers in continuous flow and will consider articles from
a wide range of Information Technology disciplines, from the most fundamental
research to the most innovative technologies. Please submit your papers
electronically through our submission system at http://jatit.org/submit_paper.php
in MS Word, PDF, or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please remember to include all your personally identifiable information in the
manuscript before submitting it for review, and we will remove the necessary
information on our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
Journal of Theoretical and Applied Information Technology
September 2022 | Vol. 100 No. 17 |
Title: |
USER EXPERIENCE PERCEPTION OF ANDROID-BASED THERAPIST SPA RESERVATION
APPLICATION (A CASE STUDY OF BABY SPA APPLICATION USERS) |
Author: |
CARUDIN, UWAY WARIAH, ASEP JAMALUDIN, NELLY APRININGRUM, DADANG YUSUP |
Abstract: |
The growing public demand for home-care baby spa services has pushed businesses
in this field to compete online in providing the best service. These efforts
center on the use of information technology for marketing, one channel being
mobile applications. Because applications are currently believed to reach a
broad market, they require particular attention to design and content: design
and content shape the user experience and determine whether people are
interested in using the application. This study identifies how the User
Experience (UX) concept applies to a baby spa therapist reservation
application, so that it can serve as a reference for developers who want to
build similar mobile applications. Statements for respondents are based on
four fundamental UX variables, namely Value, Adoptability, Desirability, and
Usability, measured on a Likert scale. The statistical methods used are
validity and reliability tests. The questionnaire results are then presented
as a descriptive analysis of the four UX variables, while a "YES & NO"
classification method is used to find the most important fundamental UX
elements. The results show that the user experience variables respondents
consider most important are Value, Adoptability, and Desirability: people use
this baby spa application to reserve a baby spa therapist because it is very
easy to learn, has an attractive visual design, and is easy to access. |
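The reliability test mentioned in the abstract is commonly Cronbach's alpha for Likert-scale instruments; the sketch below (with made-up responses, not the paper's data) shows how such a reliability coefficient can be computed:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal-consistency reliability for a respondents x items Likert matrix."""
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of respondent totals
    return (n_items / (n_items - 1)) * (1 - item_vars / total_var)

# Hypothetical 5-point Likert answers: 6 respondents x 4 usability items.
responses = np.array([
    [5, 4, 5, 4],
    [4, 4, 4, 5],
    [2, 2, 3, 2],
    [5, 5, 4, 4],
    [3, 3, 3, 3],
    [4, 5, 5, 4],
])
alpha = cronbach_alpha(responses)
print(f"Cronbach's alpha = {alpha:.3f}")  # values above ~0.7 are commonly read as acceptable
```

Item validity would additionally be checked with item-total correlations, which the paper's validity test presumably covers.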
Keywords: |
Baby Spa, User Experience, Information Technology, Fundamental UX Variables |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
CHINESE CHARACTER RECOGNITION USING SUPPORT VECTOR MACHINE |
Author: |
ASLINA BAHARUM, ROZITA ISMAIL, SHALIZA HAYATI A. WAHAB, FARHANA DIANA DERIS,
NOORSIDI AIZUDDIN MAT NOOR, MOHD SHAREDUWAN MOHD KASIHMUDDIN |
Abstract: |
Optical character recognition is the process of scanning and detecting the
words in images so that a machine can identify and classify each character.
Chinese characters form one of the world's most widely used writing systems,
used by more than one-quarter of the world's population in daily
communication. Chinese characters can be considered difficult to recognize
because they have many categories, complex character structures, strong
similarities between characters, and various fonts and writing styles. There
are many known machine learning algorithms for character recognition, but not
all can classify Chinese characters with high speed and accuracy. Therefore,
this paper proposes recognizing Chinese characters using support vector
machines. A support vector machine is a two-class classifier widely used in
classification tasks; it produces very accurate results even across many
classes, making it suitable for recognizing Chinese characters. |
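As a rough sketch of the classification step, with synthetic feature vectors standing in for real character features (the paper's actual feature extraction is not reproduced here):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Hypothetical stand-in for character features: two Gaussian clusters play the
# role of pixel/stroke descriptors for two character classes.
X0 = rng.normal(loc=0.0, scale=0.5, size=(50, 8))
X1 = rng.normal(loc=2.0, scale=0.5, size=(50, 8))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="rbf", C=1.0)  # margin-based binary classifier
clf.fit(X, y)
acc = clf.score(X, y)
print(f"training accuracy = {acc:.2f}")
```

For the thousands of Chinese character classes, scikit-learn's `SVC` extends the two-class formulation automatically via one-vs-one voting.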
Keywords: |
Support Vector Machine, Optical Character Recognition (OCR), Character,
Classifier, Feature |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
EXPERIMENTAL INVESTIGATIONS TO AUTHENTICATION OF PERSON USING HAND RADIOGRAPHS |
Author: |
RAM GOPAL MUSUNURU, Dr. T SIVAPRAKASAM, Dr G KRISHNA KISHORE, Dr. R
THIRUVENGATANADHAN |
Abstract: |
Authentication is the most common way of verifying a user's identity: the
credentials supplied by the user are compared with those on record in a
database of authorized users, either on a local operating system or within an
authentication server. Biometric radiographs are used to identify individuals
during crime and disaster incidents, and authentication and identification of
a person have recently become a significant part of most automation systems.
Conventional biometric identifiers such as fingerprints, iris, face, and palm
patterns fail to recognize a person when the external biometric parts have
been damaged by disease, lesions, or severe burns. Security, robustness,
privacy, and non-forgeability are the critical requirements of any
person-authentication system, and in such circumstances identification based
on radiographs of the skull, hand, and teeth remains a viable substitute. In
our study, a novel forensic hand-radiograph-based human authentication is
presented using a deep neural network. We use a convolutional neural network
(CNN) architecture designed for feature extraction from hand radiographs and
for recognition. For this work, 1400 different hand radiographs were
collected, covering a range of age groups, professions, and sexes. Our
investigation reveals that hand radiographs carry biometric information that
can be used to identify people in disaster-victim identification. The
experimental study also shows that the proposed approach is significantly more
effective than traditional methods of person authentication using hand
radiographs. The network uses four layers: the input image first passes
through a convolution layer, then through a ReLU layer to remove negative
values; the output is passed to a pooling layer where max or average pooling
is applied, and finally to a fully connected layer that performs the final
classification and determines whether the hand radiograph is valid or
invalid. |
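The four-layer pipeline described above (convolution, ReLU, pooling, fully connected) can be sketched as a minimal NumPy forward pass; the weights and the 16x16 "radiograph" below are random stand-ins, not a trained model:

```python
import numpy as np

rng = np.random.default_rng(42)

def conv2d(img, kernel):
    """Valid 2-D convolution (really cross-correlation, as in most CNN libraries)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)          # discard negative activations

def max_pool(x, size=2):
    h, w = x.shape
    h, w = h - h % size, w - w % size  # crop to a multiple of the pool size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Toy "radiograph": a 16x16 grayscale patch.
img = rng.random((16, 16))
kernel = rng.normal(size=(3, 3))
fc_w = rng.normal(size=(49, 2))        # 14x14 conv map -> 7x7 pooled -> 49 -> 2 classes

features = max_pool(relu(conv2d(img, kernel)))
logits = features.reshape(-1) @ fc_w   # fully connected layer: valid vs invalid
pred = int(np.argmax(logits))
print("feature map shape:", features.shape, "predicted class:", pred)
```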
Keywords: |
Authentication, Hand Radiographs, CNN, Investigation, Biometric |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
OPTICAL HEVC CRYPTOSYSTEM USING INPUT AND FRACTIONAL FOURIER PLANES RANDOM
ENCODING |
Author: |
MOHAMMED A. ALZAIN |
Abstract: |
This paper presents an efficient optical High Efficiency Video Coding (HEVC)
cryptosystem using random encoding in the input and Fractional Fourier
Transform (FrFT) planes. Encryption starts by separating each HEVC plain-video
frame into its RGB channel components; each component is modulated in the
input plane with the first random phase mask (RPM1). The FrFT is then applied
to each component, each component is modulated again in the FrFT plane with a
second mask (RPM2), and the inverse FrFT is employed. Finally, the RGB channel
components are merged to obtain the HEVC cipher-video frame. Decryption
reverses the process: each cipher-video frame is separated into RGB channel
components, each component is subjected to the FrFT and modulated in the FrFT
plane using the conjugate of RPM2, then the inverse FrFT is applied and each
component is modulated in the input plane using the conjugate of RPM1.
Finally, the decrypted RGB channel components are combined to obtain the
deciphered HEVC frame. The proposed cryptosystem is tested with different
security metrics, including visual inspection, statistical, cipher-quality,
differential, and occlusion tests. The outcomes of these tests confirm the
effectiveness of the proposed optical HEVC cryptosystem using input and FrFT
planes random encoding. |
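As a rough illustration of the double-random-phase encoding behind this scheme, the sketch below substitutes the ordinary 2-D FFT for the FrFT (a simplification: the fractional orders are part of the real system's key space) and processes a single toy channel:

```python
import numpy as np

rng = np.random.default_rng(1)

def encrypt(channel, rpm1, rpm2):
    # Modulate in the input plane, transform, modulate in the transform plane.
    return np.fft.ifft2(np.fft.fft2(channel * rpm1) * rpm2)

def decrypt(cipher, rpm1, rpm2):
    # Undo each phase mask with its conjugate, in reverse order.
    return np.fft.ifft2(np.fft.fft2(cipher) * np.conj(rpm2)) * np.conj(rpm1)

frame = rng.random((8, 8))                      # one toy channel of a video frame
rpm1 = np.exp(2j * np.pi * rng.random((8, 8)))  # unit-modulus random phase masks
rpm2 = np.exp(2j * np.pi * rng.random((8, 8)))

cipher = encrypt(frame, rpm1, rpm2)
recovered = decrypt(cipher, rpm1, rpm2).real
print("max reconstruction error:", np.abs(recovered - frame).max())
```

Because both masks have unit modulus, multiplying by their conjugates cancels them exactly and the frame is recovered up to floating-point error.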
Keywords: |
Optical Encryption, HEVC, Fractional Fourier Transform (FrFT) |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
EFFECTIVE DATA ALIGNMENT AND KEY EXTRACTION TECHNIQUES FOR GENERATION OF REGULAR
EXPRESSIONS USING MACHINE LEARNING TECHNIQUES |
Author: |
DINESH D. PURI, Dr. G. K. PATNAIK |
Abstract: |
Supervised machine learning classification plays a significant role in
large-scale text classification. Health care data have driven privacy and
security concerns for years, and such electronic records can occupy extensive
storage, so they need to be optimized with processing techniques. To generate
regular expressions, the data must be in structured form; the conversion of
unstructured data into structured form is done through modified data alignment
and key extraction techniques. The Smith-Waterman method compares two
sequences to find comparable review statements: it identifies the best local
alignment of a pair of sequences under the supplied scoring system. A Natural
Language Processing (NLP) step is used to extract the keys from such large
text. In this paper, we propose sequence alignment generation using the
Smith-Waterman (SW) algorithm combined with key extraction from the generated
sequences using NLP for effective regular expression building. Filtration
techniques eliminate redundant features, and a Machine Learning (ML) algorithm
is used as post-processing for classification. The regular expressions
generated by the SW algorithm give better classification accuracy when
combined with NLP and machine learning; generated regular expressions are
filtered using 100% precision as the threshold. We applied various machine
learning algorithms, of which SVM gives the highest accuracy. |
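The Smith-Waterman local alignment step can be sketched as the classic dynamic-programming recurrence; the two review statements below are hypothetical and the scoring weights are illustrative defaults:

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Best local-alignment score between sequences a and b (score matrix only)."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Local alignment never goes below zero: restart instead.
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

# Two hypothetical review statements compared word by word.
s1 = "the battery life is very good".split()
s2 = "battery life is good overall".split()
print("local alignment score:", smith_waterman(s1, s2))
```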
Keywords: |
Sequence Generation, Sentence Alignment, Key Extraction, Regular Expression,
Classification, Smith Waterman Algorithm, NLP |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
HIGH SPEED DECODING BY COLLABORATION BETWEEN THE HARTMANN RUDOLPH AND
INFORMATION SET DECODING ALGORITHMS |
Author: |
HAMZA FAHAM, MY SEDDIQ EL KASMI ALAOUI, SAÏD NOUH, MOHAMED AZZOUAZI, ISSAM
ABDERRAHMANE JOUNDAN |
Abstract: |
Decoders are implemented to retrieve data after transmission over a noisy
communication channel. Soft-decision decoders are highly efficient in
concatenation schemes that exploit more than one decoding level. In our case,
we concatenate the symbol-by-symbol Hartmann Rudolph (HR) decoding algorithm
with the Information Set Decoding (ISD) technique, which is a word-by-word
decoder. In this work, we suggest concatenating a partially exploited HR (PHR)
with the ISD technique in order to decode linear block codes: we first use the
HR decoder with a reduced number of dual codewords, then the ISD, which takes
the output of the PHR. The suggested serial concatenation guarantees very high
performance with fewer dual codewords. For instance, for the QR(31, 16, 7)
code, the satisfactory results obtained are based on only 2.74% of the dual
codewords; for the same code, we have reduced the runtime by 95% compared with
using HR alone. This demonstrates the power and the speed of the suggested
concatenation. |
Keywords: |
Information Theory, Error Correcting Codes, Hartmann Rudolph (HR), Information
Set Decoding, PHR |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
CLASSIFICATION MODELS COMPARISON FOR PARKINSONS DISEASE DETECTION THROUGH SPEECH
FEATURES |
Author: |
CHRISTOPHER PUTRA SETIAWAN, EDWARD PRATAMA PUTRA, GEDE PUTRA KUSUMA |
Abstract: |
Prior studies have used speech features to classify Parkinson's disease. This
research compares thirteen classification models in order to achieve better
performance on a newly published dataset. The method offered in this paper
involves hyperparameter tuning, feature selection using a genetic algorithm,
and model stacking with an ANN as the final classifier. The final stacked
model achieved an accuracy of 90.13% on the test dataset, with an F1-score of
93.7%. The results indicate that the accuracy of Parkinson's disease
classification through speech features can be increased by utilizing
hyperparameter tuning, feature selection, and ensemble stacking. This
improvement also indicates that feature selection in tandem with ensemble
stacking can yield better results than prior state-of-the-art models for
Parkinson's disease detection. |
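A minimal sketch of stacking with an ANN meta-learner, assuming scikit-learn and a synthetic dataset in place of the speech-feature data (the paper's thirteen base models and genetic feature selection are not reproduced):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic stand-in for extracted speech features.
X, y = make_classification(n_samples=300, n_features=20, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
                ("svc", SVC(probability=True, random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    # A small neural network combines the base models' predictions.
    final_estimator=MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                  random_state=0),
)
stack.fit(X_tr, y_tr)
acc = stack.score(X_te, y_te)
print(f"stacked test accuracy = {acc:.3f}")
```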
Keywords: |
Parkinsons Disease Detection, Machine Learning, Ensemble Stacking,
Hyperparameter Tuning, Feature Selection |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
A PERFORMANCE OPTIMIZED REVERSIBLE CELLULAR AUTOMATA BASED SECURITY ALGORITHM
FOR SECURE DATA COMMUNICATION AND STORAGE IN HEALTHCARE |
Author: |
SURENDRA KUMAR NANDA, SUNEETA MOHANTY, PRASANT KUMAR PATTNAIK |
Abstract: |
Information security and protection of privacy are critical tasks in any field
of work, and more so for medical data, where life-critical healthcare hazards
are involved. At the same time, advances in technology give users high
processing capabilities, which can be used to protect this information more
effectively and efficiently. The privacy of information must be ensured not
only for data in storage but also during electronic communication from one
place to another. Reversible cellular automata and cryptographic hash
functions can be used to secure data at rest and in transit with optimal
performance. We use a 128-bit encryption algorithm based on reversible
cellular automata to protect information during transit and a 128-bit
cryptographic hash function to protect data at rest. The implementation
results show that these algorithms are immune to cryptanalysis and extremely
difficult to break using brute-force attack, and the NIST statistical tests
show good characteristics for the algorithm. The hardware implementation of
these algorithms is simple, as it uses only basic logic gates, and is
therefore cost-effective. |
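The reversibility this scheme relies on can be illustrated with a generic second-order cellular automaton (not the authors' specific 128-bit algorithm): XOR-ing each new row with the row before last makes any local rule invertible, so the same update run backwards recovers the plaintext.

```python
import numpy as np

def step(state):
    """Rule-90-style local rule: each cell becomes the XOR of its two neighbours."""
    return np.roll(state, 1) ^ np.roll(state, -1)

def evolve(prev, curr, n_steps):
    """Second-order CA: next = step(curr) XOR prev, which makes every step reversible."""
    for _ in range(n_steps):
        prev, curr = curr, step(curr) ^ prev
    return prev, curr

rng = np.random.default_rng(7)
plaintext = rng.integers(0, 2, 128, dtype=np.uint8)  # 128-bit block
key_state = rng.integers(0, 2, 128, dtype=np.uint8)  # key material as the companion row

a, b = evolve(key_state, plaintext, 32)  # "ciphertext" is the pair (a, b)
p, k = evolve(b, a, 32)                  # decryption: same rule, rows swapped
print("plaintext recovered:", np.array_equal(p, plaintext))
```

Because `next = step(curr) ^ prev` implies `prev = step(curr) ^ next`, swapping the two rows and iterating the same rule retraces the trajectory exactly.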
Keywords: |
Reversible Cellular Automata, Cryptographic Hash Function, Clinical Data
Security, Healthcare |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
A NOVEL APPROACH FOR CONTENT EQUIVALENCE ANALYSIS IN COMPRESSED DOCUMENT IMAGES:
A SYSTEMATIC STUDY |
Author: |
KAVITA V. HORADI, DR. JAGADEESH PUJARI, NARASIMHA PRASAD BHAT |
Abstract: |
The rapid growth of digital data with complex content has led to various
processing challenges. The exponential increase in the size of 'Big Data' due
to videos, audio, images, and textual content has created several problems
that the research community needs to address. Currently, huge amounts of
digital data are generated by various sources; high-quality data require more
space and consume excessive bandwidth during transmission. To overcome these
issues, digital data are stored in compressed form using the different
compression algorithms described in the literature. To analyze these data,
traditional schemes use decompression, which is time-consuming and increases
the computational overhead. Compressed-domain image processing techniques, in
which complete decompression may not be required, have therefore been adopted.
In this work, we adopt compressed-domain processing of document images
containing printed text. Our main aim is to identify the similarity and
establish the equivalence between two or more compressed document images. To
achieve this, we first apply JPEG encoding, which generates encoded data; this
data is further processed through the proposed line, word, and character
segmentation scheme. We then apply SIFT (Scale-Invariant Feature Transform) to
extract features from the compressed-domain segmented data. Finally, a
feature-matching scheme using a brute-force feature matcher and k-nearest
neighbours is applied. We have tested this approach on the publicly available
PubLayNet, IIIT-AR-13K, and Tobacco-3482 datasets, which contain large-scale
document images. The experimental analysis shows the robustness of the
proposed approach in identifying similarity between compressed document
images. |
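The brute-force k-nearest-neighbour matching step can be sketched in NumPy on random descriptors; the 128-D vectors and the ratio-test threshold below are assumptions for illustration (Lowe's standard test), not necessarily the paper's configuration:

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.75):
    """Brute-force matcher with a ratio test over the k=2 nearest neighbours."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        j1, j2 = np.argsort(dists)[:2]
        if dists[j1] < ratio * dists[j2]:  # best match clearly beats the runner-up
            matches.append((i, j1))
    return matches

rng = np.random.default_rng(3)
# Hypothetical 128-D SIFT-like descriptors for two document images: the second
# set repeats the first with small noise, plus unrelated extra descriptors.
doc_a = rng.random((10, 128))
doc_b = np.vstack([doc_a + rng.normal(0, 0.01, doc_a.shape),
                   rng.random((5, 128))])

matches = match_descriptors(doc_a, doc_b)
print(f"{len(matches)} of {len(doc_a)} descriptors matched")
```

A high fraction of surviving matches between two documents would indicate content equivalence.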
Keywords: |
JPEG; Compressed Document Images (CDI); Document processing; Content
Equivalence; Compressed Domain; SIFT; Brute force; KNN. |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
TELUGU TEXT SUMMARIZATION USING HISTO FUZZY C-MEANS AND MEDIAN SUPPORT BASED
GRASSHOPPER OPTIMIZATION ALGORITHM (MSGOA) |
Author: |
CHINNI BALA VIJAYA DURGA, DR. G.RAMA MOHAN BABU |
Abstract: |
This work addresses text summarization: reading a large document is
time-consuming and complex, while a summary of the same document is easy to
understand and saves time. Here we propose text summarization for the Telugu
language. To begin, documents go through a preprocessing procedure that
includes tokenization, stop-word removal, stemming, and N-gram analysis.
Histo-Fuzzy C-Means Clustering is then used to cluster sentences, together
with a technique of sentence ranking based on weights. Finally, the Median
Support Based Grasshopper Optimization Algorithm (MSGOA) is utilized to
combine the phrases into a clear and succinct summary. The performance of this
strategy is evaluated using an online research dataset. The suggested method
outperforms earlier text summarization methods, performing admirably against
existing approaches with an accuracy of 84%. |
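The preprocessing steps (tokenization, stop-word removal, N-grams) can be sketched as below; an English placeholder sentence and stop-word list are used, and stemming is omitted because Telugu stemming requires a language-specific stemmer not sketched here:

```python
def preprocess(text, stop_words, n=2):
    """Tokenise, drop stop words, and emit word n-grams."""
    tokens = [t for t in text.lower().split() if t not in stop_words]
    ngrams = [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
    return tokens, ngrams

stop = {"the", "is", "a", "of"}
tokens, bigrams = preprocess("the summary of a long document is useful", stop)
print(tokens)
print(bigrams)
```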
Keywords: |
Telugu language Text Summarization, preprocessing, Median Support Based
Grasshopper Optimization (MSGO), Histo-Fuzzy C-Means Clustering, Stemming,
Enthalpy |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
DESIGN OF A CONCEPTUAL FRAMEWORK FOR A GAMIFIED SOCIAL MEDIA APPLICATION |
Author: |
OHWO ONOME BLAISE, SEUN EBIESUWA, ADESINA ADIO, ADEGBENJO ADERONKE, AMUSA
AFOLARIN IBRAHIM |
Abstract: |
The goal of gamification is the deployment of game mechanics in real-world,
non-gaming contexts to drive motivation and performance in a given activity,
and prior research mostly supports this hypothesis. To date, gamification has
continued to find application across industries and sectors; social networks
are among the areas where it has found use. With the commercialization of
mobile devices and user-generated content, mobile social networking has
emerged and changed the way users relate to applications. In the search for
better ways to improve user experience in mobile social networking,
gamification (the use of game-design elements in non-game settings) can play a
vital role, especially since many different game mechanics can yield diverse
application services. Various game mechanics were reviewed to understand their
working principles and where they can be applied. Based on this review, a
conceptual framework for a gamified social media application is presented. The
game-mechanics configuration was deliberately varied with regard to its impact
on the completion of basic functionalities as well as its potential to improve
user engagement. |
Keywords: |
Gamification, Game Mechanics, Conceptual Framework, Social Media, Social
Networking, User Engagement, Motivation |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
IWVTSA: IMPROVED WORDS VECTORS FOR TWITTER SENTIMENTS ANALYSIS |
Author: |
AMINA SAMIH, ABDERRAHIM GHADI, ABDELHADI FENNAN |
Abstract: |
Twitter sentiment analysis provides methods for polling public opinion on
events or products. The majority of current research aims to obtain sentiment
features by analyzing lexical and syntactic features, which word embeddings
express explicitly. This paper proposes a new approach called Improved Word
Vectors for Twitter Sentiment Analysis (IWVTSA) to improve the F1 score of
Twitter sentiment analysis. We introduce a word embeddings method called
tweet2vec, which was launched concurrently with doc2vec, to form a sentiment
feature set of tweets; these word embeddings are combined with word sentiment
polarity score features. For training and predicting sentiment classification
labels, the feature set is fed into four machine learning classifiers
(XGBoost, SVM, Logistic Regression, Random Forest). We compare the quality of
our model with that of the baseline bag-of-words model, and the results show
that the combination of tweet2vec and a polarized lexicon performs better on
the F1-measure for Twitter sentiment classification. |
Keywords: |
Sentiment Analysis, Tweet2vec, Doc2vec, Polarized Lexicon, Machine learning. |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
DEEP LEARNING BASED SOFTWARE RELIABILITY PREDICTION MODEL FOR COMPONENT BASED
SOFTWARE |
Author: |
SHIVANI YADAV, BALKISHAN |
Abstract: |
Component Based Software Engineering (CBSE) is a software development approach
used for rapid development. It involves reusing existing, mature components,
possibly developed for other software or products, alongside new components to
speed up the software development process. Component-based development is
being increasingly adopted to reduce costs, manage the growing complexity of
software products, and shorten time to market. A major challenge posed by the
adoption of Component Based Software (CBS) is ensuring the reliability of the
resulting products: the reliability of CBS depends on the reliability of the
individual components as well as on the software architecture. In this paper,
a deep learning-based model is designed to predict the reliability of
components. The results show that the designed model predicts reliability with
high accuracy, and comparison with other machine learning models justifies the
use of deep learning for reliability prediction. |
Keywords: |
CBS, Component, Deep Learning, Prediction, Reliability |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
DEEP CONVOLUTIONAL NEURAL NETWORK IMPLEMENTATIONS FOR EFFICIENT BRAIN STROKE
DETECTION USING MRI SCANS |
Author: |
KASHI SAI PRASAD, DR. S PASUPATHY |
Abstract: |
Advancements in Artificial Intelligence (AI) have paved the way for solving
problems in different domains. In the healthcare industry, brain stroke
incidence is rising at an alarming rate across the globe; according to the
WHO, it is one of the leading causes of disability and death, and it needs
sustained research effort. Bringing AI into Clinical Decision Support Systems
(CDSS) is believed to have the potential to enable unprecedented strides in
healthcare units by improving Quality of Service (QoS). Existing
MRI-scan-based research with deep learning has limitations because it uses
pre-defined CNN models: models such as VGG16, ResNet50, and DenseNet121 have
built-in functionality, but for a given problem their performance may not be
adequate, and they must be optimized to meet the requirements of the problem
at hand. To overcome this, we propose a framework and underlying mechanisms to
optimize and exploit the aforementioned deep CNN models for efficient brain
stroke detection. Besides improving the models, an algorithm named Optimized
Deep Learning for Brain Stroke Detection (ODL-BSD) is proposed. MRI scans are
used to evaluate the proposed framework, and the empirical results reveal that
the proposed methodology significantly improves the performance of the deep
CNN models. |
Keywords: |
Brain Stroke Detection, Deep Learning, Convolutional Neural Network,
Densenet121, Vgg16, Resnet50 |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
GENERIC FRAMEWORK FOR EARLY PREDICTION OF HEART DISEASE USING LEARNING METHOD |
Author: |
VENKATESWARARAOCHEEKATI, DR. D. NATARAJASIVAN, DR. S. INDRANEEL |
Abstract: |
The incidence of heart disease is growing at a startling rate, making
identification at an earlier stage all the more important. Identifying heart
illness is a challenging endeavour that calls for accuracy and dexterity on
the part of the diagnostician. The purpose of this study is to determine,
based on a variety of medical criteria, which patients are more likely to
develop heart disease. Models for determining the likelihood of a person being
diagnosed with a cardiac disease have been developed and can be used to
evaluate an individual's risk. The classifiers evaluated include SGD
Classifier, Extra Trees Classifier, Calibrated ClassifierCV, Gaussian Mixture,
Nearest Centroid, MultinomialNB, Logistic RegressionCV, Linear SVC, Linear
Discriminant Analysis, Quadratic Discriminant Analysis, GaussianNB, Random
Forest Classifier, ComplementNB, MLP Classifier, LGBM Classifier, and AdaBoost
Classifier. A systematic method was used to determine how the model might be
applied to improve the accuracy with which heart disease can be predicted in
any given individual. Using deep learning and Random Forest classifiers, which
showed a high degree of accuracy compared with previous models, the suggested
model was able to effectively predict a person's likelihood of acquiring heart
disease. The proposed algorithm for predicting cardiac disease would not only
make medical care more effective but would also reduce expenses, and this
knowledge improves our ability to anticipate who might develop heart disease
in the future. The model was constructed in Python, and the data was taken
from the Kaggle repository. |
Keywords: |
Disease of the Heart, Artificial Intelligence, Deep Learning, Prediction |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
TOWARDS A NEW CLASSIFICATION METHOD OF TEAM PERFORMANCE USING CLUSTERING AND
FEATURES REDUCTION TECHNIQUES |
Author: |
ZBAKH MOURAD, AKNIN NOURA, CHRAYAH MOHAMED, ELKADIRI KAMAL EDDINE |
Abstract: |
The human factor is increasingly decisive in making a company more efficient
and more competitive, and improving team performance is a challenge for Human
Resources departments, which are themselves undergoing a profound digital
transformation of data and its management [1]. Traditional Human Resources
tools are no longer effective at managing skills. On the one hand, objectively
evaluating the performance of these resources is becoming one of a manager's
most complex tasks, especially given the mass of data now produced by
non-traditional sources; on the other hand, the departure of key skills is a
phenomenon that current HRIS tools cannot predict, and its financial cost is
high, as is the loss of technical knowledge and know-how and of the
flexibility it provides. In this article, we propose a new approach to
classifying teams according to several performance indicators. The method is
based on the K-means algorithm to classify the members of a team, assessed
against performance indicators linked to soft and technical skills. The result
of this work is a decision-support model that helps managers build a team
adapted to the overall mission, adapt their management style to each cluster,
and prepare future hires to compensate for the skill gaps of the team in
place. |
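The K-means classification of team members can be sketched with scikit-learn on hypothetical indicator scores; the indicator semantics and the 0-10 ratings below are invented for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical per-member scores on 4 indicators (e.g. communication,
# delivery, code quality, autonomy), rated 0-10: one strong group and one
# weaker group of ten members each.
strong = rng.normal(8, 0.5, size=(10, 4))
weaker = rng.normal(4, 0.5, size=(10, 4))
members = np.vstack([strong, weaker])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(members)
labels = km.labels_
print("cluster sizes:", np.bincount(labels))
print("cluster centres:\n", km.cluster_centers_.round(1))
```

Each cluster centre summarises a performance profile, which is what a manager would adapt their management style to.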
Keywords: |
Human Resources Management, Key Performance Indicators, K-Means, Cluster
Algorithm |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
JAVA SOURCE CODE SIMILARITY DETECTION USING SIAMESE NETWORKS |
Author: |
DIVYA KUMARI TANKALA, Dr. T. VENUGOPAL, VIKAS B |
Abstract: |
Software plagiarism checkers are important during coding competitions for
reviewing, evaluating, and ranking participants. For a given problem
statement, if the number of submissions is relatively small, inspecting each
submission to determine whether it is similar to an existing code segment is
easy; with huge numbers of submissions, however, it is very difficult to
detect code clones in a submitted snippet. Therefore, plagiarism checkers are
needed to detect similar clones. Existing studies use various approaches to
detect code clones; clone detection using an abstract syntax tree (AST) is one
of the popular ones. Our proposed AST-based approach is evaluated on the
BigCloneBench dataset of Java code fragments and implemented using recursive
neural networks. Siamese networks were used to detect similarities between two
code fragments, demonstrating the influence of contrastive learning on
source-code clone detection with high accuracy. This paper shows an
improvement in precision, recall, and F1-score of at least 5% compared with
existing approaches. |
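The contrastive-learning objective mentioned above can be sketched on embedding vectors; the shared (siamese) recursive encoder that would produce these embeddings from ASTs is omitted, and the vectors and margin below are invented for illustration:

```python
import numpy as np

def contrastive_loss(e1, e2, same, margin=1.0):
    """Contrastive loss on a pair of code-fragment embeddings:
    pull similar pairs together, push dissimilar pairs beyond the margin."""
    d = np.linalg.norm(e1 - e2)
    if same:
        return 0.5 * d ** 2
    return 0.5 * max(0.0, margin - d) ** 2

# Hypothetical embeddings produced by a shared siamese encoder.
clone_a = np.array([0.9, 0.1, 0.4])
clone_b = np.array([0.88, 0.12, 0.41])  # near-duplicate fragment
other = np.array([-0.5, 0.7, -0.2])     # unrelated fragment

print("clone pair loss:    ", contrastive_loss(clone_a, clone_b, same=True))
print("non-clone pair loss:", contrastive_loss(clone_a, other, same=False))
```

A non-clone pair already separated by more than the margin contributes zero loss, which is what keeps unrelated fragments from being pushed apart indefinitely.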
Keywords: |
Code similarity, Plagiarism detection, Siamese networks, Recursive Neural
Networks, LSTM, Contrastive learning |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
OPTIMAL OMNIDIRECTIONAL OFFLOADING IN FEDERATED CLOUD-FOG SYSTEMS WITH COST
MINIMIZATION |
Author: |
SAUMYARANJAN DASH, RAJ KUMAR PARIDA, JYOTIPRAKASH DASH, ASIF UDDIN KHAN, SANTOSH
KUMAR SWAIN |
Abstract: |
Cloud computing technology has had a significant impact on the field of
technology. Fog computing provides similar facilities with a smaller coverage
area but closer to the user. In a federated environment, these technologies
can complement each other by offloading requests to improve service. In this
paper, we propose a federated two-tier architecture consisting of cloud and
fog, where the cloud has a higher cost and the fog a lower cost. In this
architecture, omnidirectional offloading is considered, in which offloading
can be done both vertically and horizontally: Fog to Cloud, Cloud to Fog, and
Fog to Fog. We formulate a capacity optimization problem in this federated
architecture to choose capacities that minimize the total cost, and we use a
modified simulated annealing algorithm to solve it. The results show that the
performance of our algorithm is nearly optimal and better than standard SA in
execution time, and that our proposed architecture can save more cost than
other existing architectures. |
Keywords: |
Cloud, Fog, Federation, Offloading, Cost, Capacity |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
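The simulated-annealing idea behind the cost minimization can be sketched on a toy instance. The node costs, capacities, cooling schedule, and penalty handling below are assumptions for illustration, not the paper's actual formulation:

```python
import math
import random

random.seed(7)

# Hypothetical two-tier setting: unit cost per served request (cloud costlier than fog).
COSTS  = [5.0, 1.0, 1.2, 0.9]   # [cloud, fog1, fog2, fog3]
CAPS   = [100, 30, 25, 20]      # capacity limit of each tier/node
DEMAND = 60                     # total requests to place

def total_cost(alloc):
    """Cost of an allocation; infeasible allocations get infinite cost."""
    if any(a < 0 or a > c for a, c in zip(alloc, CAPS)) or sum(alloc) != DEMAND:
        return float("inf")
    return sum(a * c for a, c in zip(alloc, COSTS))

def neighbour(alloc):
    """Move one unit of load between two randomly chosen nodes."""
    a = list(alloc)
    i, j = random.sample(range(len(a)), 2)
    a[i] += 1
    a[j] -= 1
    return a

def anneal(start, t0=10.0, cooling=0.995, steps=5000):
    """Standard simulated annealing loop with geometric cooling."""
    cur, cur_cost = start, total_cost(start)
    best, best_cost = cur, cur_cost
    t = t0
    for _ in range(steps):
        cand = neighbour(cur)
        cand_cost = total_cost(cand)
        if cand_cost < cur_cost or random.random() < math.exp((cur_cost - cand_cost) / t):
            cur, cur_cost = cand, cand_cost
            if cur_cost < best_cost:
                best, best_cost = cur, cur_cost
        t *= cooling
    return best, best_cost

best, cost = anneal([60, 0, 0, 0])   # start with everything on the costly cloud
```

Starting from an all-cloud allocation (cost 300), the search offloads load onto the cheaper fog nodes while respecting their capacities.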
Title: |
A MACHINE LEARNING BASED FRAMEWORK FOR BALANCING USER PRIVACY AND UTILITY WITH
MULTIPLE SENSITIVE ATTRIBUTES IN HEALTH CARE DATA PUBLISHING |
Author: |
JAYAPRADHA. J, PRAKASH. M |
Abstract: |
Due to the growth of health care information systems, patient records are needed
for analyzing certain diseases, which raises the need for privacy and poses
several challenges in the current health care system. Privacy-preserving data
publishing is essential to protect patient records from numerous attacks: its
main aim is to safeguard individuals' personal information even while it is made
available for various analysis purposes. The paper first analyzes the different
services of an electronic health record and the need for privacy, and then
proposes a novel anonymization technique for electronic health records. The
proposed approach overcomes the following drawbacks: i) generalization of all
attributes, which significantly reduces information; ii) identity disclosure
even against adversaries with intense background knowledge; and iii) the
trade-off between privacy and utility. Compared to existing systems, the
proposed approach achieves better results by using a hierarchical id-based
generalization approach. Hence, it helps significantly in protecting
individuals' information even when an intruder has intense background knowledge,
and it achieves balanced utility in the anonymized data by avoiding
over-generalization. The approach consists of four phases: i) vertical
partitioning; ii) quasi-identifier bucket (QB) anonymization; iii) sensitive
attribute bucket (SAB) anonymization; and iv) evaluation of classification
accuracy. Experimental results show that the proposed approach achieves improved
data privacy and utility in privacy-preserving data publishing. Although the
results are not evaluated with dedicated utility metrics, they clearly
illustrate that the proposed algorithm achieves better accuracy than the
standard utility-aware data anonymization algorithm on health care records. The
proposed approach thwarts identity disclosure and attribute disclosure for
static data. |
Keywords: |
Anonymization, Health Care, Generalization, Privacy, Utility, Classification
Model. |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
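The quasi-identifier bucketing phase can be illustrated with a small sketch of hierarchical generalization. The age/zip hierarchies, attribute names, and k value are hypothetical; this shows only generic generalization into quasi-identifier buckets, not the authors' full four-phase algorithm:

```python
def generalize_age(age, level):
    """Level 0: exact age, level 1: 10-year band, level 2: suppressed."""
    if level == 0:
        return str(age)
    if level == 1:
        lo = (age // 10) * 10
        return f"{lo}-{lo + 9}"
    return "*"

def generalize_zip(zipcode, level):
    """Replace trailing digits with '*' as the generalization level rises."""
    keep = max(0, len(zipcode) - level)
    return zipcode[:keep] + "*" * (len(zipcode) - keep)

def qb_anonymize(records, age_level, zip_level, k):
    """Group records into quasi-identifier buckets; fail if any bucket < k."""
    buckets = {}
    for age, zipcode, diagnosis in records:
        key = (generalize_age(age, age_level), generalize_zip(zipcode, zip_level))
        buckets.setdefault(key, []).append(diagnosis)
    if any(len(v) < k for v in buckets.values()):
        raise ValueError("k-anonymity not satisfied; generalize further")
    return buckets

records = [(34, "53711", "flu"), (38, "53715", "asthma"),
           (31, "53712", "flu"), (47, "53890", "diabetes"),
           (42, "53891", "flu"), (45, "53899", "asthma")]
buckets = qb_anonymize(records, age_level=1, zip_level=3, k=3)
```

Raising the generalization level trades utility for privacy: coarser buckets hold more records, so each individual hides among at least k others.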
Title: |
MINIMIZING THE SPREAD OF FAKE NEWS AND THE FORMATION OF FAKE USER GROUPS |
Author: |
AKASH D. WAGHMARE, Dr. GIRISH. K. PATNAIK |
Abstract: |
Due to the widespread use of social media for international news, confirming and
identifying news has become a difficult undertaking. Most social networking
sites make it simple to obtain news from anywhere on the internet at any time,
but they also spread a lot of fake news and information. As a result, it is
vital to establish whether given information is authentic, that is, whether it
is fake or real; otherwise people become confused and lose faith in social
media. To address these issues, a blockchain-based fake news detection approach
using machine learning is proposed. The proposed classification algorithm was
applied in training and validation to detect fake news. Another significant goal
of this work is to revoke offenders who alter news that has already been
published, or publish fake news, within the blockchain framework. Even in a
vulnerable context, the blockchain-based decentralized peer-to-peer ecosystem
was leveraged to protect the published data. A variety of feature extraction and
selection strategies were applied to produce effective training rules and verify
the test classifier. An exhaustive experimental investigation on the LIAR
dataset reveals the prediction performance of the hybrid classification
algorithm, which obtained 94.6% accuracy for fake news detection as well as for
identifying fake users. The major outcome of this research is a reduction in the
frequency of fake news spreading, achieved using a blockchain majority algorithm
with a trust-building strategy for end users. |
Keywords: |
Fake news detection, Supervised machine learning, Blockchain, Peer to peer,
Majority voting, Consensus |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
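The majority-voting and trust-building idea can be sketched independently of any blockchain stack. The vote labels, quorum, and trust update step below are assumptions for illustration, not the paper's consensus protocol:

```python
from collections import Counter

def majority_consensus(votes, quorum=0.5):
    """Peers vote 'real'/'fake' on a news item; a label wins only with a
    strict majority of all votes, otherwise the item stays undecided."""
    if not votes:
        return "undecided"
    label, count = Counter(votes).most_common(1)[0]
    return label if count / len(votes) > quorum else "undecided"

def update_trust(trust, peer, agreed, step=0.1):
    """Reward peers that voted with the consensus, penalize the rest."""
    trust[peer] = min(1.0, max(0.0, trust[peer] + (step if agreed else -step)))

votes = {"peer1": "fake", "peer2": "fake", "peer3": "real",
         "peer4": "fake", "peer5": "fake"}
verdict = majority_consensus(list(votes.values()))
trust = {p: 0.5 for p in votes}          # all peers start with neutral trust
for p, v in votes.items():
    update_trust(trust, p, v == verdict)
```

Over many rounds, peers that repeatedly disagree with the consensus lose trust, which is the mechanism by which spreading is discouraged.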
Title: |
MARKETING COMMUNICATION STRATEGY OF GOLD MINING COMPANY “PT ANTAM LOGAM MULIA”
THROUGH DIGITAL PLATFORM |
Author: |
OSYE MEIRIANA SORAYA, ULANI YUNUS, MAHAPUTRA DIMAS ADIPRADANA |
Abstract: |
Technological developments in the digital era have made social media boom in the
21st century: more than 3 billion people in the world use digital platforms for
social media applications, and that number is estimated to keep increasing every
year. The presence of social media has dramatically changed communication
activities in all walks of life. It is not only used for chatting and
communication between two individuals; social media can also serve as an
excellent marketing tool, so it is no wonder companies use it as a marketing
communication strategy. This study aims to determine the implementation of
marketing communication strategies through digital platforms by PT Antam Logam
Mulia, one of the largest gold mining companies in Indonesia. The study uses a
qualitative descriptive methodology: the researchers conducted in-depth
interviews with several experts in marketing communication strategy at PT Antam
Logam Mulia and with technical implementers, as resource persons competent to
answer the problems studied. The results reveal the marketing communication
strategy model through digital platforms (social media) implemented by PT Antam
Logam Mulia: the main direction of the strategy is to increase sales and focus
on the goals and target market by using organic marketing and ads marketing.
This study has implications for promoting digital aspects not only in the
communication mix but in the overall marketing communication process carried out
by PT Antam Logam Mulia, which uses digital platforms (social media) for
marketing activities such as disseminating information, influencing or
persuading, and reminding the target market of the company and its products, so
that they are willing to accept, buy, and be loyal to the products offered. |
Keywords: |
Marketing Communication, Digital Promotion, Social Media, Precious Metals Gold,
Social Networking |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
IMPROVING LIGHTWEIGHT AUTHENTICATION USING NEW TECHNIQUES FOR IOT |
Author: |
WALEED KAREEM AHMED, RANA SAAD MOHAMMED |
Abstract: |
The Internet of Things (IoT) is a new paradigm that uses the Internet to connect
a wide range of physical objects to the cyber world. The IoT is rapidly growing
and will soon have a big influence on our daily lives. While the growing number
of linked IoT gadgets makes our lives easier, it also puts our personal
information at risk. For IoT devices, radio frequency identification (RFID) aids
in the automated identification of connected devices; however, privacy and
security are the key issues for RFID tag-connected technologies. To increase the
security of RFID solutions for the variety of RFID applications that require a
centralized database, blockchain technologies are rapidly establishing
themselves as a new decentralized and distributed alternative to a standard
central database, offering improved data security, dependability, transparency,
immutability, and lower maintenance costs. RFID is expected to play a major role
in enabling identification technologies in the Internet of Things due to its
inherent benefits, and because of its connection with sensor technology it may
be used in a broad range of sectors. On the other hand, one of the most
challenging parts of developing an RFID system is security: authentication and
privacy concerns are at the heart of RFID security. Elliptic curve cryptosystem
(ECC) related algorithms are commonly regarded as the best option among
public-key cryptography (PKC) approaches due to their small key sizes and
efficient computations. Recently, W.K. Ahmed et al. proposed a new lightweight
blockchain- and ECC-based RFID authentication protocol for IoT. We found that
the W.K. Ahmed et al. protocol suffers from high computation cost and long
running time. To solve these problems, this paper introduces an improved
lightweight authentication scheme using new techniques for IoT. Our protocol
combines blockchain, ECC, the Arnold chaotic map, and Markov chains, and was
implemented in the Python language. After comparing storage cost, communication
cost, and computation cost with other protocols, our protocol proves more secure
and performance-efficient than existing RFID protocols and is well suited for
practical applications. |
Keywords: |
ECC, Arnold Map Chaotic, Blockchain Technique, Authentication, Markov Chain |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
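The Arnold chaotic map used in the protocol has a standard closed form on an n x n grid; a minimal sketch, including its inverse (the grid size and round count below are illustrative):

```python
def arnold_map(x, y, n):
    """One Arnold cat map iteration: (x, y) -> (x + y, x + 2y) mod n."""
    return (x + y) % n, (x + 2 * y) % n

def arnold_inverse(x, y, n):
    """Inverse map: (x, y) -> (2x - y, -x + y) mod n."""
    return (2 * x - y) % n, (-x + y) % n

def scramble(grid, rounds):
    """Permute the positions of a square grid with repeated Arnold iterations."""
    n = len(grid)
    out = grid
    for _ in range(rounds):
        nxt = [[0] * n for _ in range(n)]
        for y in range(n):
            for x in range(n):
                nx, ny = arnold_map(x, y, n)
                nxt[ny][nx] = out[y][x]
        out = nxt
    return out

grid = [[4 * y + x for x in range(4)] for y in range(4)]
scrambled = scramble(grid, 1)
```

Because the map's matrix has determinant 1, it is a bijection on the grid: scrambling loses no data and is exactly reversible, which is what makes it usable inside an authentication or encryption scheme.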
Title: |
HARDWARE IMPLEMENTATION METHOD OF SECRET DATA SECURITY ON FPGA BASED ON ZIG-ZAG
MAP ENCRYPTION AND STEGANO ALGORITHMS |
Author: |
BAYU KUMORO YAKTI, SUNNY ARIEF SUDIRO, SARIFUDDIN MADENDA, AND SURYADI HARMANTO |
Abstract: |
The internet has grown so rapidly that most individuals prefer it as the main
medium for transferring data. Data protection is very important, especially when
sending secret data from one place to another. This paper proposes a zig-zag
mapping transposition encryption-decryption algorithm and a Least Significant
Bit (LSB) steganography algorithm for improving data security, along with
hardware implementation methods for these algorithms on a Field Programmable
Gate Array (FPGA) for data protection in real-time communication. The resulting
FPGA Intellectual Property (IP) core employs minimal LUT resources: the
encryption and steganography algorithms occupy 107 LUTs and take 1.821 ns for
each 64 bits of data processed, whereas the steganalysis and decryption
algorithms need 108 LUTs and 2.172 ns of processing time. |
Keywords: |
Encryption-decryption, LSB steganography, LUTs FPGA resources, Zig-zag Mapping
Transposition |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
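A zig-zag transposition of the kind the title describes can be sketched in software before being mapped to hardware. The anti-diagonal scan order below is one common convention; the paper's exact mapping may differ:

```python
def zigzag_order(rows, cols):
    """Index sequence of an anti-diagonal zig-zag scan over a rows x cols grid."""
    order = []
    for s in range(rows + cols - 1):
        diag = [(r, s - r) for r in range(rows) if 0 <= s - r < cols]
        if s % 2:
            diag.reverse()          # alternate diagonal direction
        order.extend(diag)
    return order

def zigzag_encrypt(data, rows, cols):
    """Write row-wise, read zig-zag: a pure transposition cipher."""
    assert len(data) == rows * cols
    grid = [data[r * cols:(r + 1) * cols] for r in range(rows)]
    return "".join(grid[r][c] for r, c in zigzag_order(rows, cols))

def zigzag_decrypt(cipher, rows, cols):
    """Invert the permutation by writing back in zig-zag order."""
    grid = [[None] * cols for _ in range(rows)]
    for ch, (r, c) in zip(cipher, zigzag_order(rows, cols)):
        grid[r][c] = ch
    return "".join("".join(row) for row in grid)

plain = "SECRETDATA64BITS"          # 16 characters -> one 4x4 block
cipher = zigzag_encrypt(plain, 4, 4)
```

A transposition like this is cheap in hardware because it is a fixed wiring of positions, which matches the small LUT counts the abstract reports.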
Title: |
OPTIMIZING SPEAKER RECOGNITION USING D-VECTOR AND X-VECTOR SPEECH EMBEDDINGS AND
DEEP LEARNING |
Author: |
KEVIN KURNIAWAN, AMALIA ZAHRA |
Abstract: |
In speaker recognition, an individual's identity is established from the
characteristics of their voice, which can be captured by a speech embedding. In
this paper, we propose a method using x-vector and d-vector speech embedding
feature extraction and a ResNet50 model to identify a user's identity. The paper
uses part of the VoxCeleb2 dataset, with 12,207 training utterances, 3,618
validation utterances, and 2,068 testing utterances from 552 speakers. The
ResNet50 model with x-vectors achieved accuracy, recall, precision, and F1-score
of 74.12%, 73.42%, 78.02%, and 0.72 respectively, whereas the ResNet50 model
with d-vectors achieved 36.17%, 35.77%, 36.86%, and 0.32 respectively, and the
current state-of-the-art model achieved 10.97%, 10.61%, 6.64%, and 0.07
respectively. |
Keywords: |
Speaker Recognition, Text Independent Speaker Recognition, ResNet50 Model,
X-Vector |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
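Identification from speech embeddings typically reduces to nearest-neighbour scoring over enrolled speakers; a minimal cosine-scoring sketch with hypothetical low-dimensional vectors (real x-vectors are far higher-dimensional and come from a trained network):

```python
import math

def cosine(a, b):
    """Cosine similarity between two dense embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def identify(test_emb, enrolled):
    """Return the enrolled speaker whose embedding best matches the test utterance."""
    return max(enrolled, key=lambda spk: cosine(test_emb, enrolled[spk]))

# Hypothetical 4-dimensional enrollment embeddings.
enrolled = {
    "speaker_a": [0.9, 0.1, 0.0, 0.2],
    "speaker_b": [0.1, 0.8, 0.3, 0.0],
}
test_emb = [0.85, 0.15, 0.05, 0.25]
speaker = identify(test_emb, enrolled)
```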
Title: |
BAG OF VISUAL WORDS AND CNN APPROACHES FOR CONTENT-BASED IMAGE RETRIEVAL USING
HOG, GCH AND ORB FEATURES |
Author: |
K. BHARATHI, DR M CHANDRA MOHAN |
Abstract: |
The number of components measured in digital photos keeps expanding, and
identifying a specific image by its content in a large database can be
difficult. A content-based image retrieval (CBIR) method is proposed in this
work to extract a feature vector from an image and successfully retrieve
pictures by content. The paper considers three image feature descriptor
extraction methods: Histogram of Oriented Gradients (HOG), Global Colour
Histogram (GCH), and Oriented FAST and Rotated BRIEF (ORB). The image feature
vectors are kept in the picture database and, at retrieval time, matched against
the feature vector of the query image. We present feature selection based on the
HOG, GCH and ORB methods to capture the features of the standard CIFAR10
dataset. The proposed work is evaluated using Bag of Visual Words and CNN
classifiers. The strategy was tested with different label-indexed elastic search
procedures, and all cases showed good accuracy in retrieving the correct
image. |
Keywords: |
Information Retrieval, Query Image, Bag Of Visual Words, Elastic Search Engine,
Feature Descriptors |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
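Of the three descriptors, the Global Colour Histogram is simple enough to sketch directly. The bin count and the histogram-intersection similarity below are common choices, not necessarily the ones used in the paper:

```python
def global_color_histogram(pixels, bins=4):
    """Quantize each RGB channel into `bins` ranges and count pixel frequencies.
    `pixels` is a flat list of (r, g, b) tuples with 0-255 values."""
    hist = [0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = len(pixels)
    return [h / total for h in hist]        # normalize to a distribution

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical colour distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

# Two tiny hypothetical "images": one mostly red, one mostly blue.
reddish = [(200, 30, 30)] * 8 + [(180, 60, 40)] * 2
bluish  = [(20, 40, 210)] * 9 + [(200, 30, 30)]
hr = global_color_histogram(reddish)
hb = global_color_histogram(bluish)
```

For CBIR, the query image's histogram is compared against every stored histogram and the highest-intersection images are returned.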
Title: |
INVESTIGATING THE ADOPTION OF AN INNOVATION USING AN EXTENDED UTAUT MODEL: THE
CASE OF MOBILE LEARNING TECHNOLOGY |
Author: |
KHALED M. S. FAQIH |
Abstract: |
The Covid-19 pandemic is unleashing an unprecedented digital revolution in
educational systems, making mobile-enabled learning an unavoidable alternative,
especially in developing countries where university students have little option
but to use smartphones for learning as desktop and laptop computers become less
common at home. To achieve its objectives, this study extends the UTAUT model
with three different mechanisms to enrich the UTAUT environment for
investigating mobile learning adoption, contributing to both theory and
practice. The UTAUT theory was extended by incorporating three exogenous
variables (perceived compatibility, perceived image, and perceived mobile
anxiety), the endogenous variable of perceived innovativeness, and service
quality as a moderator. The empirical data was collected using a survey
questionnaire administered to higher education students in Jordan, and the
proposed research model was tested with WarpPLS using 202 usable
questionnaires. The results demonstrate that all hypotheses were statistically
significant, indicating that all variables included in this study play an
important role in the adoption process of mobile learning. The findings reveal
that the research model explains 53% of the variance in the intention to adopt
mobile learning. Theoretical contributions and practical implications are
discussed. |
Keywords: |
Higher Education, Mobile Learning, UTAUT, Service Quality, Mobile Anxiety,
Jordan |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
A POWER AWARE PROBABILISTIC CLUSTER HEAD SELECTION STRATEGY FOR IOT ENABLED
DEVICES IN AGRICULTURAL APPLICATIONS |
Author: |
PUPPALA TIRUPATHI, NIRANJAN POLALA |
Abstract: |
IoT-enabled devices are becoming highly popular in various industrial and
domestic domains. The gain in popularity is primarily due to their high
adaptability to different situations, effective mobility, and the availability
of applications for managing the data they collect. A few IoT implementations
have also demonstrated significant improvements in open IoT stack deployment
using Wireless Sensor Networks, which has further accelerated adoption.
Nonetheless, IoT-enabled device deployments come with fundamental challenges due
to group or cluster-based aggregation. The primary challenge is short battery
life caused by high computational loads and by proximity-based load distribution
during cluster-head-oriented computations. This issue is especially prominent in
IoT deployments for agriculture because of the huge size of the deployment site.
A number of research attempts have aimed to solve this problem in recent times;
nevertheless, the existing solutions are criticized for their high deployment
and adaptation complexity. Hence, this work proposes a novel solution for
balanced workload distribution using probabilistic and regression-based
analysis, resulting in a highly energy- and time-efficient cluster head node
selection strategy. The probabilistic selection strategy is integrated and
optimized with regression-driven analysis to ensure a selection that is both
highly random and highly effective with respect to parameters such as mobility,
computational capacity, proximity, and memory consistency. As a result, this
strategy demonstrates nearly 92% improvement in time efficiency and nearly 94%
improvement in energy efficiency compared with parallel research outcomes. |
Keywords: |
Agro-IoT; Probability Distribution; Probabilistic Selection; Equivalency
Coefficient; Energy Efficient |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
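The probabilistic selection idea can be sketched as weighted random choice over candidate nodes. The weight formula and node parameters below are hypothetical, chosen only to illustrate energy-aware randomized selection rather than the paper's regression-driven scoring:

```python
import random

random.seed(3)

def selection_weight(node):
    """Score combining residual energy, compute capacity, and inverse distance
    to the aggregation point (all values assumed normalized to [0, 1])."""
    return (0.5 * node["energy"] + 0.3 * node["capacity"]
            + 0.2 * (1.0 - node["distance"]))

def pick_cluster_head(nodes):
    """Probabilistic selection: better-provisioned nodes are more likely to be
    chosen, but no node is ever guaranteed, which spreads load across rounds."""
    weights = [selection_weight(n) for n in nodes]
    return random.choices(nodes, weights=weights, k=1)[0]

nodes = [
    {"id": "n1", "energy": 0.9, "capacity": 0.8, "distance": 0.2},
    {"id": "n2", "energy": 0.3, "capacity": 0.4, "distance": 0.7},
    {"id": "n3", "energy": 0.6, "capacity": 0.5, "distance": 0.4},
]
heads = [pick_cluster_head(nodes)["id"] for _ in range(1000)]
```

Randomizing the choice (rather than always picking the best-scored node) is what prevents one node's battery from draining first.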
Title: |
SPACE-TIME TRELLIS CODE’S PERFORMANCE EVALUATION CONSIDERING BOTH FEED FORWARD
AND RECURSIVE STRUCTURE ACROSS WIRELESS CHANNELS |
Author: |
RAJKUMAR MYLSAMY, KARTHIKA JAYAPRAKASH, SURESHKUMAR ARUMUGAM |
Abstract: |
Emerging new technologies demand reliable communication of data at high data
rates, making wireless communication a challenging field. The major limitations
in wireless communications are power, bandwidth, and fading; applications such
as video broadcasting and image transmission in particular require high data
rates and bandwidth. In such applications, Space Time Trellis Coded Modulation
(STTCM) is used to improve the data rate, power efficiency, coding gain, and
diversity gain by maximizing spatial diversity. The available STTCM technique
uses MPSK STTC with different numbers of states and a non-iterative decoder. In
this paper, we iteratively decode MPSK STTCM with various numbers of states and
various numbers of receiving antennas, and analyze the performance of the
non-iterative and iterative decoders individually and together. The
non-iterative decoder (STTCM) uses a feed-forward encoder structure and the
Viterbi algorithm; the iterative decoder (ST Turbo TCM) uses a recursive encoder
structure and log-likelihood values. The BER performance of the iterative and
non-iterative decoders with 4-PSK and 8-PSK modulation is analyzed for various
numbers of states and receiving antennas. Simulation outcomes show that BER
performance can be enhanced by raising the number of receiving antennas and the
number of states, and that the BER performance of Space Time Turbo Trellis codes
is better than that of Space Time Trellis Codes. |
Keywords: |
Space Time Trellis Code, Feed Forward Structure, Recursive Structure, Space Time
Turbo Trellis Code, BER Performance. |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
NSCT BASED PCA AND K-MEANS CLUSTERING BLOCK LEVEL APPROACH FOR SAR IMAGE
DE-NOISING |
Author: |
DAWAR HUSAIN, DR. MUNAUWER ALAM |
Abstract: |
Visual data are transmitted as high-quality digital images in the major fields
of communication in modern applications. On reception, these images are most
often corrupted with noise. This work focuses on processing the received image
before it is used in a particular application. We apply image denoising that
manipulates the Nonsubsampled Contourlet Transform (NSCT) coefficients of the
noisy image data to produce a visually high-standard denoised image. The work
includes an extensive review of existing parametric and non-parametric denoising
algorithms based on statistical estimation related to wavelet-transform
processing, and contains analytical results of denoising under various noise
types at different intensities. The noise models include additive and
multiplicative distortions, namely Gaussian noise and speckle noise. The
denoising algorithm is application-independent and gives very high-speed
performance with the desired noise-free image even in the presence of high-level
distortion. Hence, because of the adaptive nature of the proposed denoising
algorithm, prior knowledge of the type of noise present in the image is not
required. |
Keywords: |
Image - Denoising, NSCT, Gaussian Noise , PCA ,K Mean Clustering. |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
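The K-means stage of such a pipeline can be sketched on scalar block features. Treating per-block coefficient variance as the clustering feature is an assumption for illustration, not necessarily the paper's exact feature:

```python
def kmeans_1d(values, k, iters=20):
    """Plain k-means on scalar features (e.g. per-block variance of transform
    coefficients), used here to separate noisy blocks from smooth ones."""
    # Spread the initial centers across the sorted value range.
    centers = sorted(values)[::max(1, len(values) // k)][:k]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[idx].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical per-block variances: low = smooth blocks, high = noisy blocks.
variances = [0.1, 0.2, 0.15, 5.0, 5.5, 4.8, 0.12, 5.2]
centers, clusters = kmeans_1d(variances, k=2)
```

Once blocks are grouped, a denoiser can apply stronger coefficient shrinkage to the high-variance cluster and leave smooth blocks nearly untouched.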
Title: |
IMPROVING SECURITY AND IMPERCEPTIBILITY USING MODIFIED LEAST SIGNIFICANT BIT AND
FERNET SYMMETRIC ENCRYPTION |
Author: |
EZRA KARUNA WIJAYA, RICO KUMALA, BENFANO SOEWITO |
Abstract: |
In this digital era, information has become essential to daily life. Along with
the development of technology and the growing importance of information, digital
crimes such as information theft have also developed. One way to protect
information is steganography, a technique for inserting secret data or secret
messages into an image. In this paper, we modify the Least Significant Bit (LSB)
method so that the secret message is embedded into only one color bit, and
combine our proposed LSB method with cryptography to improve the security and
quality of the cover image. The cryptography technique we use is Fernet
symmetric encryption, and the implementation uses the Python programming
language. Our results show that the quality of the cover image improves: the
average PSNR value for each image increases by 0.1% and the average MSE value
decreases by 2.1%. By implementing our proposed method we improve security and
imperceptibility, as shown by an RGB histogram that is very similar to the
original image's RGB histogram. |
Keywords: |
Steganography, Cryptography, Least Significant Bit, Fernet Symmetric Encryption,
PSNR, MSE |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
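Embedding into a single colour channel's least-significant bit can be sketched directly. In the paper the message would first be encrypted with Fernet; the sketch below omits that step and hides plaintext bits, and the choice of the blue channel is an assumption:

```python
def embed_lsb_blue(pixels, message):
    """Hide the message's bits only in the blue channel's least-significant
    bit: one hidden bit per pixel, red and green left untouched."""
    bits = [int(b) for ch in message for b in format(ord(ch), "08b")]
    assert len(bits) <= len(pixels), "cover image too small for the message"
    out = []
    for i, (r, g, b) in enumerate(pixels):
        if i < len(bits):
            b = (b & ~1) | bits[i]      # clear the LSB, then set the secret bit
        out.append((r, g, b))
    return out

def extract_lsb_blue(pixels, n_chars):
    """Read back n_chars * 8 blue-channel LSBs and reassemble the message."""
    bits = [str(b & 1) for _, _, b in pixels[:n_chars * 8]]
    return "".join(chr(int("".join(bits[i:i + 8]), 2))
                   for i in range(0, n_chars * 8, 8))

cover = [(120, 90, 200)] * 64           # hypothetical 8x8 cover region
stego = embed_lsb_blue(cover, "Hi")
```

Because each pixel changes by at most 1 in a single channel, distortion (and hence the MSE/PSNR impact) stays minimal, which is what the abstract's histogram comparison reflects.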
Title: |
FORMATION OF INDIVIDUAL TRAJECTORIES OF GIFTEDNESS OF STUDENTS BASED ON THE
ANALYSIS OF LARGE DATA ARRAYS |
Author: |
ASSEL BEKTENOVA, NATALYA DENISSOVA, LEONID BOBROV |
Abstract: |
This article discusses the issue of identifying giftedness, that is, the
formation of a student's individual trajectories. The relevance of the problem
lies in the development of a new paradigm for the information model, which makes
it possible to determine the degree of giftedness, meets the needs of digital
education, and supports the formation of trajectories for developing giftedness
among students. The article describes methods of applying fuzzy logic to
determine a student's individual trajectories; to do this, an attempt was made
to use the mathematical apparatus of fuzzy logic. The structure of the database
covers all skill levels in terms of the main indicators of education quality.
The expert study was carried out at the Nazarbayev Intellectual School of
Chemistry and Biology in Ust-Kamenogorsk. The results obtained made it possible
to identify the multi-criteria formation of a gifted student's trajectories. |
Keywords: |
Giftedness, Methods For Determining Giftedness, Fuzzy Logic Model, Rules,
Individual Trajectory |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
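The fuzzification step of such a fuzzy-logic model can be sketched with triangular membership functions. The linguistic levels and breakpoints below are hypothetical, not the study's actual rule base:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, rising to 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def giftedness_degree(score):
    """Fuzzify a 0-100 achievement score into linguistic levels (assumed sets)."""
    return {
        "average": triangular(score, 0, 40, 70),
        "able":    triangular(score, 40, 70, 90),
        "gifted":  triangular(score, 70, 90, 101),
    }

levels = giftedness_degree(85)
recommended = max(levels, key=levels.get)   # level with highest membership
```

A full system would combine several such fuzzified indicators through fuzzy rules before recommending a development trajectory.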
Title: |
AN IMPROVED DEEPFAKE DETECTION METHOD BASED ON CNNS |
Author: |
DAFENG GONG, YOGAN JAYA KUMAR, ONG SING GOH, CHOO YUN HUOY, ZI YE, WANLE CHI |
Abstract: |
Today's image generation technology can produce high-quality face images whose
authenticity is not easy to recognize with the human eye. This study aims to
improve deepfake detection (a face-swapping forgery) by absorbing the advantages
of deep learning technologies. The study generates a unified and enhanced
dataset from multiple sources using spatial enhancement technology to address
poor detection performance across datasets. Taking advantage of the Inception
and ResNet networks, a new deepfake detection architecture composed of 20
network layers is proposed as the detection model, and its hyperparameter values
are optimized to further improve it. The experimental results show that the
proposed network significantly improves over mainstream methods such as
ResNeXt50, ResNet101, XceptionNet, and VGG19 in terms of accuracy, loss value,
AUC, number of parameters, and FLOPs. Overall, the methods introduced in this
study can help to expand the dataset, better detect deepfake content, and
effectively optimize network models. |
Keywords: |
Face Swapping, Cross Data set, Deepfake Detection, Data Enhancement, Optimized
Hyperparameters |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
Title: |
MODIFICATIONS OF ENCRYPTION-BASED SQL MODELS FOR MULTILEVEL DATABASES |
Author: |
AHMED Y. MAHMOUD, MOHAMMED M. ABU-SAQER |
Abstract: |
Information security is becoming increasingly vital in a variety of
applications. Database applications are no exception, and protecting information
with encryption plays a vital role in safeguarding sensitive data. Securing data
across disciplines is a critical concern, and encryption has been applied by
database management systems and their applications as a mechanism to secure
data. A promising approach is to adopt multilevel security and encryption based
on multilevel security. Recently, the Structured Query Language has been adapted
for use with encryption based on multilevel security; however, the existing
models are very time-consuming, especially for databases of large size. In this
paper, we propose modifications of the models for the select, update, and delete
operations. The proposed modifications enhance the performance of these models:
they rely on avoiding the repeated decryption process and replacing it with a
single encryption process. The obtained results show that the performance of the
proposed modifications is better than that of the original models. |
Keywords: |
Multilevel Database, Encryption, Decryption, SQL Models, System Protection |
Source: |
Journal of Theoretical and Applied Information Technology
15th September 2022 -- Vol. 100. No. 17-- 2022 |
Full
Text |
|
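The core modification, encrypting the queried value once and comparing ciphertexts instead of decrypting every row, can be sketched with a deterministic toy cipher. The XOR cipher below is for illustration only and is not secure; a real system would use a proper deterministic encryption scheme:

```python
KEY = 0x5A  # toy key; a real multilevel database would use a proper cipher

def toy_encrypt(plaintext):
    """Deterministic toy cipher (byte-wise XOR): equal plaintexts yield equal
    ciphertexts, which is what makes ciphertext-side comparison possible."""
    return bytes(b ^ KEY for b in plaintext.encode())

# Encrypted table: every sensitive value is stored only as ciphertext.
table = [
    {"id": 1, "name": toy_encrypt("alice"), "level": 2},
    {"id": 2, "name": toy_encrypt("bob"),   "level": 1},
    {"id": 3, "name": toy_encrypt("alice"), "level": 1},
]

def select_by_name(table, name):
    """Modified SELECT: encrypt the query value ONCE and match ciphertexts,
    instead of decrypting the stored value in every row."""
    target = toy_encrypt(name)          # single encryption per query
    return [row["id"] for row in table if row["name"] == target]

matches = select_by_name(table, "alice")
```

For a table of n rows this replaces n decryptions with one encryption plus n byte comparisons, which is the source of the speed-up the abstract describes.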
|
|