Submit Paper / Call for Papers
The journal receives papers in a continuous flow and considers articles
from a wide range of Information Technology disciplines, from the most basic
research to the most innovative technologies. Please submit your papers
electronically through our submission system at http://jatit.org/submit_paper.php
in MS Word, PDF, or a compatible format so that they may be evaluated for
publication in an upcoming issue. This journal uses a blinded review process;
you may leave your personal identifying information in the manuscript when
submitting it for review, and we will edit out the necessary information on our
side. Submissions to JATIT should be full research / review papers (clearly
indicated below the main title).
Journal of Theoretical and Applied Information Technology
December 2015 | Vol. 82 No. 1 |
Title: |
THE APPLYING OF THE HARDWARE-BASED RECONFIGURATION FOR AUTONOMOUS CONTROL
SYSTEMS OF SPACE MOBILE ROBOTS |
Author: |
VALERY DMITRIEVICH IVCHENKO, PETR GERMANOVICH KRUG, MAXIM VYACHESLAVOVICH
KURAKOV, EKATERINA NIKOLAEVNA MATYUKHINA, SERGEY ALEKSANDROVICH PAVELYEV |
Abstract: |
This paper addresses the application of hardware-based reconfiguration to
autonomous control systems of space mobile robots. The challenges associated
with using space mobile robots in planetary exploration missions are described,
and a set of requirements on the design of such space-based robots is derived.
The functional structure of a hardware-reconfigurable digital module for
intelligent control of mobile space-based robots is proposed, and the
interaction between its functional modules and a remote support center in
different situations requiring reconfiguration is examined. The self-check and
self-testing procedures of the hardware-reconfigurable digital module, which
are necessary to ensure reliable reconfiguration, are described. Self-testing
algorithms for the hardware of the digital control module are investigated,
taking advantage of the on-line partial reconfiguration capability of FPGAs.
The means of achieving optimal test coverage while minimizing the amount of
additional test hardware and testing time are considered. |
Keywords: |
Space-Based Robots, Remote Modification, Mobile Robot, Reconfigurable Computing,
Field-Programmable Gate Array (FPGA) |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
EVALUATION OF COMPLEXITY OF THE LARGE-BLOCK CLOUD COMPUTING USING ARITHMETIC
WITH ENHANCED ACCURACY |
Author: |
VLADIMIR EFIMOVICH PODOLSKIY, SERGEY STEPANOVICH TOLSTYKH, ANTON MIKHAILOVICH
BABICHEV, SVETLANA GERMANOVNA TOLSTYKH |
Abstract: |
This article considers methodological questions of evaluating the complexity
of large-block cloud computing with enhanced-accuracy arithmetic. The objective
of our work is to solve, in the cloud, mathematical simulation problems with
special accuracy requirements. In particular, we consider obtaining a precise
and trustworthy solution for problems whose sub-problems are connected as large
blocks and whose computation time considerably exceeds the time of information
transmission between them. A methodology is proposed for evaluating the
complexity of such problems; it is used for the design of computing systems of
optimal productivity operating in a cloud environment. |
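
The paper's keywords point to floating-point arithmetic with enhanced accuracy; as context, the short sketch below contrasts standard double precision with extended-precision arithmetic using Python's decimal module. The expression and the 50-digit precision are illustrative assumptions, not taken from the paper.

```python
from decimal import Decimal, getcontext

# Illustrative contrast between ordinary double precision and
# enhanced-accuracy arithmetic (50 significant digits via decimal).
getcontext().prec = 50

x = 1e-20
print((1.0 + x) - 1.0)                      # double precision: 0.0 (the term is lost)
print((Decimal(1) + Decimal("1e-20")) - 1)  # enhanced accuracy: 1E-20
```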
Keywords: |
Large-Block Parallel Computing, Cloud Computing, Precision Computation, Precise
And Trustworthy Solution Of Problems, Floating Point Arithmetic With Enhanced
Accuracy. |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
NON-DETERMINISM REDUCING METHOD FOR OWL SHOIN CONCEPT CONSISTENCY CHECKING |
Author: |
ANDREY VIKTOROVICH GRIGORYEV, ALEXANDER GRIGORYEVICH IVASHKO |
Abstract: |
Description logics are a widely used formalism for knowledge base
representation. Their central reasoning task is the concept consistency
checking problem, which is solved by the tableau algorithm. However, the
complexity of the tableau algorithm is NEXPTIME, and many attempts have been
made to optimize it. This paper presents a method for determining the
probability that the conjunction of two concepts described in SHOIN DL is
consistent. The developed coherence determination method uses Kruskal's
algorithm for segmentation. Based on this probability determination method, a
method was created for reducing the non-determinism of choice when merging
individuals during application of the "n >=" rule. To test the developed
method, a module implementing the presented techniques was developed and
integrated into the TReasoner system. A computational experiment was carried
out to evaluate the efficiency of the developed methods and algorithms. The
results show that the number of operations performed during "n >=" rule
expansion is reduced to 28%. The article also gives a theoretical explanation
of the advantage of the presented method. |
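
The abstract mentions using Kruskal's algorithm for segmentation; as background, here is a minimal, generic Kruskal sketch in Python over a weighted graph. It illustrates the classical algorithm only and is not the authors' adaptation to SHOIN concept reasoning; the example graph is hypothetical.

```python
# Generic Kruskal's algorithm (union-find based), shown only as background for
# the segmentation step the abstract mentions; the graph below is hypothetical.
def kruskal(num_vertices, edges):
    """edges: iterable of (weight, u, v); returns the minimum spanning forest."""
    parent = list(range(num_vertices))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    forest = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                        # joining two components keeps it acyclic
            parent[ru] = rv
            forest.append((u, v, w))
    return forest

edges = [(4, 0, 1), (1, 1, 2), (3, 0, 2), (2, 2, 3)]
print(kruskal(4, edges))    # -> [(1, 2, 1), (2, 3, 2), (0, 2, 3)]
```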
Keywords: |
Description Logics; Consistency Checking; Rule Expansion; Kruskal’s Algorithm |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
AN ANALYSIS OF TECHNOLOGIES FOR BUILDING INFORMATION SECURITY INFRASTRUCTURE OF
GLOBAL DISTRIBUTED COMPUTING SYSTEMS |
Author: |
PAVEL SERGEYEVICH PTITSYN, DMITRY VLADIMIROVICH RADKO |
Abstract: |
Global distributed information systems are implemented using cloud and grid
approaches. Ensuring a high level of information security in these systems is
difficult because they operate on critical or confidential data, and their
elements, located at different physical sites, communicate using open Internet
standards and protocols. Existing distributed information systems are
implemented on a variety of architectural and technology platforms that usually
do not meet the current challenges of ensuring a high level of information
security. In addition, the integration of these systems with corporate
information systems, and the security of the integration solutions used, remain
open questions. The aim of this work is the systematization and analysis of
proven technologies for building a highly reliable information security
infrastructure for global distributed computing systems. The work identifies
approaches to implementing an information security infrastructure based on the
technical standards of the Globus Toolkit, OGSA, UNICORE, and gLite. The
features of security infrastructures built on these standards are described,
as well as their possibilities for interacting with external systems based on
industry standards such as SOA and Web Services. |
Keywords: |
Distributed Computing Systems, Security Framework, Security Infrastructure |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
THE METHODS OF THE SYNTHESIS OF THE NONLINEAR REGULATORS FOR THE DISTRIBUTED
ONE-DIMENSION CONTROL OBJECTS |
Author: |
YURY VOLEREVICH ILYUSHIN, DMITRY ANATOLYEVICH PERVUKHIN, OLGA VLADIMIROVNA
AFANASIEVA, MIKHAIL PETROVICH AFANASYEV, SERGEY VIKTOROVICH KOLESNICHENKO |
Abstract: |
In this paper a method for synthesizing a distributed regulator for a uniform
control object, based on a specified accuracy, is considered. The initial
heating function has been obtained, the mathematical modeling of the process
has been carried out, and the results have been analyzed. Using the obtained
regulator, a hardware and software complex has been designed in the Pascal
programming language; the complex allows simulating the behavior of temperature
fields in an isotropic rod. The article presents simulations of the temperature
process for different configurations of the system, namely with different
numbers of impulsive heating sources under relay control. The practical results
of these studies suggest the possibility of building a silicon carbide heating
element made in the form of an isotropic rod. |
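
The authors' simulation complex is written in Pascal and is not reproduced here; the following Python sketch only illustrates the kind of model the abstract describes: a 1D isotropic rod heated by point sources under relay control, integrated with an explicit finite-difference scheme. All numerical parameter values are assumptions.

```python
import numpy as np

# Minimal sketch (not the authors' Pascal complex): explicit finite-difference
# simulation of a 1D isotropic rod heated by point sources under relay control.
L, N = 1.0, 101           # rod length [m], number of grid nodes
alpha = 1e-4              # thermal diffusivity [m^2/s]
dx = L / (N - 1)
dt = 0.4 * dx**2 / alpha  # stable step for the explicit scheme
setpoint, hysteresis = 350.0, 5.0   # relay control thresholds [K]
power = 5.0               # heating rate added at a source node [K/s]
sources = [25, 50, 75]    # grid indices of the impulsive heating sources

T = np.full(N, 300.0)     # initial temperature field [K]; rod ends held at 300 K
on = [False] * len(sources)

for step in range(20000):
    # relay control law: switch each source on/off with hysteresis
    for k, i in enumerate(sources):
        if T[i] < setpoint - hysteresis:
            on[k] = True
        elif T[i] > setpoint + hysteresis:
            on[k] = False
    # explicit update of the interior nodes
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * alpha * lap
    for k, i in enumerate(sources):
        if on[k]:
            T[i] += dt * power

print("temperature at the source nodes:", T[sources])
```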
Keywords: |
Green's Function, Thermal Field, Discretization Interval, Management Object,
Analysis, Synthesis. |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
AN EFFICIENT STATE METRIC MEMORY MANAGEMENT METHOD FOR FLEXIBLE VITERBI DECODER |
Author: |
XU BANGJIAN, LIU ZONGLIN, YANG HUI, CHENG LING |
Abstract: |
Flexible Viterbi decoders are becoming extremely important as the number of
modern wireless communication standards supported by SDR (Software Defined
Radio) systems grows. To support multi-standard service and to save area, a
flexible Viterbi decoder chip with a cascaded ACS (Add-Compare-Select) unit is
preferred. In such a decoder chip, storing the temporary ACS results poses a
significant irregular-addressing problem. To solve it, a generalized and
efficient state metric memory management method has been developed. Analysis
shows that the design is highly flexible and efficient. |
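
For readers unfamiliar with the ACS step whose temporary results the paper's memory scheme stores, here is a minimal software sketch of one Add-Compare-Select stage for a small 4-state convolutional code. The code parameters and bit ordering are illustrative assumptions; the paper's cascaded hardware architecture is not modeled.

```python
# Minimal sketch of one Add-Compare-Select (ACS) step of a Viterbi decoder,
# for a 4-state rate-1/2 convolutional code (illustrative only).
import itertools

NUM_STATES = 4          # constraint length K = 3
G = (0b111, 0b101)      # generator polynomials (7, 5) in octal

def branch_bits(state, bit):
    """Encoder output bits for input `bit` when the register holds `state`."""
    reg = (bit << 2) | state
    return [bin(reg & g).count("1") & 1 for g in G]

def acs_step(path_metrics, received):
    """Add branch metrics, compare the two incoming paths per state, select."""
    new_metrics = [float("inf")] * NUM_STATES
    survivors = [None] * NUM_STATES
    for state, bit in itertools.product(range(NUM_STATES), (0, 1)):
        next_state = ((bit << 2) | state) >> 1
        # Hamming distance between the received pair and the expected branch output
        bm = sum(r != e for r, e in zip(received, branch_bits(state, bit)))
        metric = path_metrics[state] + bm
        if metric < new_metrics[next_state]:
            new_metrics[next_state] = metric
            survivors[next_state] = state
    return new_metrics, survivors

# one trellis stage: start in state 0, receive the symbol pair (1, 1)
metrics, surv = acs_step([0, float("inf"), float("inf"), float("inf")], (1, 1))
print(metrics, surv)
```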
Keywords: |
Viterbi Decoder, Multi-standard, Reconfigurable Architecture, Software Defined
Radio (SDR), Cascaded ACS |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
HYBRID MULTI USER DETECTION FOR DS-OPTICAL CDMA |
Author: |
ADIL EL KABLI, MY AHMED FAQIHI, AHMED HAMMOUCH |
Abstract: |
The OCDMA multiplexing technique has gained increasing interest from
researchers and has made great progress owing to its easy access and flexible
network structure. DS-OCDMA systems, however, suffer from several sources of
noise, the most serious of which is multiple access interference (MAI). To
mitigate the effect of MAI, codes offering high performance in terms of
auto-correlation and cross-correlation, such as Optical Orthogonal Codes (OOC)
and Prime Codes (PC), have been proposed; but since the entire burden of
performance cannot be placed on code design alone, numerous detection
strategies have been developed to help increase system performance.
In this paper, we present a statistics-based method that uses both conventional
detection and non-linear detection in an efficient way. The results show that
the detection complexity can be reduced by 24% to 60% while keeping the BER at
the same level. |
Keywords: |
Optical Orthogonal Code (OOC), Optical CDMA (OCDMA), Multiple Access
Interferences (MAI), Direct Sequence OCDMA (DS-OCDMA), Single User Detection. |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
NEW WAY OF PASSIVE RFID DEPLOYMENT FOR SMART GRID |
Author: |
ELMAKFALJI CHAIMAE, ROMADI RAHAL |
Abstract: |
This article focuses on RFID technology over power line for the smart grid,
using exclusively passive tags operating in the low-frequency band at 125 kHz
and the high-frequency band at 13.56 MHz. Passive tags are the most common and
least expensive, and they have a virtually unlimited life. This solution for
the smart grid is secure; it presents a new way of automatic identification
with low voltage, low deployment costs, and high reliability. This smart RFID
for the smart grid helps to monitor and control electricity consumption. It can
be used in many applications such as ticketing and payment, charging energy for
electric vehicles, smart home appliances such as smart refrigerators, and
residential door keys. |
Keywords: |
RFID; PLC; Smart Grid; passive tags; Automatic identification. |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
COMPARATIVE ANALYSIS OF PATH LOSS ATTENUATION AT OUTDOOR FOR 1.8GHZ, 2.1GHZ IN
URBAN ENVIRONMENT |
Author: |
N.V.K. RAMESH, K. SARAT KUMAR, D. VENKATA RATNAM, DR. MD. ALI HUSSAIN,
Y.V. SAI JASWANTH, P. SARAT CHAITANYA |
Abstract: |
We investigated radio signal path attenuation behavior by conducting a
measurement survey in a GSM network transmitting in the 1.8 GHz and 2.1 GHz
bands in Vijayawada city, Andhra Pradesh, India. The measured field strength
data collected at various locations from the base stations were first used to
estimate the path loss; as expected, the path loss increases with distance.
This paper presents a detailed analysis of path loss calculation using the
Okumura-Hata model and the COST 231 Hata model. We calculated the path loss and
compared it with real-time data obtained at both 1.8 GHz and 2.1 GHz in an
urban environment, using the received signal strength (RSS) of the base station
with and without noise. Our experimental results show that the Okumura-Hata
model is one of the best models for calculating path loss in an urban
environment. |
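
Both models compared in the abstract are standard empirical formulas; the sketch below gives a minimal Python rendering of the urban Okumura-Hata and COST 231 Hata equations. The antenna heights, city-size correction, and sample distances are assumed values, not the authors' survey parameters.

```python
from math import log10

# Illustrative implementations of the two empirical models the paper compares.
def mobile_antenna_correction(f_mhz, h_m):
    """Correction a(h_m) for a small/medium city (standard Hata form)."""
    return (1.1 * log10(f_mhz) - 0.7) * h_m - (1.56 * log10(f_mhz) - 0.8)

def okumura_hata_urban(f_mhz, d_km, h_b=30.0, h_m=1.5):
    """Okumura-Hata urban path loss in dB (nominal range 150-1500 MHz)."""
    return (69.55 + 26.16 * log10(f_mhz) - 13.82 * log10(h_b)
            - mobile_antenna_correction(f_mhz, h_m)
            + (44.9 - 6.55 * log10(h_b)) * log10(d_km))

def cost231_hata_urban(f_mhz, d_km, h_b=30.0, h_m=1.5, c_db=0.0):
    """COST 231 Hata path loss in dB (nominal range 1500-2000 MHz)."""
    return (46.3 + 33.9 * log10(f_mhz) - 13.82 * log10(h_b)
            - mobile_antenna_correction(f_mhz, h_m)
            + (44.9 - 6.55 * log10(h_b)) * log10(d_km) + c_db)

for d in (0.5, 1.0, 2.0):   # distance from the base station in km
    print(d, round(okumura_hata_urban(1800, d), 1), round(cost231_hata_urban(2100, d), 1))
```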
Keywords: |
Path Loss, Coverage Area, Base Transceiver Station (BTS), Mobile Station (MS),
Global System for Mobile Communication (GSM) |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
STUDY ON OPEN SOURCE LEARNING MANAGEMENT SYSTEMS: A SURVEY, PROFILE, AND
TAXONOMY |
Author: |
BELAL NAJEH ABDULLATEEF, NUR FAZIDAH ELIAS, HAZURA MOHAMED, AWS. A.
ZAIDAN, BILAL. B. ZAIDAN |
Abstract: |
Open Source Software (OSS) in the field of education has often been recommended
by different researchers in the literature. Although there is as yet no
evidence that OSS dominates traditional methods, its use in education has been
steadily increasing and expanding into new domains. The reputation of an OSS
learning system matters greatly to people interested in adopting a Learning
Management System (LMS). In any case, choosing whether to start a new
application or to adapt and modify an existing one is an important decision.
This study aims to provide basic guidance on the available OSS LMSs and their
alternatives in the field of education. Twenty-three different alternatives
were selected from the currently active OSS on the basis of previously
published papers. The study also aims to summarize the available studies and
guides and, finally, to bridge the gaps in the current literature by proposing
a taxonomy of OSS LMSs drawn from 56 papers in the literature. |
Keywords: |
Open source software, Learning Management System, Evaluation, Adoption |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
DATA HIDING SECURITY USING BIT MATCHING-BASED STEGANOGRAPHY AND CRYPTOGRAPHY
WITHOUT CHANGE THE STEGO IMAGE QUALITY |
Author: |
ALAMSYAH, MUCH AZIZ MUSLIM, BUDI PRASETIYO |
Abstract: |
This research discusses hiding data using steganography and cryptography. A new
method is presented for securing data without changing the quality of the cover
image. The steganographic method works by finding bits of the message that
match bits of the MSB (Most Significant Bit) plane of the cover image; the
similarity search is carried out with a divide-and-conquer method. The result
is a set of bit index positions, which is then encrypted cryptographically; in
this paper we use the DES (Data Encryption Standard) algorithm. The inputs are
the message, the image, and a key; the output is the encrypted bit index, which
carries the hidden information and can be used to secure the message. To
reconstruct the contents, the same image and the same key are required.
The method can be used to secure data. Its advantages are that the image
quality does not change and that the capacity of hidden messages can be larger
than the image itself. According to the experiments, both grayscale and color
images can be used as cover images, except images that are 100% black or 100%
white. The bit-matching process takes less time on images with a greater
variety of colors. Damage to the messages due to the addition of "salt and
pepper" noise starts from an MSE of 0.00049. |
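
To make the bit-matching idea concrete, here is a simplified Python sketch in which each message bit is stored as the index of a cover-image MSB that already equals it, so the cover pixels are never modified. The linear search stands in for the authors' divide-and-conquer step, the DES encryption of the index list is omitted, and the toy "image" is an assumption.

```python
# Simplified sketch of the bit-matching idea: hide each message bit as the
# index of a cover-image MSB that already has the same value, so the cover
# pixels are never modified (the cover image quality is unchanged).
def msb_plane(pixels):
    """Most significant bit of each 8-bit grayscale pixel."""
    return [(p >> 7) & 1 for p in pixels]

def embed(message_bits, cover_pixels):
    msb = msb_plane(cover_pixels)
    indices = []
    for bit in message_bits:
        try:
            indices.append(msb.index(bit))    # first matching MSB position
        except ValueError:
            raise ValueError("cover image lacks a pixel with MSB == %d" % bit)
    return indices                            # would be DES-encrypted in the paper

def extract(indices, cover_pixels):
    msb = msb_plane(cover_pixels)
    return [msb[i] for i in indices]

cover = [200, 13, 255, 90, 178, 64]           # toy 8-bit grayscale "image"
msg = [1, 0, 0, 1]
idx = embed(msg, cover)
assert extract(idx, cover) == msg
print(idx)
```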
Keywords: |
Bit Index, Bit Matching, Cryptography, Divide And Conquer, MSB, Steganography |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
THE EXTRACTION AND THE RECOGNITION OF FACIAL FEATURE STATE TO EMOTION
RECOGNITION BASED ON CERTAINTY FACTOR |
Author: |
I GEDE ARIS GUNADI, AGUS HARJOKO, RETANTYO WARDOYO, NEILA RAMDHANI |
Abstract: |
Psychologically, emotion is related to a person's feelings in a particular
situation. Fields such as health, psychology, and police investigation need
emotion recognition information. Human emotion can be classified into six
types: happy, sad, angry, fearful, disgusted, and normal. Psychologically,
several methods can be used for emotion recognition, such as self-report
analysis, automatic measures, startle response magnitude, fMRI (functional
magnetic resonance imaging) analysis, and behavioral response; each of these
methods has its own advantages and disadvantages.
The aim of this research is to determine a person's emotion in a video scene.
The video is decomposed into image frames, and from each frame the facial
features (components) are extracted, including the mouth, eyes, nose, and
forehead. Feature extraction combines two methods, based on color and on the
geometric figure of the face. The state of each facial feature is related to
the facial AUs (Action Units) used for emotion recognition. The states of the
mouth and eyes are recognized from feature elongation, while the states of the
forehead and nose are determined from wrinkle density. In the final stage of
emotion recognition, the certainty factor method is used to determine the
quality of each emotion, and the actor's emotion is classified according to the
emotion with the maximum quality level. The results show an average recognition
accuracy of 77% for single images and 76.6% for video. |
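
The abstract does not give the exact certainty factor calculus used, so the sketch below shows the classical MYCIN-style combination rule that certainty factor methods are usually built on, with hypothetical per-feature evidence values for one emotion.

```python
from functools import reduce

# Minimal sketch of the classical (MYCIN-style) certainty factor combination
# rule. The mapping from facial feature states to individual CF values below
# is purely illustrative, not the paper's.
def combine_cf(cf1, cf2):
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# hypothetical per-feature evidence for the emotion "happy"
evidence = {"mouth_elongation": 0.7, "eye_elongation": 0.4, "forehead_wrinkles": -0.2}
cf_happy = reduce(combine_cf, evidence.values())
print(round(cf_happy, 3))   # combined certainty for "happy" (0.775 here)
```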
Keywords: |
Emotion recognition, facial feature extraction, eyes figure, mouth figure,
Certainty Factor. |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
IDENTIFYING INFORMATION QUALITY DIMENSIONS THAT AFFECT CUSTOMERS SATISFACTION OF
E-BANKING SERVICES |
Author: |
MOHANNAD MOUFEED AYYASH |
Abstract: |
Information quality has been shown in prior empirical studies to be
significantly related to system use, particularly in e-commerce systems.
However, it has been argued that while the importance of information quality is
indisputable, it remains under-explored. Therefore, this study attempts to
determine the dimensions of information quality that influence customer
satisfaction with e-banking services and to propose an information quality
model for customer satisfaction. The study proposes a model on the basis of the
information systems success model and, building on it, highlights the general
information quality dimensions that influence customer satisfaction with
services provided by e-banking. |
Keywords: |
Information Quality Dimensions, Customers Satisfaction, E-Banking Services. |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
SELECTION OF CREATIVE INDUSTRY SECTOR ICT SUITABLE DEVELOPED IN PESANTREN USING
FUZZY - AHP |
Author: |
HOZAIRI, AHMAD |
Abstract: |
Selecting the type of creative industry in the field of Information and
Communication Technology (ICT) that is suitable to be developed in a Pesantren
is a complex problem: one of several kinds of ICT creative industries must be
chosen, and every kind of creative industry involves several criteria that must
be assessed according to the priorities of the location where it will be
developed. In such a complex and uncertain situation, decision makers have
difficulty reaching a decision and typically rely on intuition and subjectivity
alone. The Fuzzy Analytic Hierarchy Process (Fuzzy-AHP) is one method that can
answer this question, because it guides decision makers in assessing each
criterion and each alternative. Fuzzy numbers are used to represent the
assessments; with this approach, the selected decisions are more accurate and
reliable. This study has four (4) criteria, twelve (12) sub-criteria, and four
(4) decision alternatives. The criteria used in this study were (E) Economy,
(T) Technology, (S) Human Resources (HR), and Market. The alternative decisions
to be selected were (A) Advertising, (F) Fashion, (M) Music, and Photography.
The F-AHP weighting produced the following results for the creative industry
alternatives across the criteria and sub-criteria: [1] Advertising = 0.299,
[2] Fashion = 0.284, [3] Photography = 0.252, and [4] Music = 0.207, so the ICT
creative industry suitable to be developed in a Pesantren is advertising. |
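
The abstract reports the final F-AHP weights but not the intermediate computation; as background, the sketch below shows the classical (crisp) AHP weighting step that Fuzzy-AHP extends, deriving priorities from a pairwise comparison matrix by the geometric mean method. The comparison values are hypothetical, not the paper's data.

```python
import numpy as np

# Sketch of the classical (crisp) AHP weighting step that Fuzzy-AHP extends:
# derive priority weights from a pairwise comparison matrix by the geometric
# mean method. The comparison values below are hypothetical.
criteria = ["Economy", "Technology", "Human Resources", "Market"]
# A[i][j] = how much more important criterion i is than criterion j (Saaty scale)
A = np.array([
    [1.0, 3.0, 2.0, 2.0],
    [1/3, 1.0, 1.0, 2.0],
    [1/2, 1.0, 1.0, 1.0],
    [1/2, 1/2, 1.0, 1.0],
])

geo_mean = A.prod(axis=1) ** (1.0 / A.shape[1])
weights = geo_mean / geo_mean.sum()          # normalized priority vector

for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
```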
Keywords: |
Creative Industry, Pesantren, F-AHP |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
HCMX: AN EFFICIENT HYBRID CLUSTERING APPROACH FOR MULTI-VERSION XML DOCUMENTS |
Author: |
VIJAY SONAWANE , D.RAJESWARA.RAO |
Abstract: |
To retrieve useful information from the large and growing number of XML
documents on the web, effective management of XML documents is essential. One
solution is to cluster XML documents in order to discover knowledge that
promotes effective information management and maintenance. In the real world,
however, XML documents are dynamic: in contrast to static XML documents, the
changes from one version of a document to the next cannot be predicted, so
clustering techniques for static XML documents cannot be used to cluster
multiple versions of XML documents. For multi-version XML documents, the
preliminary clustering solution is no longer valid once new document versions
appear. XML documents are also self-descriptive in nature, which results in
large document sizes, so comparing all documents to find a new clustering
solution after every change is not a viable approach. In this paper we propose
a hybrid clustering approach for clustering multi-version XML documents. The
approach improves the speed of clustering by limiting the growing size of XML
documents with a homomorphic compression scheme and by using distance
information from the preliminary clustering solution together with the changes
recorded in the compressed delta. |
Keywords: |
HCMX, Hybrid Clustering, Cluster re-evaluation, Multiversion, PCP, CSRP,
compressed Delta. |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
RECOGNIZING THE 2D FACE USING EIGEN FACES BY PCA |
Author: |
M. JASMINE PEMEENA PRIYADARSINI, K. MURUGESAN, SRINIVASA RAO INABATHINI, G.K.
RAJINI, DINESH, AJAY KUMAR R. |
Abstract: |
Face recognition is the process of identifying a person from their facial
image. This technique makes it possible to use a person's facial image to
authenticate them to a secure system, for criminal identification, for passport
verification, and so on. Face recognition approaches for still images can be
broadly categorized into holistic methods and feature-based methods. Holistic
methods use the entire raw face image as input, whereas feature-based methods
extract local facial features and use their geometric and appearance
properties. The present work focuses primarily on Principal Component Analysis
(PCA), with the analysis and implementation done in MATLAB. The face
recognition system recognizes faces in pictures taken by a web-cam or a digital
camera, given as the test database, which are then checked against the training
image dataset based on descriptive features. Descriptive features are used to
characterize images. The paper describes how to build a simple yet complete
face recognition system using Principal Component Analysis, a holistic
approach. This method applies a linear projection of the original image space
to achieve dimensionality reduction. The system works by projecting face images
onto a feature space that spans the significant variations among known face
images. The significant features, known as eigenfaces, do not necessarily
correspond to features such as ears, eyes, and noses. The method provides the
ability to learn and later recognize new faces in an unsupervised manner. It is
fast, relatively simple, and works well in a constrained environment. |
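
The paper's implementation is in MATLAB; the following minimal numpy sketch only illustrates the eigenface computation the abstract describes: center the training images, obtain the principal components via the small A^T A matrix, and recognize by nearest neighbour in eigenface space. The image sizes and random "faces" are placeholders, not the paper's dataset.

```python
import numpy as np

# Minimal numpy sketch of the eigenface idea. Training images are flattened
# into columns; PCA of the centered data gives the eigenfaces, and recognition
# is nearest-neighbour in that subspace.
rng = np.random.default_rng(0)
train = rng.random((64 * 64, 20))             # 20 placeholder 64x64 "face" images

mean_face = train.mean(axis=1, keepdims=True)
A = train - mean_face                         # centered training set

# eigen-decomposition of the small matrix A^T A (the classic eigenface trick)
vals, vecs = np.linalg.eigh(A.T @ A)
order = np.argsort(vals)[::-1][:10]           # keep the 10 largest components
eigenfaces = A @ vecs[:, order]
eigenfaces /= np.linalg.norm(eigenfaces, axis=0)

def project(img):
    return eigenfaces.T @ (img.reshape(-1, 1) - mean_face)

train_weights = eigenfaces.T @ A              # projections of the training faces

def recognize(img):
    """Return the index of the closest training face in eigenface space."""
    w = project(img)
    return int(np.argmin(np.linalg.norm(train_weights - w, axis=0)))

probe = train[:, 3] + 0.01 * rng.random(64 * 64)   # noisy copy of face 3
print(recognize(probe))                             # expected: 3
```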
Keywords: |
PCA, Eigen Faces, Eigenvalue, Eigenvector Face Space |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
A CUCKOO SEARCH BASED PAIRWISE STRATEGY FOR COMBINATORIAL TESTING PROBLEM |
Author: |
ABDULLAH B. NASSER, YAZAN A. SARIERA, ABDUL RAHMAN A. ALSEWARI, AND KAMAL Z.
ZAMLI |
Abstract: |
Combinatorial Testing (CT) is a sampling technique for generating test cases
that focuses on the interaction behavior of a system's components with their
collaborators. Given its effectiveness in revealing faults, pairwise testing
has often been chosen to perform the required sampling of test cases. The main
concern of pairwise testing is to obtain the most optimal test set (pairwise
coverage dictates that every pair of input values is covered by at least one
test case). This paper discusses the design and implementation of a new
pairwise strategy based on Cuckoo Search, called the Pairwise Cuckoo Search
strategy (PairCS). PairCS serves as our vehicle for investigating the
usefulness of Cuckoo Search for pairwise testing. |
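
The coverage bookkeeping behind any pairwise strategy can be illustrated briefly: enumerate every value pair that must be covered and score candidate test cases by how many uncovered pairs they hit. The sketch below uses a hypothetical three-parameter system and a toy greedy loop standing in for the Cuckoo Search optimizer, so it is not the authors' PairCS.

```python
from itertools import combinations, product

# Pairwise-coverage bookkeeping: enumerate all value pairs to be covered and
# score a candidate test case by how many uncovered pairs it hits.
# The 3-parameter system below is a hypothetical example.
parameters = [["chrome", "firefox"], ["win", "mac", "linux"], ["32bit", "64bit"]]

def all_pairs(params):
    pairs = set()
    for (i, vi), (j, vj) in combinations(enumerate(params), 2):
        for a, b in product(vi, vj):
            pairs.add(((i, a), (j, b)))
    return pairs

def pairs_of(test):
    return {((i, test[i]), (j, test[j])) for i, j in combinations(range(len(test)), 2)}

uncovered = all_pairs(parameters)
suite = []
# toy greedy loop standing in for the Cuckoo Search optimizer in the paper
while uncovered:
    best = max(product(*parameters), key=lambda t: len(pairs_of(t) & uncovered))
    suite.append(best)
    uncovered -= pairs_of(best)

print(len(suite), "test cases cover all", len(all_pairs(parameters)), "pairs")
```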
Keywords: |
Pairwise testing, Cuckoo Search, Test suite Generator, Software Testing,
Combinatorial Testing Problem. |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
WORD SENSE DISAMBIGUATION BASED ON YAROWSKY APPROACH IN ENGLISH QURANIC
INFORMATION RETRIEVAL SYSTEM |
Author: |
OMAR JAMAL MOHAMED, SABRINA TIUN |
Abstract: |
Word sense disambiguation (WSD) is the process of eliminating the ambiguity of
certain words by identifying the exact sense of a given word. In natural
languages, many words can carry multiple meanings depending on context, and WSD
aims to identify the most accurate sense in such cases. In particular, when
translating from one language to another, ambiguity may arise among the
translated words. The Quran, the holy book of approximately a billion Muslims,
was originally written in the Arabic language. When translating the Quran into
English, researchers have identified several semantic issues. Such issues lie
in the ambiguity of words such as ‘ليلا ونهارا’ and ‘يوم الحساب’, which are
translated as ‘day and night’ and ‘judgment day’. This ambiguity has to be
eliminated by determining the exact sense of the translated word. Several
research efforts have aimed to disambiguate the senses of the translated Quran;
however, identifying an appropriate WSD method for the translated Quran is
still a challenging task because of the complexity of Arabic morphology. Hence,
this study proposes an adaptation of the Yarowsky algorithm as a WSD method for
the Quranic translation. In addition, it develops an IR prototype based on the
proposed adaptation in order to evaluate the method in terms of retrieval
effectiveness. The dataset used in this study is a collection of Quranic
content. Several pre-processing tasks were performed to eliminate irrelevant
data such as stop words, numbers, and punctuation. Then, two lists of senses,
with their contexts, were created for each ambiguous word so that the Yarowsky
algorithm could be trained on this example set. The Yarowsky algorithm then
constructs a decision list that determines the sense label of each word. The
evaluation uses the three standard IR metrics: precision, recall, and
F-measure. The experimental results show an F-measure of 77%. This result
appears weak compared with the results of the Yarowsky algorithm applied in
open domains, which is due to the lack of examples that can be extracted from
the Quran for both senses; nevertheless, it is competitive for WSD of the
Quranic translation. Finally, it can be concluded that WSD has a significant
impact on the IR system by improving the retrieval effectiveness. |
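
The decision-list core of the Yarowsky algorithm can be sketched compactly: score each collocation by a smoothed log-likelihood ratio between the two senses and label new contexts with the highest-ranked matching rule. The tiny English training set below is invented for illustration and is not the study's Quranic data; the bootstrapping step (retraining on newly labelled examples) is omitted.

```python
import math
from collections import defaultdict

# Simplified decision-list core of the Yarowsky algorithm: score each
# collocation feature by a smoothed log-likelihood ratio between two senses
# and label new contexts with the highest-ranked matching rule.
train = [
    ("day_literal",  ["night", "and", "day", "alternation"]),
    ("day_literal",  ["day", "and", "night", "glorify"]),
    ("day_judgment", ["judgment", "day", "reckoning"]),
    ("day_judgment", ["day", "of", "reckoning", "account"]),
]

counts = defaultdict(lambda: defaultdict(float))
for sense, context in train:
    for word in context:
        counts[word][sense] += 1

def decision_list(senses=("day_literal", "day_judgment"), alpha=0.1):
    rules = []
    for word, c in counts.items():
        a, b = c[senses[0]] + alpha, c[senses[1]] + alpha   # smoothed counts
        score = abs(math.log(a / b))
        rules.append((score, word, senses[0] if a > b else senses[1]))
    return sorted(rules, reverse=True)

def classify(context, rules, default="day_literal"):
    for score, word, sense in rules:
        if word in context:
            return sense
    return default

rules = decision_list()
print(classify(["reckoning", "approaches"], rules))   # -> day_judgment
```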
Keywords: |
Word Sense Disambiguation, Yarowsky Algorithm, Information Retrieval, Natural
Language Processing, Quran |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
EMOTION DETECTION THROUGH SPEECH AND FACIAL EXPRESSIONS |
Author: |
KRISHNA MOHAN KUDIRI, ABAS MD SAID, M YUNUS NAYAN |
Abstract: |
Human-human communication in a social environment is possible only through
speech, facial expressions, and bodily changes. In this research work, speech
and facial expressions are used to estimate basic emotions (anger, sadness,
happiness, boredom, disgust, and surprise) with a computer. A new asynchronous
hybrid technique based on Relative Bin Frequency Coefficients (RBFC) and
Relative Sub-Image Based (RSB) features is used to handle these modalities,
which differ in time, scale, and dimension. A Support Vector Machine (SVM) is
used for classification. Experimental results show that the proposed system
performs better than conventional systems. |
Keywords: |
Relative Bin Frequency Coefficient (RBFC), Relative Sub-Image Based Coefficients
(RSB), Human Computer Interaction (HCI), Support Vector Machine (SVM), Principal
Component Analysis (PCA). |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|
Title: |
NTCCRT: A CONCURRENT CONSTRAINT FRAMEWORK FOR SOFT REAL-TIME MUSIC INTERACTION |
Author: |
MAURICIO TORO, CAMILO RUEDA, CARLOS AGÓN, GÉRARD ASSAYAG |
Abstract: |
Writing music interaction systems is not easy because their concurrent
processes usually access shared resources in a non-deterministic order, often
leading to unpredictable behavior. Using Pure Data (Pd) and Max/MSP it is
possible to program concurrency; however, it is difficult to synchronize
processes based on multiple criteria. Process calculi such as the
Non-deterministic Timed Concurrent Constraint (ntcc) calculus overcome that
problem by representing, declaratively, the synchronization of multiple
criteria as constraints. In this article, we propose the Ntccrt framework as a
new alternative for managing concurrency in Pure Data and Max/MSP. Ntccrt is a
real-time capable interpreter for ntcc. Using Ntccrt binary plugins in Pure
Data, we executed models for machine improvisation and signal processing. We
also analyzed two case studies: a machine improvisation system and a signal
processing system. We found that the performance of both case studies is
compatible with soft real-time music interaction; that is, a musician can
interact with Ntccrt without noticeable delays during the interaction. |
Keywords: |
Concurrent Constraint Programming (ccp), Soft Real-Time, Machine Improvisation,
Signal Processing, Music Interaction, Computer Music, Process Calculi. |
Source: |
Journal of Theoretical and Applied Information Technology
10th December 2015 -- Vol. 82. No. 1 -- 2015 |
Full
Text |
|