Submit Paper / Call for Papers
The journal receives papers in continuous flow and considers articles
from a wide range of Information Technology disciplines, from the most
basic research to the most innovative technologies. Please submit your papers
electronically through our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please remember to include all your personally identifiable information in the
manuscript before submitting it for review, and we will redact the necessary
information on our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
Journal of Theoretical and Applied Information Technology
April 2015 | Vol. 74 No.2 |
Title: |
ONTOLOGY MATCHING: IN SEARCH OF CHALLENGES AHEAD |
Author: |
FAROOQUI N.K, Dr. MOHAMMED FAUZAN NOORDIN, ABDULHAFEEZ MUHAMMAD |
Abstract: |
This paper presents key features of and challenges for the development of the
next-generation ontology matcher. Matching elements of two data instances plays
an important role in e-business, multilingual data instances, biomedicine and
the open data cloud. This paper elaborates the technologies, tools, algorithms
and methods used by recent ontology matchers. Ontology matching has become a
meta-research field in which topics such as developing advanced reasoners,
algorithms for optimum matching, meta-matchers and result improvement are major
open research areas. To point toward the development of optimum matchers, we
compile a list of future challenges and key features and discuss their
importance. This paper does not propose a solution or framework for an optimum
matcher; rather, it helps readers decide on ontology matching techniques in
their own research domains. |
Keywords: |
Context, Knowledge Management, Ontology matching, Semantic web, Semantic
technologies, Semantic literature review. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
|
Title: |
IDENTIFYING THE INFORMATION-SEEKING BEHAVIOURS AMONG SCHOOL OF COMPUTING
UNDERGRADUATE STUDENTS |
Author: |
MASLINDA MOHD NADZIR |
Abstract: |
The purpose of this study is to investigate information-seeking behaviours among
School of Computing undergraduate students within the context of searching for
information for university tasks. Having the skills to gather information, both
from the library and from the Internet, helps university students complete
tasks such as course assignments and project papers. With the advent of
information technology, gathering information is much faster and easier, so it
is imperative that undergraduates be equipped with information-seeking skills.
A survey method was used to collect data for this study: a questionnaire was
randomly distributed to 170 undergraduate students at the School of Computing,
Universiti Utara Malaysia, Malaysia, and the overall response rate was 100%.
The collected data were analysed using the Statistical Package for the Social
Sciences (SPSS) for Windows, version 19.0. The findings show that academic
information is the information most needed for completing university tasks, and
that Google is the search engine students use most frequently when searching
for academic information. This study therefore concludes that most students
prefer to use search engines to search for academic information. |
Keywords: |
Information needs, information search, information-seeking behaviour,
undergraduate student |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
|
Title: |
PERSONALIZING E-LEARNING SYSTEM FOR COURSES USING PREFIX SPAN ALGORITHM |
Author: |
S.MURUGANANDAM, DR.N.SRINIVASAN |
Abstract: |
Given the wide variation of topics and titles within a course and of users'
reading and learning behaviours, arranging topics according to mined patterns
in an E-Learning system is more beneficial than the fixed sequential ordering
normally used. We demonstrate this by building a model with the Prefix Span
algorithm, which mines users' learning styles and guides learners through the
topics in the order of the mined learning patterns. The experiment compares the
satisfaction levels of users following the normal sequential ordering with
those following the patterns mined by the Prefix Span algorithm. |
Keywords: |
Learning System, Personalization, learner profile, Sequential pattern mining,
Prefix Span Algorithm, Sequential mining |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
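The sequential pattern mining the abstract relies on can be sketched as a
simplified PrefixSpan that mines frequent topic orderings from learners'
histories. This is an illustrative single-item-per-event version, not the
authors' implementation; the topic names below are hypothetical.

```python
def prefixspan(sequences, min_support):
    """Mine frequent sequential patterns (simplified PrefixSpan)."""
    results = []

    def mine(prefix, projected):
        # Count items that can extend the current prefix in the projected DB.
        counts = {}
        for seq in projected:
            for item in set(seq):
                counts[item] = counts.get(item, 0) + 1
        for item, count in sorted(counts.items()):
            if count < min_support:
                continue
            pattern = prefix + [item]
            results.append((pattern, count))
            # Project each sequence on the suffix after the first occurrence.
            suffixes = [seq[seq.index(item) + 1:] for seq in projected
                        if item in seq]
            mine(pattern, suffixes)

    mine([], sequences)
    return results

# Hypothetical topic-visit sequences from three learners:
logs = [["intro", "loops", "functions"],
        ["intro", "functions"],
        ["loops", "functions"]]
patterns = prefixspan(logs, min_support=2)
```

Mined patterns such as `["intro", "functions"]` could then drive the topic
ordering presented to a new learner.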
|
Title: |
ODMM - AN ONTOLOGY BASED DEEP MINING METHOD TO CLUSTER THE CONTENT FROM WEB
SERVERS |
Author: |
S.GANESH KUMAR, Dr.K.VIVEKANANDAN |
Abstract: |
ODMM presents a novel ontology-based content-mining approach to cluster
research proposals retrieved by deep web search according to their
similarities. The method is efficient and effective for clustering research
proposals with English texts. Text-mining methods have been proposed to solve
this problem by automatically classifying text documents, whereas current
search methods for grouping proposals rely on manual matching of similar
discipline keywords. The advantages of this method are that it can extract
three types of data records, namely synonym, hypernym and hyponym data
records, and that it provides options for aligning iterative and disjunctive
data items. The proposed ODMM is used together with a statistical method and
optimization models; with reference to the ontology, the new proposals in each
discipline are clustered using a self-organizing map (SOM) algorithm, a
typical unsupervised neural network model that clusters input data by
similarity. The extraction techniques for the deep web still need improvement
in efficiency; the final result for a user query is presented from multiple
viewpoints, including web links, news content and the synonyms, hyponyms and
hypernyms of the input term. |
Keywords: |
Semantic Web Services, ontology, mapping, ontology search engine |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
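The SOM clustering step the abstract names can be sketched as a tiny
one-dimensional self-organizing map. The learning-rate and neighbourhood
schedules below are common defaults, not taken from the paper, and the toy
2-D points stand in for real proposal feature vectors.

```python
import numpy as np

def som_cluster(data, n_units=2, epochs=50, lr0=0.5, seed=0):
    """Tiny 1-D self-organizing map: returns unit weights and assignments."""
    rng = np.random.default_rng(seed)
    weights = data[rng.choice(len(data), n_units, replace=False)].astype(float)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                   # decaying learning rate
        radius = max(1e-9, (n_units / 2) * (1 - epoch / epochs))  # shrinking radius
        for x in data:
            bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))  # winner
            for j in range(n_units):
                h = np.exp(-((j - bmu) ** 2) / (2 * radius ** 2))  # neighbourhood
                weights[j] += lr * h * (x - weights[j])
    assign = [int(np.argmin(np.linalg.norm(weights - x, axis=1))) for x in data]
    return weights, assign
```

After training, each proposal vector is assigned to its best-matching unit,
which plays the role of a cluster.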
|
Title: |
A RELIABILITY ESTIMATION MODEL USING INTEGRATED TASKS AND RESOURCES |
Author: |
MOHD ADHAM ISA, DAYANG NORHAYATI ABANG JAWAWI |
Abstract: |
With the growing size of modern systems, the composition of a number of
resources for a system is becoming increasingly more complex. Thus, a
reliability analysis for that system is essential, especially during design
time. The reliability estimation model is rapidly becoming a crucial part of the
system development life cycle, and a new model is needed to enable an early
analysis of reliability estimation, especially for the system under study.
However, existing approaches neglect the correlation between resources and
system tasks when estimating system reliability. This restricts the accuracy
of the estimation results and could thus misguide the reliability analysis in
general. This paper proposes a reliability estimation model that computes
system reliability as a product of resource and system-task reliabilities. The
system-task reliability is treated as the transition probability that a
resource passes execution on to subsequent resources. To validate the model,
one real case study is used and the accuracy of the estimation is compared
against the actual reliability values. The estimation accuracy is at an
acceptable level, and some scenarios record higher accuracy than previous
models. Compared with the existing model, our model provides a more accurate
estimation for more complex scenarios. |
Keywords: |
System reliability estimation, graph-theory, white-box test |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
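The abstract's central formula, system reliability as a product of resource
reliability and task transition probability along an execution path, can be
illustrated with a minimal sketch. The path structure and the numbers below
are assumptions for illustration, not values from the paper.

```python
def system_reliability(path):
    """Estimate path reliability as the product of each task's resource
    reliability and the transition probability to the next task
    (one illustrative reading of the integrated task/resource model)."""
    r = 1.0
    for resource_reliability, transition_prob in path:
        r *= resource_reliability * transition_prob
    return r

# Hypothetical three-task path: (resource reliability, transition probability)
path = [(0.99, 1.0), (0.97, 0.9), (0.995, 1.0)]
estimate = system_reliability(path)
```

A full model would sum such products over all execution paths weighted by
their usage profile; this sketch shows only a single path.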
|
Title: |
CONTENT BASED IMAGE RETRIEVAL FOR MOBILE SYSTEMS |
Author: |
P.JEYANTHI |
Abstract: |
This paper investigates a hybrid approach employing texture and colour
features: a modified approach performing texture-based feature extraction with
the gray-level co-occurrence matrix and colour-based feature extraction with
the colour co-occurrence vector. A Euclidean distance classifier measures the
similarity between the query image and each database image. The proposed
system integrates colour-based and texture-based image retrieval, and the
images retrieved by combining these features are ranked using a Genetic
Algorithm (GA). This content-based image retrieval (CBIR) system is
implemented for the Android mobile platform. |
Keywords: |
Android Mobile System, Color Histogram, Gray Level Co-Occurrence Matrix (GLCM),
Content Based Image Retrieval (CBIR), Feature Extraction, Euclidean distance. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
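The texture features and distance ranking described above can be sketched in a
few lines: a horizontal-neighbour grey-level co-occurrence matrix reduced to
three common Haralick-style statistics, then Euclidean ranking against the
database. The choice of 4 grey levels, a (0, 1) offset and these three
statistics are assumptions for illustration, not the paper's exact pipeline.

```python
import numpy as np

def glcm_features(img, levels=4):
    """Horizontal-neighbour GLCM, reduced to contrast, energy, homogeneity."""
    glcm = np.zeros((levels, levels))
    for i in range(img.shape[0]):
        for j in range(img.shape[1] - 1):
            glcm[img[i, j], img[i, j + 1]] += 1
    p = glcm / glcm.sum()                      # normalise to a joint distribution
    di, dj = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    contrast = (p * (di - dj) ** 2).sum()
    energy = (p ** 2).sum()
    homogeneity = (p / (1 + np.abs(di - dj))).sum()
    return np.array([contrast, energy, homogeneity])

def rank_by_distance(query_feat, db_feats):
    """Rank database images by Euclidean distance to the query features."""
    d = [float(np.linalg.norm(query_feat - f)) for f in db_feats]
    return sorted(range(len(d)), key=lambda k: d[k])
```

Colour-histogram features would be concatenated to this vector before ranking
in the hybrid scheme.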
|
Title: |
SPEED BACKSTEPPING CONTROL OF THE DOUBLE-FED INDUCTION MACHINE DRIVE |
Author: |
MOHAMMED TAOUSSI, MOHAMMED KARIM, BADRE BOSSOUFI, DALILA HAMMOUMI, AHMED
LAGRIOUI |
Abstract: |
This paper presents a new strategy to improve the speed-control performance of
a Double-Fed Induction Machine (DFIM) whose stator and rotor windings are
independently connected to PWM (Pulse Width Modulation) voltage inverters.
This work shows the robustness of the adaptive Backstepping control strategy
applied to the DFIM; the main objective is to stabilize the speed of the
machine for use in wind-power (aeolian) systems. The overall stability of the
system is shown using the Lyapunov technique. The paper therefore presents the
study and analysis of the Backstepping control. Finally, the Backstepping
technique is validated by simulation in Matlab / Simulink; a detailed analysis
clearly shows that the proposed system provides good static and dynamic
performance. |
Keywords: |
Double-fed Induction machine (DFIM); Non-adaptive Backstepping control; PWM;
Robustness. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
|
Title: |
A NOVEL APPROACH FOR FINGERPRINT RECOGNITION WITH DYNAMIC TIME WARPING |
Author: |
VENKATRAMAPHANIKUMAR S, V KAMAKSHI PRASAD |
Abstract: |
Biometric features such as the face, fingerprint, palm print and iris are
widely used for human identification. Owing to low-cost acquisition devices
and high accuracy, fingerprint recognition is broadly used: a fingerprint is
unique and its pattern remains unchanged throughout life. This work proposes a
novel fingerprint recognition system based on Gabor wavelets and Dynamic Time
Warping. First, image enhancement is performed in both the spatial and
frequency domains with Histogram Equalization and the Fast Fourier Transform.
Twenty-four optimized Gabor kernels, invariant over 6 orientations and 4
scales, are generated in the spatial and frequency domains. The performance of
the proposed method is evaluated on the VFR and FVC-2006 databases. Under the
constraints of Dynamic Time Warping, an optimal warp is identified among the
feature vectors, achieving recognition rates of 94.2% and 83.6% on the VFR and
FVC-2006 databases respectively. |
Keywords: |
Fingerprint Recognition, Histogram Equalization, Fast Fourier Transform, Gabor
Wavelets, Dynamic Time Warping |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
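The Dynamic Time Warping step can be sketched with the standard dynamic
programming recurrence. This generic 1-D version illustrates how an optimal
warp between two feature sequences is found; the paper's actual inputs are
Gabor-based feature vectors, not the toy sequences shown here.

```python
import math

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D feature sequences."""
    n, m = len(a), len(b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of match, insertion, deletion.
            D[i][j] = cost + min(D[i - 1][j - 1], D[i - 1][j], D[i][j - 1])
    return D[n][m]
```

Because DTW tolerates local stretching, a repeated feature value costs
nothing: `dtw_distance([1, 2, 3], [1, 2, 2, 3])` is 0 even though the
sequences differ in length.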
|
Title: |
DETECTION OF VIDEO FORGERY: A REVIEW OF LITERATURE |
Author: |
OMAR ISMAEL AL-SANJARY, GHAZALI SULONG |
Abstract: |
Video forgery on the Internet has increased alongside the growth of software
and services that let any user upload, download and share objects online,
including audio, images and video. In particular, video editors and Adobe
Photoshop are among the multimedia tools used to edit or tamper with media
files, and manipulating a video sequence so that objects are inserted into or
deleted from frames is among the most common malicious video forgery
operations. The present study reviews the literature on video forgery,
primarily work on passive (blind) detection of three types of forgery: cloning
forgery, source camera identification and splicing forgery. The study also
employs a video authentication method that detects and localizes both region
duplication and frame duplication and identifies factors that affect video
forgery detection. The video is processed into sub-blocks and geometric moment
features are extracted for every macro-block, which improves detection
accuracy; an optimized sorting algorithm minimizes computational time, taking
the number of blocks and the number of features into consideration. |
Keywords: |
Video Forgery Detection, Group Of Pictures (GOP), Copy–Move Forgery Detection |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
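The block-based duplication detection described above can be sketched by
hashing every non-overlapping block of a frame and grouping identical ones.
Real copy-move detectors, including the geometric-moment approach the review
discusses, match robust features rather than exact pixels; this exact-match
version is only an illustration of the block-grouping idea.

```python
def find_duplicate_blocks(frame, block=2):
    """Flag exactly duplicated block x block regions in a grey-level frame
    (given as a 2-D list); returns pairs of matching top-left positions."""
    seen = {}
    dups = []
    for i in range(0, len(frame) - block + 1, block):
        for j in range(0, len(frame[0]) - block + 1, block):
            key = tuple(tuple(row[j:j + block]) for row in frame[i:i + block])
            if key in seen:
                dups.append((seen[key], (i, j)))   # duplicate found
            else:
                seen[key] = (i, j)
    return dups
```

Sorting or hashing the block signatures is what keeps the comparison cost far
below the quadratic all-pairs check, which is the role of the optimized
sorting algorithm mentioned in the abstract.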
|
Title: |
SCALABLE MULTIDIMENSIONAL ANONYMIZATION ALGORITHM OVER BIG DATA USING MAP REDUCE
ON PUBLIC CLOUD |
Author: |
AMALRAJ IRUDAYASAMY, DR. AROCKIAM L |
Abstract: |
Big data and its practice now command everyone's attention; there is no doubt
that the big data revolution has begun. Although big data practices promise
favourable business payoffs, they carry substantial privacy implications.
Multidimensional generalization is an effective anonymization scheme for data
privacy preservation, and Top-Down Specialization (TDS) and Bottom-Up
Generalization (BUG) are two methods for achieving it. However, prevailing
approaches to multidimensional generalization disregard parallelization and
thereby lose scalability when managing big data on the cloud; moreover, TDS
and BUG each suffer poor performance for certain values of the k-anonymity
parameter when used individually. In this paper, we propose a hybrid method
that combines TDS and BUG for efficient multidimensional anonymization over
big data, and we design MapReduce-based algorithms for both components to
achieve high scalability on the cloud. Experimental evaluations show that the
hybrid method significantly improves the scalability and efficiency of
multidimensional generalization anonymization over prevailing methods. |
Keywords: |
Big data; cloud computing; data anonymization; privacy preservation; Map reduce |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
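The k-anonymity property that TDS and BUG both target can be sketched with a
membership check plus one bottom-up generalization step. The record schema,
the `age` quasi-identifier and the 10-year band are hypothetical examples, not
the paper's datasets or taxonomy trees.

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True if every combination of quasi-identifier values occurs >= k times."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

def generalize_age(records, width=10):
    """One bottom-up generalization step: replace age by a width-year band."""
    out = []
    for r in records:
        g = dict(r)
        lo = (r["age"] // width) * width
        g["age"] = f"{lo}-{lo + width - 1}"
        out.append(g)
    return out
```

BUG repeats such steps until `is_k_anonymous` holds, while TDS starts from the
most general values and specializes while the property still holds; the hybrid
method picks whichever direction is cheaper for the given k.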
|
Title: |
E-WASTE HANDLING IN DKI JAKARTA PRIVATE HIGHER EDUCATION INSTITUTION |
Author: |
NURIL KUSUMAWARDHANI SOEPRAPTO PUTRI, HUDIARTO, ARGOGALIH, HANDI MULJOREDJO |
Abstract: |
The growth of electronic device usage has given rise to a new environmental
challenge known as electronic waste. The negative impact caused by waste
electrical and electronic equipment (WEEE) has forced many institutions to
take serious action. On the other hand, dependence on electronic devices keeps
rising, especially in education. This paper discusses E-Waste handling in
private higher education institutions, which contribute significantly to the
growth of electronic waste: their use of technology has grown rapidly, driven
by rising student intake and the need to provide students with the latest
technology. An E-Waste handling model has been developed and remains under
continuous review. |
Keywords: |
E-Waste, Electronic Waste, Green Computing, Green IT |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
|
Title: |
PROTOTYPING, TESTING AND CONTROL ENERGY FOR ACTIVE-REGENERATIVE ELECTROMAGNETIC
SHOCK ABSORBER BASED QUARTER CAR MODEL |
Author: |
ARIF INDRO SULTONI, I NYOMAN SUTANTRA, AGUS SIGIT PRAMONO |
Abstract: |
In this paper, an electromagnetic shock absorber for a passenger car is
designed and fabricated, and a series of experiments is conducted to obtain
its damping and current constants. Controllers are designed for energy
regeneration and ride comfort based on a quarter-car model. The shock absorber
is designed and prototyped to absorb vibration energy and to dissipate energy
as control actuation, using a permanent-magnet DC motor to absorb and
dissipate power. The prototype is tested on an Auto Damping Test Machine
(ADFT); as the external load decreases, the damping force and generated
current increase. A DC-motor formulation for the electromagnetic damper is
developed, and a model of the active-regenerative electromagnetic suspension
with several controller strategies is simulated against the test results.
Tracking-reference, PI-tracking-reference and multi-objective H∞ controllers
are presented and compared on comfort, energy regeneration and consumption,
with simulations carried out on uneven road input. A substantial amount of
regenerated energy is harvested when the system is in passive mode with high
body acceleration. The multi-objective controller satisfactorily maintains
body acceleration, suspension travel and tire deflection with minimum power
requirement; the PI-tracking-reference controller yields less body
acceleration, suspension travel and tire deflection than the
tracking-reference and multi-objective H∞ controllers, but needs the highest
power consumption. For the prototype with a 20Ω electric load, Class C road
input and a vehicle speed of 50 km/h, the average RMS body accelerations are
1.81, 0.59, 0.42 and 0.46 m/s² for the passive, tracking-reference,
PI-tracking-reference and multi-objective H∞ controllers respectively, and the
power consumptions are 49.58 W, 72.11 W and 51.49 W for the
tracking-reference, PI-tracking-reference and multi-objective H∞ controllers
respectively. The average power regeneration in passive mode is 19.83 W, which
agrees with the experimental result. |
Keywords: |
Electromagnetic Shock Absorber, Passenger Car, Control Energy,
Active-Regenerative. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
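The comfort metric used to compare the controllers, RMS body acceleration, is
a one-line computation. The sample trace below is hypothetical, not data from
the experiments.

```python
import math

def rms(samples):
    """Root-mean-square of an acceleration trace (the ride-comfort metric
    used to compare the controllers)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Hypothetical body-acceleration samples in m/s^2:
trace = [3.0, -3.0, 3.0, -3.0]
```

A lower RMS over the same road input means a more comfortable ride, which is
how the 1.81 vs 0.42 m/s² figures in the abstract rank the controllers.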
|
Title: |
A NOVEL RULE BASED APPROACH FOR ENTITY RELATIONS EXTRACTION |
Author: |
MUJIONO SADIKIN, ITO WASITO |
Abstract: |
The extraction of relations between objects (entities) in unstructured or
semi-structured text documents on the Internet continually poses new
challenges, owing to the volume of documents, the evolution of language in
text, and the Internet's rapid growth. In this paper, the authors describe and
experimentally evaluate a novel rule-based approach to mining entities and
their relations. The proposed method defines a new concept of entity
relationship that treats an entity relation as the relation between a main
object and its supporting object. The relations between these objects are
extracted through a pattern-learning process that uses the Indonesian WordNet
as external knowledge. The performance evaluation confirms that the method is
feasible to apply in this area; its feasibility is measured by the accuracy of
the extraction process over 10 experiments, with average F-scores of 0.895 for
main object extraction and 0.795 for supporting object extraction. |
Keywords: |
object extraction relation, object, object interaction, pattern learning, tuple
scoring |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
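The F-score the abstract reports is the standard F-measure over precision and
recall. The abstract gives only the averaged scores (0.895 and 0.795), not the
underlying precision/recall values, so the numbers in the sketch below are
hypothetical.

```python
def f_score(precision, recall, beta=1.0):
    """F-measure (F1 when beta=1): harmonic-style mean of precision/recall."""
    if precision + recall == 0:
        return 0.0
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)
```

For example, a precision of 0.90 and recall of 0.89 yield an F1 close to the
0.895 reported for main object extraction.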
|
Title: |
DESIGN OF LOW POWER REDUCED WALLACE MULTIPLIER WITH COMPACT CARRY SELECT ADDER,
HALF ADDER & FULL ADDER USING CMOS TECHNOLOGY |
Author: |
P.RADHIKA, Dr.T.VIGNESWARAN |
Abstract: |
The Wallace multiplier is mainly used in the Arithmetic and Logic Unit (ALU)
to perform scientific computation in processors, controllers, etc. Existing
multiplication techniques such as the Booth multiplier and the array
multiplier require more time for multiplication, so the Wallace multiplier is
designed with a parallel reduction process to reduce delay. The regular
Wallace multiplier, however, requires a large number of half adders and full
adders in the reduction phase, so its chip area is high; the
complexity-reduced Wallace multiplier is designed with fewer half adders and
full adders. In this paper, a compact carry select adder, half adder and full
adder are designed and incorporated into the complexity-reduced Wallace
multiplier to reduce area and delay below those of the existing reduced
Wallace multiplier. The compact half adder and full adder are designed in
static CMOS technology with 6 and 16 transistors instead of 12 and 24
transistors, and the compact carry select adder is constructed from the
compact half adder with 6 transistors and compact AND, OR and XOR gates with 4
transistors each. The proposed complexity-reduced Wallace multiplier therefore
offers lower power and less area than the existing Wallace multiplier.
Simulation is performed using Tanner Tool v14.1. |
Keywords: |
Static CMOS technology, Compact half adder & full adder, complexity reduced
Wallace Multiplier, reduced carry select adder and Tanner Tool. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
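The logical behaviour the compact cells must implement can be sketched as a
truth-table model: a half adder is XOR plus AND, and a full adder is two half
adders plus an OR. This is a behavioural sketch only; the 6- and
16-transistor CMOS realizations in the paper are circuit-level designs that
this model does not capture.

```python
def half_adder(a, b):
    """Half adder: sum = a XOR b, carry = a AND b."""
    return a ^ b, a & b

def full_adder(a, b, cin):
    """Full adder built from two half adders and an OR on the carries."""
    s1, c1 = half_adder(a, b)
    s, c2 = half_adder(s1, cin)
    return s, c1 | c2
```

In a Wallace tree, each reduction stage applies these cells column-wise to
compress three partial-product rows into two, which is where cutting the
per-cell transistor count pays off in area and power.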
|
Title: |
ANALYSIS AND EFFICIENCY OF ERROR FREE COMPRESSION ALGORITHM FOR MEDICAL IMAGE |
Author: |
J HEMAMALINI, D KAAVYA |
Abstract: |
Nowadays, large numbers of medical images are generated in scan centers, and
these images are an emerging tool for diagnosing diseases. Storing the images
is an economic problem for scan centers, so digital image compression is
necessary. During compression of medical images, loss of data is normally
unacceptable: the scan images require high resolution and no loss of data
between the compressed and decompressed digital image. To address this, the
paper implements an 8 bit/pixel code string algorithm based on
pixel-redundancy reduction through formulated matrices, a BM (Binary Matrix)
and a GSM (Gray Scale Matrix), which are used in the compression and
decompression processes. This method will be useful for telemedicine and
teleradiology. |
Keywords: |
Digital image compression, 8 bit/pixel code string, BM, GSM. |
Source: |
Journal of Theoretical and Applied Information Technology
20th April 2015 -- Vol. 74. No. 2 -- 2015 |
Full Text |
|