Submit Paper / Call for Papers
The journal receives papers in continuous flow and will consider articles from a wide range of Information Technology disciplines, encompassing the most basic research to the most innovative technologies. Please submit your papers electronically to our submission system at http://jatit.org/submit_paper.php in MS Word, PDF or a compatible format so that they may be evaluated for publication in the upcoming issue. This journal uses a blinded review process; please include all your personally identifiable information in the manuscript when submitting it for review, and we will remove the necessary details at our end before review. Submissions to JATIT should be full research / review papers (properly indicated below the main title).
Journal of Theoretical and Applied Information Technology
31st January 2009 | Vol. 5 No. 1
Title:
ROUGH SET PROTEIN CLASSIFIER
Author:
Ramadevi Yellasiri, C. R. Rao
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
Classification of voluminous protein data based on structural and functional properties is a challenging task for researchers in the bioinformatics field. In this paper a fast, accurate and efficient classification tool has been developed as a hybridization of Sequence Arithmetic, Rough Set Theory and Concept Lattice. It reduces the domain search space to 9% without losing the classification potential for proteins.
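As background to the rough-set component mentioned above, the textbook lower and upper approximations of a target set X with respect to an indiscernibility relation R are (standard definitions, not the authors' specific construction):

\[
\underline{R}X = \{\, x \mid [x]_R \subseteq X \,\}, \qquad
\overline{R}X = \{\, x \mid [x]_R \cap X \neq \emptyset \,\},
\]

where \([x]_R\) is the equivalence class of \(x\); the boundary region \(\overline{R}X \setminus \underline{R}X\) contains the objects that cannot be classified with certainty.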
Full Text

Title:
SPATIAL GENERAL EPIDEMIC MODEL
Author:
M. Thiyagarajan, N. Rajagopal
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
The central concern in the spatial general epidemic model is controlling the propagation of a disease that starts from a single infected individual; with a suitable control process the epidemic can be eradicated. Here we suggest a control process based on percolation probability: edges are formed with suitable probabilities, yielding random graphs that explain the extinction probability of the epidemic.
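For context on the extinction probability mentioned above, a standard branching-process result (a general fact, not the paper's specific percolation construction) states that if each infected individual independently infects a random number of others with probability generating function \(G(s)\) and mean \(m\), then the extinction probability \(q\) is the smallest non-negative root of

\[
q = G(q),
\]

with \(q = 1\) whenever \(m \le 1\) and \(q < 1\) when \(m > 1\); lowering the edge-formation (percolation) probabilities reduces \(m\) and drives \(q\) toward 1.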
Full Text

Title:
HARDWARE IMPLEMENTATION OF A TOPOLOGY CONTROL ALGORITHM FOR MANETS USING NOMADIC COMMUNITY MOBILITY MODEL
Author:
Surendra S. Dalu, M. K. Naskar and C. K. Sarkar
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
Recently, mobile ad hoc networks (MANETs) have attracted great interest for a variety of real applications. Rigorous research, mainly on efficient routing protocol design, has been carried out and numerous MANET routing protocols have been developed. These routing protocols typically assume that the network is fully connected (i.e. there are no partitions). Mobility plays an important role in a MANET: relative node movement can break links and thus change the topology. In this paper we propose a physical implementation of a topology control algorithm for MANETs. The proposed algorithm maintains the topology without any control messages, and there is no need to change routing tables because network connectivity is maintained throughout. Each mobile node is equipped with a transceiver and a GPS receiver. Every node is free to travel with its own velocity, and each node can decide on its own to change position so as to maintain connectivity with the reference node; nodes may roam around the reference node. Results obtained from experiments with the developed prototype demonstrate that connectivity, and hence the topology of the network, is always maintained.
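A minimal sketch of the kind of per-node decision the abstract describes, assuming (purely for illustration) planar GPS-derived coordinates, a fixed radio range and a simple "step toward the reference node when about to lose connectivity" rule; the authors' actual criteria are given in the full paper.

```python
import math

RADIO_RANGE = 100.0   # assumed transmission range (metres)
MARGIN = 0.9          # start correcting before the link actually breaks
STEP = 5.0            # distance moved per decision interval (metres)

def distance(a, b):
    """Euclidean distance between two (x, y) positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def next_position(node, reference):
    """If the node is close to losing connectivity, step toward the
    reference node; otherwise leave it free to keep roaming."""
    d = distance(node, reference)
    if d <= MARGIN * RADIO_RANGE:
        return node                      # still safely connected
    ux, uy = (reference[0] - node[0]) / d, (reference[1] - node[1]) / d
    return (node[0] + STEP * ux, node[1] + STEP * uy)

# example: a node drifting away from a reference node at the origin
print(next_position((95.0, 0.0), (0.0, 0.0)))
```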
Full Text

Title:
EFFECT OF FUZZY RESOURCE ALLOCATION METHOD ON AIRS CLASSIFIER ACCURACY
Author:
Shahram Golzari, Shyamala Doraisamy, Md Nasir Sulaiman, Nur Izura Udzir
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
Artificial Immune Recognition System (AIRS) is an immune-inspired classifier that competes with well-known classifiers. Many studies have been conducted to improve the accuracy of AIRS and to investigate the source of its power; some of them have focused on the resource allocation method of AIRS. This study investigates the difference between the accuracy of AIRS with fuzzy resource allocation and the accuracy of the original AIRS using a reliable statistical method: the combination of ten-fold cross-validation and the t-test was used for evaluation, and the algorithms were tested on ten benchmark datasets from the UCI machine learning repository. Based on the experimental results, fuzzy resource allocation increases the accuracy of AIRS on the majority of datasets, but the increase is significant on only a minority of them.
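A minimal sketch of the evaluation protocol described above (ten-fold cross-validation plus a paired t-test), assuming scikit-learn and SciPy and using two stand-in classifiers, since AIRS implementations are not part of those libraries:

```python
from scipy.stats import ttest_rel
from sklearn.datasets import load_iris
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Stand-ins for "original AIRS" and "AIRS with fuzzy resource allocation".
clf_a = KNeighborsClassifier(n_neighbors=1)
clf_b = KNeighborsClassifier(n_neighbors=5)

X, y = load_iris(return_X_y=True)              # one benchmark dataset for illustration
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

acc_a = cross_val_score(clf_a, X, y, cv=cv)    # ten per-fold accuracies (same folds)
acc_b = cross_val_score(clf_b, X, y, cv=cv)

# Paired t-test over the matched per-fold accuracies.
t_stat, p_value = ttest_rel(acc_a, acc_b)
print(f"mean A={acc_a.mean():.3f}, mean B={acc_b.mean():.3f}, p={p_value:.3f}")
```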
Full Text

Title:
AN OVERVIEW OF TECHNIQUES FOR REDUCING PEAK TO AVERAGE POWER RATIO AND ITS SELECTION CRITERIA FOR ORTHOGONAL FREQUENCY DIVISION MULTIPLEXING RADIO SYSTEMS
Author:
V. Vijayarangan, Dr. (Mrs.) R. Sukanesh
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
The concept of Orthogonal Frequency Division Multiplexing (OFDM) has been known since 1966, but it reached sufficient maturity for deployment in standard systems only during the 1990s. OFDM is an attractive modulation technique for transmitting large amounts of digital data over radio waves. One major disadvantage of OFDM is that the time-domain OFDM signal, being a sum of several sinusoids, exhibits a high peak-to-average power ratio (PAPR). A number of techniques for reducing the PAPR in OFDM systems have been proposed in the literature. In this paper the various techniques proposed for reducing the PAPR, and the criteria for selecting among them, are discussed. The goal is to convey the fundamental ideas and an intuitive understanding of the concepts introduced, primarily by giving an overview of the PAPR reduction techniques known today.
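To make the PAPR notion concrete, a minimal sketch that measures the PAPR of a single OFDM symbol, assuming (for illustration only) 64 QPSK-modulated subcarriers and NumPy's IFFT; this shows the measurement itself, not any of the surveyed reduction techniques:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                                    # number of subcarriers (assumed)

# Random QPSK symbols on each subcarrier.
bits = rng.integers(0, 2, size=(N, 2))
symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# Time-domain OFDM symbol: inverse FFT of the subcarrier symbols.
x = np.fft.ifft(symbols) * np.sqrt(N)     # scaling keeps average power near 1

# PAPR = peak instantaneous power / average power, usually quoted in dB.
power = np.abs(x) ** 2
papr_db = 10 * np.log10(power.max() / power.mean())
print(f"PAPR of this OFDM symbol: {papr_db:.2f} dB")
```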
Full Text

Title:
NEURAL NETWORKS IN DATA MINING
Author:
Dr. Yashpal Singh, Alok Singh Chauhan
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
Companies have been collecting data for decades, building massive data warehouses in which to store it. Even though this data is available, very few companies have been able to realize the actual value stored in it. The question these companies are asking is how to extract that value, and the answer is data mining. There are many technologies available to data mining practitioners, including artificial neural networks, regression and decision trees. Many practitioners are wary of neural networks because of their black-box nature, even though they have proven themselves in many situations. This paper gives an overview of artificial neural networks and questions their position as a preferred tool among data mining practitioners.
Full Text

Title:
A Statistical Approach for Improving the Availability of a 220 kV Extra High-Tension Feeder Network
Author:
K. Srinivas and R.V.S. Satyanarayana
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
An electrical power utility is expected to maintain power quality and reliability. Electrical power reaches the consumer through a network of 400, 220, 132, 33 and 11 kV feeders, and the utility is responsible for minimizing the number of interruptions on these feeders. A regression-based statistical method has been developed to forecast the number of interruptions; the approach uses historical data to evolve a strategy for corrective actions that minimize future interruptions. This paper thus proposes a scientific way of facilitating uninterrupted power supply to consumers.
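A minimal sketch of the kind of regression-based forecast described above, assuming (for illustration) a short hypothetical series of monthly interruption counts and an ordinary least-squares trend fitted with NumPy; the paper's actual regressors and data are in the full text:

```python
import numpy as np

# Hypothetical monthly interruption counts for one 220 kV feeder.
interruptions = np.array([14, 12, 15, 11, 10, 9, 11, 8, 9, 7, 8, 6], dtype=float)
months = np.arange(len(interruptions), dtype=float)

# Fit a first-order (linear) trend by least squares.
slope, intercept = np.polyfit(months, interruptions, deg=1)

# Forecast the next three months from the fitted trend.
future = np.arange(len(interruptions), len(interruptions) + 3)
forecast = slope * future + intercept
print("trend per month:", round(slope, 2))
print("forecast:", np.round(forecast, 1))
```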
Full Text

Title:
Markov method based Reliability Assessment of EHT Transmission system in Chittoor district of Andhra Pradesh state in India
Author:
K. Srinivas and R.V.S. Satyanarayana
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
This paper proposes an application of Markov model based reliability assessment for an EHT transmission system involving the 132 kV and 220 kV voltage levels in Chittoor district.
Full Text

Title:
STUDYING THE FEASIBILITY AND IMPORTANCE OF GRAPH-BASED IMAGE SEGMENTATION TECHNIQUES
Author:
Dr. S.V. Kasmir Raja, A. Shaik Abdul Khadir, Dr. S.S. Riaz Ahamed
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
Image segmentation and its performance evaluation are difficult but important problems in computer vision. A major challenge in segmentation evaluation comes from the fundamental conflict between generality and objectivity: for general-purpose segmentation, the ground truth and segmentation accuracy may not be well defined, while if the evaluation is embedded in a specific application, the results may not be extensible to other applications. This paper analyzes the performance of the Normalized Cut (NC) and Efficient Graph (EG) methods of image segmentation. We treat image segmentation as a graph partitioning problem and propose a novel global criterion, NC, for segmenting the graph; the NC criterion measures both the total dissimilarity between the different groups and the total similarity within the groups. We apply the efficient graph-based image segmentation method using two different kinds of local neighbourhood when constructing the graph, and we present a special strategy to compare and analyze the two graph-based methods.
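For reference, the normalized cut criterion discussed above is usually written as follows for a weighted graph \(G = (V, E)\) partitioned into disjoint sets \(A\) and \(B\) (the formulation commonly stated in the literature, given here as background):

\[
\mathrm{Ncut}(A,B) = \frac{\mathrm{cut}(A,B)}{\mathrm{assoc}(A,V)} + \frac{\mathrm{cut}(A,B)}{\mathrm{assoc}(B,V)}, \qquad
\mathrm{cut}(A,B) = \sum_{u \in A,\, v \in B} w(u,v), \qquad
\mathrm{assoc}(A,V) = \sum_{u \in A,\, t \in V} w(u,t),
\]

so that minimizing \(\mathrm{Ncut}\) simultaneously penalizes strong edges across the partition and rewards strong association within each group.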
Full Text

Title:
RANDOM QUAD TREE AS SPATIAL DATA MINING TOOL
Author:
M. Thiyagarajan, N. Rajagopal
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
Among the different approaches to investigating in detail the extinction of populations of rare species, the birth-and-death process gives the simplest route to a solution. Here an attempt is made to obtain the extinction probabilities and to generate different levels of population sizes; a random quad-tree method is put forth. We give the basic concepts and results used in the subsequent sections.
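As background to the extinction probabilities mentioned above, the classical linear birth-and-death process with per-individual birth rate \(\lambda\) and death rate \(\mu\) has extinction probability, starting from \(n\) individuals (a standard result, not the paper's quad-tree construction):

\[
q_n =
\begin{cases}
1, & \lambda \le \mu,\\[4pt]
\left(\dfrac{\mu}{\lambda}\right)^{n}, & \lambda > \mu,
\end{cases}
\]

so extinction is certain unless births outpace deaths, in which case the probability decays geometrically in the initial population size.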
Full Text

Title:
A NEW METHOD TO INCORPORATE FACTS DEVICES IN OPTIMAL POWER FLOW USING PARTICLE SWARM OPTIMIZATION
Author:
K. Chandrasekaran, K. Arul Jeyaraj, L. Sahayasenthamil, Dr. M. Saravanan
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
In this work, Particle Swarm Optimization (PSO) is studied for solving the optimal power flow (OPF) problem with controllable FACTS devices. Two types of FACTS devices are considered: the thyristor-controlled series compensator (TCSC) and the thyristor-controlled phase shifter (TCPS). The power flow control constraints arising from the use of FACTS devices are included in the OPF problem in addition to the normal conventional constraints, and a sensitivity analysis is carried out to locate the FACTS devices. The method provides an enhanced economic solution through the use of controllable FACTS devices. The IEEE standard 30-bus system is used, and the results are compared with a genetic algorithm (GA) to show the feasibility and potential of the PSO approach.
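For reference, the canonical PSO update underlying the approach above moves each particle \(i\) (here a candidate vector of OPF control variables) according to (the standard inertia-weight form, stated as background rather than the paper's exact parameterization):

\[
v_i^{k+1} = w\, v_i^{k} + c_1 r_1 \left(p_i^{\text{best}} - x_i^{k}\right) + c_2 r_2 \left(g^{\text{best}} - x_i^{k}\right),
\qquad
x_i^{k+1} = x_i^{k} + v_i^{k+1},
\]

where \(w\) is the inertia weight, \(c_1, c_2\) are acceleration coefficients, \(r_1, r_2 \sim U(0,1)\), \(p_i^{\text{best}}\) is the particle's best position so far and \(g^{\text{best}}\) the best position found by the swarm; the fitness would be the generation cost plus penalties for violated constraints.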
Full Text

Title:
INNOVATIVE THINNING AND GRADIENT ALGORITHM FOR EDGE FIELD AND CATEGORIZATION SKELETON ANALYSIS OF BINARY AND GREY TONE IMAGES
Author:
Mr. R.M. Noorullah and Dr. A. Damodaram
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
A commonly used method for thinning regions in binary images consists of examining 3 x 3 pixel windows throughout the image and erasing the centre pixel if thinning criteria are met. In the present research a critical study is made to develop a new thinning algorithm based on k x k windows for the categorization and skeleton analysis of images. The advantage of this algorithm is that it peels thick layers from the boundaries of image regions, reducing the overall number of thinning iterations and the overall complexity of the proposed methods.
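To illustrate the kind of 3 x 3 deletion test referred to above (a widely used Zhang–Suen-type criterion, given as background; it is not the authors' k x k criteria), label the eight neighbours of the centre pixel \(P_1\) as \(P_2, \dots, P_9\), going clockwise from the pixel directly above. In one sub-iteration, \(P_1\) is erased when

\[
2 \le B(P_1) \le 6, \qquad A(P_1) = 1, \qquad P_2 P_4 P_6 = 0, \qquad P_4 P_6 P_8 = 0,
\]

where \(B(P_1)\) is the number of object-valued neighbours and \(A(P_1)\) is the number of \(0 \to 1\) transitions in the ordered sequence \(P_2, P_3, \dots, P_9, P_2\); the companion sub-iteration uses \(P_2 P_4 P_8 = 0\) and \(P_2 P_6 P_8 = 0\) instead.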
Full Text

Title:
MOVING TOWARD REGION-BASED IMAGE SEGMENTATION TECHNIQUES: A STUDY
Author:
Dr. S.V. Kasmir Raja, A. Shaik Abdul Khadir, Dr. S.S. Riaz Ahamed
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
Image segmentation and its performance evaluation are difficult but important problems in computer vision. A major challenge in segmentation evaluation comes from the fundamental conflict between generality and objectivity: for general-purpose segmentation, the ground truth and segmentation accuracy may not be well defined, while if the evaluation is embedded in a specific application, the results may not be extensible to other applications. In this paper we compare the performance of two popular region-based image segmentation methods, the watershed method and the mean-shift method. The watershed method, also called the watershed transform, is an image segmentation approach based on mathematical morphology. The mean-shift method is a data-clustering method that searches for the local density maxima and then groups all the data into the clusters defined by these maxima.
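For reference, the mean-shift iteration behind the clustering just described repeatedly moves a point toward a kernel-weighted average of its neighbours (the standard formulation, given here as background):

\[
m(x) = \frac{\sum_{i} K\!\left(\frac{x_i - x}{h}\right) x_i}{\sum_{i} K\!\left(\frac{x_i - x}{h}\right)}, \qquad x \leftarrow m(x),
\]

where \(K\) is a kernel (e.g. Gaussian) with bandwidth \(h\); iterating until convergence sends each data point to a local density maximum (mode), and points converging to the same mode form one region.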
Full Text

Title:
MULTI-AREA SECURITY CONSTRAINED ECONOMIC DISPATCH BY FUZZY-STOCHASTIC ALGORITHMS
Author:
Prasanna T.S., Somasundaram P.
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
This paper presents two new, computationally efficient fuzzy-stochastic algorithms for solving the Security Constrained Economic Dispatch (ED) problem in an interconnected power system. The proposed algorithms are based on a fuzzy logic strategy incorporated into both Evolutionary Programming (EP) and Tabu Search (TS). The main objective of Multi-Area Economic Dispatch (MAED) is to determine the generation allocation of each committed unit in the system and the power exchange between areas so as to minimize the total generation cost without violating the tie-line security constraints. The proposed methods are tested on an IEEE 30-bus interconnected three-area system. The investigation reveals that the proposed methods provide accurate solutions with fast convergence and have the potential to be applied to other power engineering problems.
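For context, the MAED objective sketched above is commonly formulated as follows (a generic statement of the problem, not necessarily the paper's exact notation):

\[
\min \; \sum_{i=1}^{N_A} \sum_{j \in G_i} F_{ij}(P_{ij})
\quad \text{subject to} \quad
\sum_{j \in G_i} P_{ij} + \sum_{k \ne i} T_{ki} = D_i + \sum_{k \ne i} T_{ik}, \qquad
P_{ij}^{\min} \le P_{ij} \le P_{ij}^{\max}, \qquad
|T_{ik}| \le T_{ik}^{\max},
\]

where \(F_{ij}\) is the cost of unit \(j\) in area \(i\), \(P_{ij}\) its output, \(D_i\) the area demand and \(T_{ik}\) the tie-line flow from area \(i\) to area \(k\); the last constraint is the tie-line security limit mentioned in the abstract.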
Full Text

Title:
THE ROLE OF KALMAN FILTER IN THE MODELLING OF GPS ERRORS
Author:
B.L. Malleswari, I.V. MuraliKrishna, K. Lalkishore, M. Seetha, Nagaratna P. Hegde
Source:
Journal of Theoretical and Applied Information Technology, Vol. 5 No. 1, 2009
Abstract
This paper describes the modelling of the errors (such as ionospheric delays, atmospheric and tropospheric delays, multipath effects and dilution of precision) that affect GPS signals as they travel from satellite to a user on Earth; these errors degrade the accuracy of the GPS position. An attempt is made to improve the accuracy of locating the GPS receiver by filtering the range measurements, with datum conversion between Universal Transverse Mercator (UTM) and the World Geodetic System (WGS-84), using a single-frequency ML-250 hand-held GPS receiver, and by smoothing these coordinates with a Kalman filter. The Kalman filter, a linear recursive filtering technique, is used to estimate the user position with greater accuracy by considering the initial state of the system, the statistics of the system noise and the measurement errors from sensor noise. The results of the proposed Kalman filter technique give better accuracy with more consistency and are found superior to the standard one.
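A minimal sketch of the recursive predict/update cycle a Kalman filter performs, assuming (for illustration only) a one-dimensional random-walk position model with NumPy; the paper's actual state model, datum conversion and receiver data are in the full text:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated noisy position measurements around a true value of 100.0.
true_pos = 100.0
measurements = true_pos + rng.normal(0.0, 4.0, size=50)

Q = 0.01   # process-noise variance (how much the state may drift per step)
R = 16.0   # measurement-noise variance (sensor noise statistics)

x_est, p_est = 0.0, 1e3          # initial state estimate and its variance
smoothed = []
for z in measurements:
    # Predict: random-walk model, so the state stays put and uncertainty grows.
    x_pred, p_pred = x_est, p_est + Q
    # Update: blend prediction and measurement via the Kalman gain.
    k_gain = p_pred / (p_pred + R)
    x_est = x_pred + k_gain * (z - x_pred)
    p_est = (1.0 - k_gain) * p_pred
    smoothed.append(x_est)

print(f"raw spread: {measurements.std():.2f}, final estimate: {smoothed[-1]:.2f}")
```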
Full Text