Submit Paper / Call for Papers
The journal receives papers in a continuous flow and considers articles
from a wide range of Information Technology disciplines, from the most basic
research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF, or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please include all personally identifiable information in the manuscript when
submitting it for review, and we will remove the necessary information on our
side. Submissions to JATIT should be full research / review papers (properly
indicated below the main title).
Journal of Theoretical and Applied Information Technology
31st May 2009 | Vol. 5 No. 5
Title: Measuring the Effectiveness of Open Coverage based Testing Tools
Author: Ms. L. Shanmuga Priya, Ms. A. Askarunisa, Dr. N. Ramaraj
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
The quality, maintainability, testability, and stability of software can be
improved and measured through the use of automated testing tools throughout the
software development process. Automated testing tools help software engineers
gauge the quality of software by automating the mechanical aspects of the
testing task; they vary in their underlying approach, quality, and ease of use,
among other characteristics. In software testing, software metrics provide
information that supports quantitative managerial decision-making for test
managers. Among the various metrics, the code coverage metric is considered the
most important and is often used in industrial analysis of software projects.
Code coverage analysis aids the testing process by finding the areas of a
program not exercised by a set of test cases, guiding the creation of
additional test cases to increase coverage, and providing a quantitative
measure of the code, which is an indirect measure of quality. The test manager
needs the coverage metric when selecting test cases for regression testing. The
literature offers a large number of automated tools for finding the coverage of
test cases in Java, and choosing an appropriate tool for the application under
test can be a complicated process for the test manager. To ease the test
manager's job of selecting an appropriate tool, we propose a suite of objective
metrics for measuring tool characteristics as an aid in systematically
evaluating and selecting automated testing tools.
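Statement coverage, the simplest form of the code-coverage metric discussed above, can be sketched in a few lines of Python using the standard `sys.settrace` hook. The `classify` function and its test inputs are illustrative only, not taken from the paper:

```python
import sys

def trace_lines(target, *args):
    """Run `target` under a line tracer; return the set of executed line numbers."""
    executed = set()

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is target.__code__:
            executed.add(frame.f_lineno)
        return tracer

    sys.settrace(tracer)
    try:
        target(*args)
    finally:
        sys.settrace(None)
    return executed

def classify(x):            # toy function under test
    if x > 0:
        return "positive"   # reached only when x > 0
    return "non-positive"   # reached only when x <= 0

# One test case exercises only one branch; coverage analysis reveals the gap.
covered_pos = trace_lines(classify, 5)
covered_all = covered_pos | trace_lines(classify, -5)
coverage_ratio = len(covered_pos) / len(covered_all)
print(f"coverage with one test: {coverage_ratio:.0%}")
```

Real coverage tools for Java (or Python's `coverage.py`) instrument bytecode rather than tracing, but the reported ratio has the same meaning.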
Full Text
Title: DESIGN & SIMULATION OF OPTICAL FIBER BRAGG GRATING PRESSURE SENSOR FOR MINIMUM ATTENUATION CRITERIA
Author: Reema Sharma, Rajesh Rohilla, Mohit Sharma, Dr. T.C. Manjunath
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
This paper presents the design and simulation of an Optical Fiber Bragg Grating
(OFBG) sensor for stress and strain measurement, and demonstrates a methodology
for arriving at the optimal grating pitch for a given interrogating wavelength.
The wavelength chosen for interrogating the Fiber Bragg Grating sensor is taken
from the third window so as to minimize attenuation of the light signal in the
communication link between the sensor and the electronic instrumentation.
Before a grating is actually inscribed in the fiber, simulation tools provide
valuable help in optimizing the design parameters. The simulation results
presented show the effectiveness of the developed method, which can be further
implemented in real time for various industrial applications.
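The relation between interrogation wavelength and grating pitch that such a design works back from is the standard Bragg condition; the numerical value of the effective index below is a typical assumption for silica fiber, not a figure from the paper:

```latex
% Bragg condition relating the reflected wavelength to the grating pitch
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda
% For a third-window interrogation wavelength \lambda_B = 1550\,\mathrm{nm}
% and an assumed effective index n_{\mathrm{eff}} \approx 1.447 (silica fiber):
\Lambda = \frac{\lambda_B}{2\, n_{\mathrm{eff}}}
        = \frac{1550\,\mathrm{nm}}{2 \times 1.447}
        \approx 535.6\,\mathrm{nm}
```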
Full Text
Title: A NOVEL APPROACH FOR AN ACCURATE HUMAN IDENTIFICATION THROUGH IRIS RECOGNITION USING BITPLANE SLICING AND NORMALISATION
Author: Srinivasa Kumar Devireddy, G. Ramaswamy, D. Ravikiran, P. Sireesha
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
Unlike other biometrics such as fingerprints and the face, the distinctiveness
of the iris comes from its randomly distributed features. This accounts for its
high reliability for personal identification and, at the same time, for the
difficulty of effectively representing such details in an image. Iris
recognition draws on computer vision, pattern recognition, and the man-machine
interface. The goal is real-time, high-confidence recognition of a person's
identity through mathematical analysis of the random patterns visible within
the iris of an eye from some distance. Because the iris is a protected internal
organ whose random texture is stable throughout life, it can serve as a kind of
living password that one need not remember but always carries along. Because
the randomness of iris patterns has very high dimensionality, recognition
decisions can be made with confidence levels high enough to support rapid and
reliable exhaustive searches through national-sized databases. Iris recognition
has been shown to be very accurate for human identification. This paper
proposes a technique for iris pattern extraction that uses the least
significant bit-plane: binary morphology is applied to the bit-plane, and the
pupillary boundary of the iris is determined by evaluating the standard
deviation of the image intensity along the vertical and horizontal axes. The
limbic boundary is identified by an adaptive thresholding method. Because the
extraction approach restricts localization to evaluating only bit-planes and
standard deviations, iris pattern extraction does not depend on circular edge
detection. The iris normalization is invariant to translation, rotation, and
scale after mapping into polar coordinates. Experiments show that the proposed
method has an encouraging performance, achieving 98.7% localization and
normalization success while reducing system operation time. The proposed method
comprises bit-plane slicing, standard-deviation windows, adaptive thresholding,
and normalization modules.
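Bit-plane slicing itself is a one-line operation per pixel: plane k of an 8-bit image is obtained by shifting and masking. A minimal sketch with a toy 2x4 "image" (the real input would be the localized eye region):

```python
def bit_plane(pixels, k):
    """Extract bit-plane k (0 = least significant) from an 8-bit grayscale image."""
    return [[(p >> k) & 1 for p in row] for row in pixels]

# Toy 2x4 grayscale image; values are arbitrary illustrative intensities.
image = [[200, 13, 77, 254],
         [  9, 128, 64,  33]]

lsb = bit_plane(image, 0)   # least significant plane, the one used for extraction
msb = bit_plane(image, 7)   # most significant plane, dominated by coarse intensity
print(lsb)
print(msb)
```

The least significant plane looks noise-like, which is exactly why its statistics separate the textured iris region from the smooth pupil.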
Full Text
Title: REAL TIME RADAR IMAGING: AN IMPACT ON NAVIGATION IN WIRELESS COMMUNICATION AND ITS SYSTEMS
Author: PROF. NIRMALENDU BIKAS SINHA, DR. R. BERA, DR. M. MITRA
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
We are proceeding towards a broadband age for both communication and remote
sensing. Recently, the CALM standard was formulated by the ITU with the
objectives of collision avoidance through car-to-car and car-to-roadside
communication, and of safety through sensor-friendly vehicles. Millimetre-wave
radar is mandatory in every car for accurate position location, and much work
can be envisioned around such radar. In radar, a target is characterized by its
radar cross section (RCS) function. This paper presents an overview of recent
progress in radar for image detection and characterization of targets. The
capability to measure the radar cross section of full-scale targets of interest
is an utmost requirement for a country like India, considering the nature of
the threat from its neighboring countries. A full-scale facility helps in
identifying the radar signatures of various operational configurations of
different classes of targets present in our country as well as with our
neighbors, which in turn helps in fine-tuning and judging the survivability and
striking capability of our weapons. The system was tested to find the RCS of
both simple and complex objects and to image them in hardware and software. The
results are very useful for quantifying the target's different parameters.
Finally, the paper addresses current questions regarding the integration of
radar into practical wireless systems and standards.
Full Text
Title: AN EFFICIENT PATTERN MINING ANALYSIS IN HEALTH CARE DATABASE
Author: E. Ramaraj, N. Venkatesan
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
Association rules are discovered by identifying relationships among sets of
items in a transaction database, with two measures quantifying the support and
confidence of each rule. Finding frequent itemsets is computationally the most
expensive step in association rule discovery and has therefore attracted
significant research attention. This paper reviews Apriori-related and Eclat
algorithms, with a detailed discussion of the various data structures involved.
Computations are made on our own surveyed data sets and compared. The analysis
ends with various research issues such as the types of rules and the execution
time and space complexity of the algorithms.
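The frequent-itemset step the abstract refers to can be sketched with a bare-bones Apriori in Python; the grocery transactions are invented for illustration, and this sketch omits Apriori's subset-based candidate pruning:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return all itemsets whose support (fraction of transactions) >= min_support."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    frequent, k_sets = {}, [frozenset([i]) for i in items]
    while k_sets:
        # Count each candidate in one pass over the database.
        counts = {c: sum(1 for t in transactions if c <= t) for c in k_sets}
        survivors = {c: cnt / n for c, cnt in counts.items() if cnt / n >= min_support}
        frequent.update(survivors)
        # Candidate generation: join surviving k-itemsets into (k+1)-itemsets.
        keys = list(survivors)
        k_sets = list({a | b for a, b in combinations(keys, 2)
                       if len(a | b) == len(a) + 1})
    return frequent

db = [frozenset(t) for t in [{"milk", "bread"}, {"milk", "bread", "eggs"},
                             {"bread", "eggs"}, {"milk", "eggs"}]]
freq = apriori(db, min_support=0.5)
```

Eclat computes the same result but intersects per-item transaction-id lists instead of rescanning the database, which is where the data-structure trade-offs the paper discusses come in.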
Full Text
Title: AN EFFECTIVE FUZZY C-MEAN AND TYPE-2 FUZZY LOGIC FOR WEATHER FORECASTING
Author: AHMAD SHAHI, RODZIAH BINTI ATAN, MD. NASIR SULAIMAN
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
Meteorological forecasting is applicable to versatile applications, and
accurate weather prediction saves lives, money, and time in both local and
global areas. Forecasting is still not accurate because of the uncertain
(fuzzy) nature of the data, which stems from several causes: incomplete data,
handwriting errors, device errors, limited precision of measurements, and the
discrete description of connective phenomena that is an inherent part of our
understanding of things. Moreover, in a global area with a large amount of
data, processing all of the data is time-consuming; thus, to improve data
quality and execution time, we need to manage the uncertainty of the data and
extract the desired data. Managing this uncertainty and processing the data
therefore demand intelligent methods with knowledge-based approaches. This
paper reviews challenges in this field and compares the advantages and
drawbacks of existing methods, which are essentially applicable only to local
areas. We then propose a hybrid technique for new research, based on the fuzzy
c-means clustering technique and type-2 fuzzy logic, that is usable in both
local and global areas. Finally, we present our experiments, which show that
the hybrid technique outperforms existing weather prediction methods with a low
error rate.
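The fuzzy c-means component of the proposed hybrid can be sketched for 1-D data in plain Python; the alternating center/membership updates below are the standard FCM iterations, and the "temperature" readings are invented for illustration:

```python
import random

def fcm(data, c=2, m=2.0, iters=50):
    """Fuzzy c-means on 1-D data: returns cluster centers and membership matrix."""
    random.seed(0)
    # Random initial memberships, each point's row normalized to sum to 1.
    u = [[random.random() for _ in range(c)] for _ in data]
    u = [[v / sum(row) for v in row] for row in u]
    for _ in range(iters):
        # Center update: weighted mean with weights u_ik^m.
        centers = [sum(u[i][k] ** m * x for i, x in enumerate(data)) /
                   sum(u[i][k] ** m for i in range(len(data)))
                   for k in range(c)]
        # Membership update: inverse-distance rule.
        for i, x in enumerate(data):
            d = [abs(x - ck) + 1e-12 for ck in centers]   # guard against /0
            u[i] = [1.0 / sum((d[k] / d[j]) ** (2 / (m - 1)) for j in range(c))
                    for k in range(c)]
    return centers, u

temps = [14.1, 14.8, 15.2, 29.5, 30.1, 30.8]   # toy "temperature" readings
centers, u = fcm(temps)
```

The type-2 fuzzy logic stage would then reason over these soft memberships rather than hard cluster assignments.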
Full Text
Title: FAIRNESS OF THE TCP-BASED NEW AIMD CONGESTION CONTROL ALGORITHM
Author: HAYDER NATIQ JASEM, ZURIATI AHMAD ZUKARNAIN, MOHAMED OTHMAN, SHAMALA SUBRAMANIAM
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
Congestion control is one of the fundamental issues in computer networks. In
the Transmission Control Protocol (TCP), the performance of the protocol is
measured in terms of fairness and efficiency; fairness with respect to
round-trip time (RTT) can be measured using the algorithm with two or more
flows. Congestion control is an effort to adapt the performance of a network to
changes in the traffic load without adversely affecting users' perceived
utilities, since the traffic load in a network affects both the performance of
the network and the fairness of the algorithm. AIMD (Additive Increase
Multiplicative Decrease) is an established algorithm in the set of linear
algorithms that exhibits good efficiency as well as good fairness. In this
paper we propose a method for evaluating the fairness of the New AIMD
congestion control algorithm. The evaluation of fairness is done using multiple
flows that start at the same time, and also by considering each flow starting
at a different time.
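The fairness property of AIMD that such an evaluation measures can be seen in a toy two-flow simulation: both flows add the same increment per round and both halve on congestion, so the gap between their windows halves at every congestion event while their sum tracks the link capacity. All parameter values here are illustrative, not from the paper:

```python
def two_flow_aimd(rounds, alpha=1.0, beta=0.5, capacity=40.0, w1=1.0, w2=25.0):
    """Two AIMD flows sharing one link (simplified synchronous-loss model):
    additive increase by alpha per round, multiplicative decrease by beta
    whenever the combined load exceeds capacity."""
    for _ in range(rounds):
        if w1 + w2 > capacity:
            w1, w2 = w1 * beta, w2 * beta   # multiplicative decrease halves the gap
        else:
            w1, w2 = w1 + alpha, w2 + alpha  # additive increase preserves the gap
    return w1, w2

# Starting far apart (1 vs 25), the two windows converge toward equal shares.
w1, w2 = two_flow_aimd(200)
print(w1, w2)
```

This is the classic Chiu-Jain argument for why AIMD, alone among the linear controls, converges to fairness.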
Full Text
Title: NEURO-FUZZY CLASSIFIER FOR CARDIAC ARRYTHMIAS RECOGNITION
Author: R. BENALI, M. A. CHIKH
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
The premature ventricular contraction (PVC) and the premature atrial
contraction (PAC) are cardiac arrhythmias widely encountered in the cardiology
field. They can be detected using parameters of the electrocardiogram signal.
In this work we use a neuro-fuzzy approach to identify these abnormal beats. To
achieve this objective we developed a Neuro-Fuzzy Classifier (NFCL), whose
performance was evaluated by computing the percentages of sensitivity (Se),
specificity (Sp), and correct classification (CC). The classifier also allows
the extraction of rules (a knowledge base) to explain the results obtained. We
use the MIT-BIH medical database to validate our results.
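The three evaluation figures named in the abstract are simple confusion-matrix ratios. A minimal sketch with invented beat labels ("PVC" vs. normal "N"), not data from the paper:

```python
def classifier_metrics(y_true, y_pred, positive="PVC"):
    """Sensitivity, specificity, and correct-classification rate from labels."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    se = tp / (tp + fn)              # sensitivity: abnormal beats detected
    sp = tn / (tn + fp)              # specificity: normal beats correctly passed
    cc = (tp + tn) / len(y_true)     # overall correct classification
    return se, sp, cc

truth = ["PVC", "PVC", "N", "N", "N", "PVC", "N", "N"]
pred  = ["PVC", "N",   "N", "N", "PVC", "PVC", "N", "N"]
se, sp, cc = classifier_metrics(truth, pred)
```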
Full Text
Title: ULTRA LOW POWER DIGITAL LOGIC CIRCUITS IN SUB-THRESHOLD FOR BIOMEDICAL APPLICATIONS
Author: K. Ragini, DR. B.K. Madhavi
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
Motivated by emerging battery-operated applications that demand intensive
computation in portable environments, we investigate techniques that reduce
power consumption in CMOS digital circuits by operating the devices at low
currents and low voltages [1-4]. MOS devices and circuits, especially CMOS
circuits, are known to consume relatively little power [5, 6], but this power
must be reduced further to prolong battery life. One solution for achieving
such ultra-low-power operation is to work in the sub-threshold region [7]. Over
the last ten years, digital sub-threshold logic circuits have been developed
for applications in the ultra-low-power design domain, where performance is not
the priority. In sub-threshold logic, the power supply voltage is below the
threshold voltage of the transistors. In this paper, both CMOS and pseudo-NMOS
logic families operating in the sub-threshold region are analyzed, and the
results are compared with CMOS operating in the normal strong-inversion region.
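The physics behind sub-threshold operation is the exponential dependence of drain current on gate voltage in weak inversion; one common textbook form (a sketch, with symbols as usually defined, not equations quoted from the paper) is:

```latex
% Weak-inversion (sub-threshold) drain current, exponential in V_GS below V_th:
I_D = I_0 \, e^{\frac{V_{GS} - V_{th}}{n V_T}}
      \left( 1 - e^{-\frac{V_{DS}}{V_T}} \right),
\qquad V_T = \frac{kT}{q} \approx 26\,\mathrm{mV} \ \text{at}\ 300\,\mathrm{K}
% Dynamic power scales quadratically with the supply,
% which is why operating below V_th saves so much energy:
P_{\mathrm{dyn}} = \alpha \, C \, V_{DD}^{2} \, f
```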
Full Text
Title: PERFORMANCE ENHANCEMENT OF STBC OFDM-CDMA SYSTEM USING CHANNEL CODING TECHNIQUES OVER MULTIPATH FADING CHANNEL
Author: Nelly Muhammad Hussein, Adel El-Sherif, Abd El-Wahhab Fayez
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
In this paper, a mitigation method for minimizing the distortion caused by
frequency-selective fast fading is presented. The proposed system combines two
types of diversity schemes: frequency diversity, represented by orthogonal
frequency division multiplexing (OFDM), and antenna diversity, represented by
space-time block coding (STBC); together these mitigate the distortion caused
by frequency-selective fading. The multiplexing technique proposed here is
code division multiple access (CDMA), which is considered the solution for
eliminating the distortion caused by fast fading. The main problem of the OFDM
system is its high peak-to-average power ratio (PAR); in the proposed system,
three PAR-reduction techniques are applied together. In order to reduce the bit
error rate caused by intersymbol interference (ISI) or multiple access
interference (MAI), channel coding is applied, which enables error detection
and correction at the receiver. Simulation results show that with any of the
channel coding techniques, the bit error rate (BER) reaches a satisfying level
at a low signal-to-noise ratio (SNR).
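The peak-to-average power ratio problem the abstract mentions is easy to demonstrate: IDFT the frequency-domain subcarrier symbols and compare peak with mean power. A stdlib-only sketch (a real transmitter would use an IFFT, and the all-ones symbol vector is just the textbook worst case):

```python
import cmath
import math

def papr_db(symbols):
    """Peak-to-average power ratio of one OFDM symbol, in dB."""
    n = len(symbols)
    # Naive inverse DFT over the subcarrier symbols.
    x = [sum(s * cmath.exp(2j * cmath.pi * k * t / n)
             for k, s in enumerate(symbols)) / n
         for t in range(n)]
    powers = [abs(v) ** 2 for v in x]
    return 10 * math.log10(max(powers) / (sum(powers) / n))

# Worst case: all subcarriers aligned in phase gives PAPR = 10*log10(N).
worst = papr_db([1] * 16)
print(f"{worst:.2f} dB")
```

PAR-reduction techniques (clipping, coding, phase rotation) all aim to keep this number low enough for the power amplifier's linear range.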
Full Text
Title: APPLICATION OF TABU SEARCH ALGORITHM TO SECURITY CONSTRAINED ECONOMIC DISPATCH
Author: N.B. Muthuselvan and P. Somasundaram
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
This paper presents an algorithm for solving the Security Constrained Economic
Dispatch (SCED) problem through the application of Tabu Search (TS). The SCED
problem is formulated with base-case and contingency-case line flow
constraints, which are important for practical implementation. Two
representative systems, the 66-bus [14] and 191-bus [15] Indian utility
systems, are taken for investigation. The SCED results obtained using TS are
compared with those obtained using a Genetic Algorithm (GA) and Evolutionary
Programming (EP). The investigations reveal that the proposed TS algorithm is
relatively simple, reliable, efficient, and suitable for practical
applications.
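The core tabu search loop is short: move to the best non-tabu neighbor each step, keeping recently visited solutions on a bounded tabu list so the search can escape local optima. The toy two-generator dispatch below, with assumed quadratic cost coefficients and no line-flow constraints, is only an illustration of the mechanism, not the paper's SCED formulation:

```python
def tabu_search(cost, start, neighbors, iters=100, tenure=5):
    """Generic tabu search: best non-tabu neighbor each step, bounded tabu list."""
    current = best = start
    tabu = [start]
    for _ in range(iters):
        cands = [s for s in neighbors(current) if s not in tabu]
        if not cands:
            break
        current = min(cands, key=cost)   # may be worse than `current`: that is the point
        if cost(current) < cost(best):
            best = current
        tabu.append(current)
        tabu = tabu[-tenure:]            # bounded tabu tenure
    return best

# Toy dispatch: meet demand p1 + p2 = 10 at minimum (assumed) quadratic cost.
def cost(p):
    p1, p2 = p
    return 0.5 * p1 ** 2 + 0.2 * p2 ** 2

def neighbors(p):
    p1, p2 = p
    # Shift one unit of load between the generators, staying non-negative.
    return [q for q in [(p1 + 1, p2 - 1), (p1 - 1, p2 + 1)] if min(q) >= 0]

best = tabu_search(cost, start=(0, 10), neighbors=neighbors)
print(best)
```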
Full Text
Title: IDENTIFICATION AND DELINEATION OF QRS COMPLEXES IN ELECTROCARDIOGRAM USING FUZZY C-MEANS ALGORITHM
Author: S.S. MEHTA, C.R. TRIVEDI, N.S. LINGAYAT
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
Over the past few years, there has been an increased trend toward processing
the electrocardiogram (ECG) using microcomputers. Microcomputer-based systems
can perform the needed medical services in an extremely efficient manner; in
fact, many systems have already been implemented to perform signal processing
tasks such as 12-lead ECG analysis. All these applications require accurate
detection of the QRS complex of the ECG, making QRS complex detection an
important part of many ECG signal processing systems. This paper presents the
application of the Fuzzy C-Means (FCM) algorithm for the detection of QRS
complexes in ECG signals. The performance of the algorithm is validated using
original 12-lead ECG recordings from the standard ECG database. A significant
detection rate is achieved, and the onsets and offsets of the QRS complexes
are found to be within the tolerance limits given by the CSE library.
Full Text
Title: STUDY OF SPATIAL DIVERSITY SCHEMES IN MULTIPLE ANTENNA SYSTEMS
Author: R. DEEPA, Dr. K. BASKARAN, PRATHEEK UNNIKRISHNAN, ASWATH KUMAR
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
One of the most common problems faced by designers of wireless communication
systems is spatio-temporal variation in the wireless channel, which arises
mainly from the phenomenon of multipath fading, inevitable in scattering
environments that are subject to change over time. A wide variety of pre- and
post-processing techniques have been used to mitigate the degrading effects of
such channels, but with limited improvements in performance. This paper
evaluates the concept of Spatial Diversity (SD) applied to multi-antenna
systems, which demands modifications at the physical level: the use of
multiple antennas at the transmitter and/or the receiver. The performance of
SISO, SIMO, MISO, and MIMO systems is evaluated and compared in AWGN and
fading channels.
Full Text
Title: EFFICIENT RESOURCE ALLOCATION IN MC-CDMA CELLULAR WIRELESS NETWORKS TO SUPPORT MULTIMEDIA TRAFFIC
Author: DHANANJAY KUMAR, C. CHELLAPPAN
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
The Multi-Carrier Code Division Multiple Access (MC-CDMA) system with
time-division duplex mode, adopting unbalanced slot allocation between uplink
and downlink, is a good solution for asymmetric multimedia traffic. A
centralized common slot allocation to minimize inter-cell interference is not
feasible, as neighboring cells may carry different traffic loads. In this
paper, we investigate and discuss the effect of an asymmetric slot management
strategy employing adaptive resource allocation in an MC-CDMA system in which
each cell has its own slot allocation policy according to its level of traffic
load. The Bit Error Rate (BER) versus Signal-to-Noise Ratio (SNR) of MC-CDMA
is simulated separately for four cases of uplink and downlink between cells,
in the presence of Additive White Gaussian Noise (AWGN) and a Rayleigh channel.
The simulation results show that the distance ratio and the orthogonality
factor are equally important parameters in BER performance, and that an SNR of
around 9 dB is enough to bring the BER below 10^-3.
Full Text
Title: AGENT BASED RESOURCE BROKERING AND ALLOCATION IN WIRELESS GRIDS: REVISITED
Author: MINA SEDAGHAT, MOHAMED OTHMAN
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
Existing internet frameworks such as grids, wireless grids, and P2P networks
now provide effective channels for gathering and processing widespread
information using the available resources at reasonable cost. Agent technology
has emerged as a suitable solution to combine with these frameworks to address
existing challenges such as discovery, load balancing, and resource
management. Although different models have been introduced, practical
solutions remain elusive and tend to exhibit underlying conflicts between
different paradigms. This paper revisits and discusses the characteristics of
one of these agent-based system architectures. The base model is introduced,
our re-experiment is described, and the results are compared, so that the
challenges of this agent-based model are highlighted and new ideas can be
applied.
Full Text
Title: NON-DOMINATED RANKED GENETIC ALGORITHM FOR SOLVING CONSTRAINED MULTI-OBJECTIVE OPTIMIZATION PROBLEMS
Author: OMAR AL JADAAN, LAKSHMI RAJAMANI, C. R. RAO
Source: Journal of Theoretical and Applied Information Technology, Vol. 5 No. 5, 2009
Abstract:
Evolutionary algorithms are becoming increasingly valuable for solving
large-scale, realistic engineering multiobjective optimization problems, which
typically require consideration of conflicting and competing design issues. A
criticism of evolutionary algorithms is the lack of efficient and robust
generic methods for handling constraints. The most widespread approach for
constrained search problems is to use penalty methods, because of their
simplicity and ease of implementation: a penalty function is generic and
applicable to any type of constraint, linear or nonlinear. Nonetheless, the
most difficult aspect of the penalty-function approach is finding appropriate
penalty parameters. In this paper, a method combining the new Non-dominated
Ranked Genetic Algorithm (NRGA) with a parameterless penalty approach is used
to steer the search toward the Pareto-optimal set of solutions and alleviate
the above difficulties. The parameterless penalty approach requires no penalty
parameter: fitness assignment among feasible and infeasible solutions is made
so as to provide a search direction toward the feasible region. The resulting
Parameterless Penalty Non-dominated Ranked Genetic Algorithm (PP-NRGA)
consistently finds better Pareto-optimal sets of solutions. The new algorithm
has been evaluated by solving five test problems reported in the
multi-objective evolutionary algorithm (MOEA) literature. Performance
comparisons based on quantitative metrics for accuracy, coverage, and spread
are presented.
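One common parameterless penalty scheme of the kind the abstract describes (in the style of Deb's constraint handling, sketched on a single-objective toy problem rather than the paper's multi-objective setting) keeps feasible solutions at their objective value and ranks every infeasible solution behind the worst feasible one by its total constraint violation, so no penalty coefficient is ever tuned:

```python
def penalized_fitness(objective, violations, population):
    """Parameterless penalty: feasible solutions keep their objective value;
    infeasible ones score worst-feasible-objective + total violation."""
    feas = [x for x in population if violations(x) == 0]
    worst = max((objective(x) for x in feas), default=0.0)

    def fitness(x):   # smaller is better (minimization)
        v = violations(x)
        return objective(x) if v == 0 else worst + v

    return fitness

# Toy problem: minimize f(x) = x^2 subject to x >= 1.
obj = lambda x: x * x
viol = lambda x: max(0.0, 1.0 - x)          # amount of constraint violation
pop = [-0.5, 0.2, 1.0, 2.0, 3.0]
fit = penalized_fitness(obj, viol, pop)
best = min(pop, key=fit)
```

Every infeasible candidate automatically loses to every feasible one, and the violation term gives the search a gradient toward the feasible region, exactly the search direction the abstract refers to.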
Full Text