Submit Paper / Call for Papers
The journal receives papers in a continuous flow and considers articles from a
wide range of Information Technology disciplines, from the most basic research
to the most innovative technologies. Please submit your papers electronically
through our submission system at http://jatit.org/submit_paper.php in MS Word,
PDF or a compatible format so that they may be evaluated for publication in the
upcoming issue. This journal uses a blinded review process; please remember to
include all your personally identifiable information in the manuscript before
submitting it for review, and we will edit out the necessary information on our
side. Submissions to JATIT should be full research / review papers (properly
indicated below the main title).
Journal of Theoretical and Applied Information Technology
May 2012 | Vol. 39 No. 2
Title:
DOUBLE COMPRESSION OF TEST DATA USING HUFFMAN CODE

Author:
R.S.AARTHI, D. MURALIDHARAN, P. SWAMINATHAN

Source:
Journal of Theoretical and Applied Information Technology
pp. 104-113, Vol. 39 No. 2, 2012

Abstract
Increasing design complexity and advances in fabrication technology result in a
high test data volume. As the test set grows, the required memory capacity grows
with it, which becomes the major difficulty in testing a System-on-Chip (SoC).
Several compression techniques have been proposed to reduce the test data volume,
and code-based schemes are one such family. Run length coding is one of the most
popular methodologies in code-based compression: run length codes such as the
Golomb code, the Frequency Directed Run length code (FDR), Extended FDR, Modified
FDR, Shifted Alternate FDR and OLEL coding compress the test data and improve the
compression ratio considerably. For further reduction of the test data, a double
compression technique using Huffman code is proposed. The compression ratio
achieved by the double compression technique is presented and compared with the
compression ratios obtained by the other run length codes.
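The double compression pipeline re-encodes an already run-length-coded test stream
with a Huffman code. The sketch below is a minimal, generic Huffman encoder in
Python and only illustrates this second stage under assumed inputs: the
`rle_output` symbols are hypothetical, and the run length front end (Golomb, FDR,
OLEL, etc.) is not shown, so this is not the authors' implementation.

```python
import heapq
from collections import Counter

def build_huffman_code(symbols):
    """Build a prefix code (symbol -> bit string) from symbol frequencies."""
    freq = Counter(symbols)
    # Heap entries: [total weight, unique tie-breaker, [(symbol, code-so-far), ...]]
    heap = [[w, i, [(s, "")]] for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                       # degenerate single-symbol alphabet
        return {heap[0][2][0][0]: "0"}
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        lo[2] = [(s, "0" + c) for s, c in lo[2]]   # left branch gets a leading 0
        hi[2] = [(s, "1" + c) for s, c in hi[2]]   # right branch gets a leading 1
        heapq.heappush(heap, [lo[0] + hi[0], lo[1], lo[2] + hi[2]])
    return dict(heap[0][2])

def huffman_encode(symbols):
    table = build_huffman_code(symbols)
    return "".join(table[s] for s in symbols), table

# Hypothetical run-length-coded output, re-encoded with Huffman (double compression).
rle_output = ["0", "01", "001", "0", "0", "01", "0001", "0", "01", "0"]
bits, table = huffman_encode(rle_output)
print(table, len(bits))
```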
Keywords
Test Data Compression, Run Length Codes, Golomb Code, OLEL Code, Huffman Code

Full Text

Title:
CLOUD COMPUTING SOLUTION - BENEFITS AND TESTING CHALLENGES

Author:
PRAKASH.V, GOPALAKRISHANAN.S

Source:
Journal of Theoretical and Applied Information Technology
pp. 114-118, Vol. 39 No. 2, 2012

Abstract
Modern-day software companies need a fast, secure and scalable IT infrastructure
to keep up with the ever growing needs of their business. The challenge lies in
building such a setup on their own premises: they have to spend a huge amount of
money on the growing IT infrastructure, on personnel and on the expertise to
administer it, and as a result the focus shifts from their core business towards
handling this burden. Cloud computing is a solution that helps organizations
focus on their core business rather than worrying about the investment in and
maintenance of their IT infrastructure. This paper focuses on the challenges
companies face in moving to a cloud environment with respect to security,
reliability and manageability, which organizations can address only through
rigorous testing. The paper first explains the benefits of cloud computing and
then turns to the testing challenges faced by testers.
Keywords
Testing, Cloud Testing, Cloud Testing Challenges

Full Text

Title:
A PARALLEL APPROACH FOR IMPROVING DATA SECURITY

Author:
KARTHIKEYAN.S, SAIRAM.N, MANIKANDAN.G, SIVAGURU.J

Source:
Journal of Theoretical and Applied Information Technology
pp. 119-125, Vol. 39 No. 2, 2012

Abstract
In today's world the internet is used by almost everyone. Numerous file exchanges
take place online, including many official documents, and these files require
some security mechanism while being transmitted over the internet. Cryptography
is one means by which a person can encrypt data and send it through the internet,
so that the data remains safe and unreadable to intruders. In this paper we
propose a system that combines the advantages of parallel processing and
cryptographic algorithms. The use of parallel processing improves the speed of
the system compared to traditional cryptosystems. In our approach we divide a
file into two slices, apply a single algorithm with a different key to each
slice, and process the slices in a parallel environment. Our experiments show
that the execution time of a cryptographic algorithm is considerably reduced in a
parallel environment compared to the generic sequential methods.
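As a rough illustration of such a scheme (the abstract does not name the cipher),
the sketch below splits a byte string into two slices, assigns each slice its own
symmetric key and encrypts the slices in parallel worker processes; Fernet from
the third-party `cryptography` package is used only as a stand-in symmetric
algorithm.

```python
from concurrent.futures import ProcessPoolExecutor
from cryptography.fernet import Fernet

def encrypt_slice(args):
    key, data = args
    return Fernet(key).encrypt(data)                 # symmetric encryption of one slice

def parallel_encrypt(data: bytes):
    mid = len(data) // 2
    slices = [data[:mid], data[mid:]]                # the file is cut into two slices
    keys = [Fernet.generate_key() for _ in slices]   # a different key for each slice
    with ProcessPoolExecutor(max_workers=2) as pool: # both slices encrypted in parallel
        ciphertexts = list(pool.map(encrypt_slice, zip(keys, slices)))
    return keys, ciphertexts

if __name__ == "__main__":
    message = b"example file contents " * 1000
    keys, cts = parallel_encrypt(message)
    # Round-trip check: decrypt each slice with its own key and reassemble.
    assert b"".join(Fernet(k).decrypt(c) for k, c in zip(keys, cts)) == message
```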
Keywords
Cryptography, Parallel Processing, File Security, Symmetric-Key.

Full Text

Title:
SURVEY ON VARIOUS UI DESIGN APPROACHES

Author:
SATHYA S, SIVARANJANI A, SANTHI B

Source:
Journal of Theoretical and Applied Information Technology
pp. 126-131, Vol. 39 No. 2, 2012

Abstract
The User Interface (UI) is the prime factor that decides the usability of any
product. Developers must therefore direct their effort towards building UIs that
function effortlessly. Research in UI design has attracted strong interest owing
to the importance of interactive applications and the need to enhance the user
experience. Numerous approaches exist for designing effective UIs, each with its
own special design aspects. In this paper an attempt is made to review the
research studies on the various design approaches in the field of UI design.
Keywords
User Interface (UI), User Experience, UI Design, Design Approaches

Full Text

Title:
A MODIFIED HYBRID PARTICLE SWARM OPTIMIZATION ALGORITHM FOR SOLVING THE
TRAVELING SALESMEN PROBLEM

Author:
SAID LABED, AMIRA GHERBOUDJ, SALIM CHIKHI

Source:
Journal of Theoretical and Applied Information Technology
pp. 132-138, Vol. 39 No. 2, 2012

Abstract
The traveling salesman problem (TSP) is a well-known NP-hard combinatorial
optimization problem. The problem is easy to state but hard to solve. Many
real-world problems can be formulated as instances of the TSP, for example
computer wiring, vehicle routing, crystallography, robot control, drilling of
printed circuit boards and chronological sequencing. In this paper we present a
modified hybrid Particle Swarm Optimization (MHPSO) algorithm in which we combine
principles of Particle Swarm Optimization (PSO), the crossover operation of the
genetic algorithm and the 2-opt improvement heuristic. The main feature of our
approach is that it avoids a major problem of metaheuristics: parameter setting.
To demonstrate the performance and convergence of the proposed algorithm, we have
used it to solve TSP instances taken from the TSPLIB library, and we have
compared our results with those obtained by other PSO-based algorithms.
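A minimal sketch of this kind of hybrid, with assumed operators: particles are
tours, movement towards the personal and global bests is imitated with order
crossover, and every new tour is polished with a 2-opt pass. The operator
choices, swarm size and iteration count are illustrative, not the authors'
settings.

```python
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def order_crossover(p1, p2):
    """OX crossover: keep a random slice of p1, fill the rest in p2's order."""
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    fill = iter(c for c in p2 if c not in p1[a:b])
    return [c if c is not None else next(fill) for c in child]

def two_opt(tour, dist):
    """Repeated first-improvement 2-opt: reverse a segment if it shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cand, dist) < tour_length(tour, dist):
                    tour, improved = cand, True
    return tour

def mhpso_tsp(dist, n_particles=10, iters=30):
    n = len(dist)
    swarm = [random.sample(range(n), n) for _ in range(n_particles)]
    pbest = list(swarm)
    gbest = min(swarm, key=lambda t: tour_length(t, dist))
    for _ in range(iters):
        for k in range(n_particles):
            tour = order_crossover(swarm[k], pbest[k])   # pull towards personal best
            tour = order_crossover(tour, gbest)          # pull towards global best
            tour = two_opt(tour, dist)                   # local improvement
            swarm[k] = tour
            if tour_length(tour, dist) < tour_length(pbest[k], dist):
                pbest[k] = tour
            if tour_length(tour, dist) < tour_length(gbest, dist):
                gbest = tour
    return gbest, tour_length(gbest, dist)

# Hypothetical 5-city symmetric distance matrix
D = [[0, 2, 9, 10, 7], [2, 0, 6, 4, 3], [9, 6, 0, 8, 5],
     [10, 4, 8, 0, 6], [7, 3, 5, 6, 0]]
print(mhpso_tsp(D))
```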
Keywords
Traveling Salesman Problem, Particle Swarm Optimization, Optimization, Meta-heuristics

Full Text

Title:
FPGA BASED 3D MOTION SENSOR

Author:
Y.NANDHINI, Asst. Prof. D. MURALIDHARAN

Source:
Journal of Theoretical and Applied Information Technology
pp. 139-144, Vol. 39 No. 2, 2012

Abstract
The main objective of this paper is to explain the functioning of a device
designed to sense the real-time motion changes of an object. This motion sensing
device, a digital accelerometer, can be used to sense any change in the position
of an object with respect to its original position along all three axes. The
digital accelerometer is operated under the control of a Field Programmable Gate
Array (FPGA). Connecting the motion sensing device to the FPGA also helps reduce
complexity factors such as area consumption and cost when installing the objects,
while keeping the device easy to introduce. The paper mainly deals with the
experimental procedure for detecting such changes using the digital accelerometer.
Keywords
Digital Accelerometer, Field Programmable Gate Array, MicroBlaze Processor, Spartan6

Full Text

Title:
PASSWORD BASED TWO SERVER AUTHENTICATION SYSTEM

Author:
VIGNESH KUMAR K, ANGULAKSHMI T, MANIVANNAN D, SEETHALAKSHMI R, SWAMINATHAN P

Source:
Journal of Theoretical and Applied Information Technology
pp. 145-149, Vol. 39 No. 2, 2012

Abstract
User authentication in computer systems is an important cornerstone of today's
computing era. The combination of a user id and password is one of the easiest
ways to authenticate; it is not only simple but also cost effective and highly
efficient. Yet password cracking and hacking can now be seen everywhere. At
present a single server is typically used for this sort of password-based
authentication: traditional protocols assume a single server that stores all the
information (e.g., the password) necessary to authenticate a user. When an
attacker obtains the information stored on that server, he can recover all the
stored passwords by launching an off-line dictionary attack. To address this
issue, a number of schemes have been proposed in which a user's password
information is shared among multiple servers, and these servers cooperate in a
threshold manner whenever the user wants to authenticate. In this paper a new,
efficient two-server password-only authentication scheme is proposed. In
addition, the system is secure against offline dictionary attacks mounted by
either of the two servers.
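A toy sketch of the threshold idea behind such multi-server schemes (not the
protocol proposed in the paper): the password verifier is split into two XOR
shares, one per server, so neither share alone gives an attacker anything to run
an offline dictionary attack against. In a real two-server protocol the shares
are never recombined on one machine; verification is carried out through a joint
computation between the servers.

```python
import hashlib
import hmac
import secrets

def make_shares(password: str, salt: bytes):
    """Split a slow password hash (the verifier) into two random-looking XOR shares."""
    verifier = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    share_a = secrets.token_bytes(len(verifier))                # stored on server A
    share_b = bytes(x ^ y for x, y in zip(verifier, share_a))   # stored on server B
    return share_a, share_b

def verify(password: str, salt: bytes, share_a: bytes, share_b: bytes) -> bool:
    """Both shares are required to rebuild the verifier and check a login attempt."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    verifier = bytes(x ^ y for x, y in zip(share_a, share_b))
    return hmac.compare_digest(candidate, verifier)

salt = secrets.token_bytes(16)
a, b = make_shares("correct horse battery staple", salt)
assert verify("correct horse battery staple", salt, a, b)
assert not verify("wrong guess", salt, a, b)
```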
Keywords
Password Authentication, Two Server Concept, Offline and Online Dictionary Attack

Full Text

Title:
THE USAGE OF CAPABILITY MATURITY MODEL INTEGRATION AND WEB ENGINEERING PRACTICES
IN LARGE WEB DEVELOPMENT ENTERPRISES: AN EMPIRICAL STUDY IN JORDAN

Author:
OMAIMA N. A. AL-ALLAF

Source:
Journal of Theoretical and Applied Information Technology
pp. 150-166, Vol. 39 No. 2, 2012

Abstract
There is a lack of surveys of large web development enterprises that determine
the level of adoption of web engineering practices and Capability Maturity Model
Integration (CMMI) in these enterprises. Such a survey is needed to help large
enterprises improve their web development processes and overcome the problems
that arise during development. In this research we focus on answering four
questions: What are the characteristics of the developers working in large
enterprises? What are the properties of the web development processes adopted by
large enterprises? What problems do large enterprises face during web
development? And finally, what are the levels of usage of CMMI and web
engineering practices in these enterprises? A questionnaire-based survey was
conducted in large enterprises in Jordan to answer these questions. According to
the survey results, two web engineering practices, tools and technology, and
standards and procedures, are partially adopted, whereas organizational issues,
web metrics and control of the development process are barely used by these
enterprises. We also noticed that the majority of the respondents have not
previously participated in CMMI activities. Finally, recommendations are provided
to improve web development processes and overcome the identified problems in
these enterprises.
Keywords
Large web applications, web engineering, web engineering practices, Capability
Maturity Model Integration (CMMI), software process improvement (SPI), Quality
Assurance (QA).

Full Text

Title:
SOFTWARE AGENTS PARADIGM IN AUTOMATED DATA MINING FOR BETTER VISUALIZATION USING
INTELLIGENT AGENTS

Author:
R. JAYABRABU, Dr. V. SARAVANAN, Prof. K. VIVEKANANDAN

Source:
Journal of Theoretical and Applied Information Technology
pp. 167-177, Vol. 39 No. 2, 2012

Abstract
Data mining techniques play a vital role in extracting the required knowledge and
finding useful information for strategic decision making in a form that is
understandable to domain experts. Users range from expert users to novice and
connoisseur users: an expert user can complete the mining process using his or
her own knowledge, whereas novice and connoisseur users cannot, because of their
limited domain knowledge, and the number of such less experienced users keeps
growing. To support them, agents are implemented to perform the specified tasks
on behalf of users with limited domain knowledge, including the selection of
appropriate mining techniques and algorithms and the decision making needed to
produce better results. In this paper, multiple agents, each performing a
distinct task and communicating and collaborating with the other agents, form an
automated process that provides better visualization and cluster detection for
such users.
Keywords
Data Mining Techniques, Multi-Agent Systems, Agents Role, Visualization

Full Text

Title:
INTELLIGENT FAULT DETECTING SYSTEM IN AN OPTICAL FIBRE

Author:
G.BASKARAN, R.SEETHALAKSHMI

Source:
Journal of Theoretical and Applied Information Technology
pp. 178-187, Vol. 39 No. 2, 2012

Abstract
The intelligent fault detecting system described here is used to find faults in
an optical fibre line. The aim is to design a fault monitoring module that
locates faults in the line, including those on the customer side. The idea behind
this module is to monitor the received optical power in the fibre using a
microcontroller. A laser output power monitoring circuit is designed using the
ISIS simulator to monitor the received power in the optical fibre. If there is
any abrupt change in the power on the optical line, an automatic message about
the fault in the fibre is transmitted to the monitoring person via GSM. The
microcontroller can be operated in a low power (sleep) mode to reduce power
consumption.
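The monitoring logic reduces to a threshold loop. The Python below only sketches
that loop in place of the actual microcontroller firmware; `read_optical_power_dbm`,
`send_gsm_sms`, the threshold and the sampling period are hypothetical
placeholders, not the authors' design values.

```python
import random
import time

POWER_THRESHOLD_DBM = -25.0   # assumed alarm threshold for the received optical power
SAMPLE_PERIOD_S = 5.0         # the MCU sleeps between samples to save power

def read_optical_power_dbm() -> float:
    """Placeholder for the photodiode / power-monitor ADC reading."""
    return random.uniform(-40.0, -10.0)      # simulated value for this sketch

def send_gsm_sms(number: str, text: str) -> None:
    """Placeholder for the GSM modem command that sends the alert SMS."""
    print(f"SMS to {number}: {text}")

def monitor(alert_number: str, samples: int = 3) -> None:
    for _ in range(samples):
        power = read_optical_power_dbm()
        if power < POWER_THRESHOLD_DBM:      # abrupt power drop: suspected fibre fault
            send_gsm_sms(alert_number, f"Fibre fault suspected ({power:.1f} dBm)")
        time.sleep(SAMPLE_PERIOD_S)

monitor("+0000000000")
```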
Keywords
Optical fibre, Microcontroller, Global System for Mobile Communication (GSM).

Full Text

Title:
SIMULATION VOICE RECOGNITION SYSTEM FOR CONTROLING ROBOTIC APPLICATIONS

Author:
WAHYU KUSUMA R., PRINCE BRAVE GUHYAPATI V

Source:
Journal of Theoretical and Applied Information Technology
pp. 188-196, Vol. 39 No. 2, 2012

Abstract
Voice recognition is a system that converts spoken words in well-known languages
into written language or translates them into commands for machines, depending on
the purpose. The input to the system is voice: the system identifies the spoken
word(s), and the result of the process is written text on the screen or a
movement of the machine's mechanical parts. This research focuses on analysing
the matching process used to give commands to a multipurpose machine such as a
robot, using Linear Predictive Coding (LPC) and the Hidden Markov Model (HMM).
LPC is a method for analysing voice signals by capturing their characteristics in
LPC coefficients. HMM, on the other hand, is a form of signal modelling in which
voice signals are analysed to find the maximum probability and recognize the
words of a new input based on the defined codebook. The process can recognize
five basic movements of a robot: "forward", "reverse", "left", "right" and "stop"
in the desired language. The analysis is carried out by designing the recognition
system based on LPC extraction, a codebook model and the HMM training process.
The aim of the system is to find the accuracy of the recognition system in
recognizing commands even when the speaker's voice is not stored in the database.
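As an illustration of the LPC front end, the sketch below computes the LPC
coefficients of one speech frame by the autocorrelation method (a Toeplitz
solve); the frame length, order and windowing are assumptions, and the codebook
and HMM stages are not shown.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def lpc_coefficients(frame: np.ndarray, order: int = 10) -> np.ndarray:
    """Autocorrelation-method LPC: solve the normal equations R a = r for the predictor."""
    frame = frame * np.hamming(len(frame))                        # window the frame
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]  # autocorrelation, lags >= 0
    a = solve_toeplitz((r[:order], r[:order]), r[1:order + 1])    # symmetric Toeplitz solve
    return a                                                      # LPC feature vector

# Hypothetical 20 ms frame at 8 kHz (synthetic noise stands in for speech)
frame = np.random.default_rng(0).standard_normal(160)
print(lpc_coefficients(frame, order=10))
```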
Keywords
Voice Recognition, Robot, LPC, HMM

Full Text

Title:
SELF-ORGANIZED BEHAVIOR BASED MOBILITY MODELS FOR AD HOC NETWORKS

Author:
A. MADANI AND N. MOUSSA

Source:
Journal of Theoretical and Applied Information Technology
pp. 197-203, Vol. 39 No. 2, 2012

Abstract
Among the aspects that require major attention in the simulation of mobile ad hoc
networks (MANETs) is the mobility of the nodes in the network, and several
mobility models have been proposed. In this paper we study the impact of
self-organized behaviour based models on the mobility of nodes in the network
through two models, namely the Flock Mobility model and the Leadership Mobility
model. The results are then compared to those provided by the Random Walk
Mobility model. The analysis points to the relationship between two important and
interdependent concepts: the mobility of nodes in the network and the
communication/interference between nodes.
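For reference, the Random Walk baseline used in the comparison can be sketched as
follows; the area size, speed range and clamping at the borders are assumptions,
and the flock and leadership models would add neighbour-alignment and
leader-following rules on top of such node updates.

```python
import math
import random

def random_walk(n_nodes=20, steps=100, area=100.0, v_min=0.5, v_max=2.0):
    """Each node repeatedly picks a random direction and speed; positions stay in the area."""
    pos = [[random.uniform(0, area), random.uniform(0, area)] for _ in range(n_nodes)]
    trace = []
    for _ in range(steps):
        for p in pos:
            angle = random.uniform(0, 2 * math.pi)
            speed = random.uniform(v_min, v_max)
            p[0] = min(max(p[0] + speed * math.cos(angle), 0.0), area)   # clamp to border
            p[1] = min(max(p[1] + speed * math.sin(angle), 0.0), area)
        trace.append([tuple(p) for p in pos])
    return trace

print(random_walk()[-1][:3])   # final positions of the first three nodes
```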
Keywords
Self-Organized Behavior, Mobility Models, Cellular Automata,
Interference/Communication Between Nodes

Full Text

Title:
NONLINEAR CONTROL OF MPPT AND GRID CONNECTED FOR VARIABLE SPEED WIND ENERGY
CONVERSION SYSTEM BASED ON THE PMSG

Author:
Y.ERRAMI, M.HILAL, M.BENCHAGRA, M.OUASSAID, M.MAAROUFI

Source:
Journal of Theoretical and Applied Information Technology
pp. 204-217, Vol. 39 No. 2, 2012

Abstract
The efficiency of variable speed wind energy conversion systems (WECS) can be
greatly improved using an appropriate control strategy. In this paper, nonlinear
control of a WECS based on the permanent magnet synchronous generator (PMSG) is
investigated in order to maximize the power generated from the wind turbine. The
control strategy combines the maximum power point tracking (MPPT) method with
sliding mode (SM) nonlinear control theory, which is well known to perform well
under system uncertainties. The block diagram of the WECS with a back-to-back PWM
converter structure and the PMSG is established in the dq frame of axes.
Considering the variation of wind speed, both converters use the sliding mode
control scheme. The objectives of the grid-side converter are to deliver the
energy from the PMSG side to the utility grid, to regulate the DC-link voltage
and to achieve unity power factor with low-distortion currents, while a speed
controller is designed to maximize the energy extracted from the wind below the
rated power region. Simulation results show the feasibility and robustness of the
proposed control schemes for PMSG-based variable speed wind energy conversion
systems.
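The standard relations behind MPPT for a variable speed turbine are summarized
below; the paper's exact notation and control law may differ.

```latex
% Captured aerodynamic power and the optimal operating point used by MPPT
% (standard relations; notation may differ from the paper):
\begin{align}
  P_t &= \tfrac{1}{2}\,\rho\,\pi R^2\,C_p(\lambda,\beta)\,v^3,
  \qquad \lambda = \frac{\omega_m R}{v},\\
  \omega_m^{\mathrm{opt}} &= \frac{\lambda_{\mathrm{opt}}\,v}{R},
  \qquad P_{\mathrm{max}}
      = \tfrac{1}{2}\,\rho\,\pi\,\frac{C_p^{\mathrm{max}} R^5}{\lambda_{\mathrm{opt}}^{3}}\,\omega_m^{3}
      = k_{\mathrm{opt}}\,\omega_m^{3}.
\end{align}
```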
Keywords
Lyapunov Theory, Sliding Mode Control, WECS, PMSG, MPPT, Unity Power Factor.

Full Text

Title:
SIGNALS, ANIMAL RESPONSE AND EARTHQUAKE

Author:
MUHAMMED SACUAR HUSSAIN, SYED MD. ASIF

Source:
Journal of Theoretical and Applied Information Technology
pp. 218-224, Vol. 39 No. 2, 2012

Abstract
Most modern research on animal behaviour dates back to the mid seventies. There
is no established design or method for using animal behaviour for earthquake
prediction, although international research has been carried out, especially in
China, Japan and the USA. These studies mainly rely on particular animal
behaviour changes observed in simulated earthquake environments and on statistics
from previous earthquakes. However, animals produce numerous behaviours, and it
is ambiguous to rely on a particular behaviour rather than on an elaborate
system. This paper aims to describe a circular model of energy flow during an
earthquake, similar to a damped oscillator. The energy flow is compared to an
inverse power law, and the energy spread over long distances is linked to animal
sensitivity levels. Animal signals are produced in acoustic, chemical and
electromagnetic forms, and these signals are important for recognizing nearby
calamities such as earthquakes. Methods of data collection for such parameters
are then discussed, along with the reason for targeting a selected group of
species as a reference rather than many. Animals are sensitive to weak signals
that often go unnoticed even by fine instruments and measuring techniques, so a
method for earthquake prediction based on signal variations in nature and animal
responses is proposed. The rationale for an animal-based prediction model is
explained in terms of biological and physical factors. The data collection and
processing method is an integrated model of different types of signal analysing
techniques; an integrated processing method increases the efficiency of the
system through comparison and mathematical analysis. Animals react to
environmental changes, and their reactions are accompanied by different behaviour
changes; this response to stimuli involves pheromone secretions and variations in
homoeostasis rates, and an earthquake induces such changes in highly sensitive
species. An integrated analyser can be used to detect pheromone levels along with
other environmental variations to forecast a nearby disaster.
Keywords
Animal reactions, Bessel function of first kind, Kleiber’s law, Earthquake, Signals

Full Text

Title:
A REVIEW OF AUTOMATED DIABETIC RETINOPATHY DIAGNOSIS FROM FUNDUS IMAGE

Author:
K.NARASIMHAN, NEHA.V.C, K.VJAYAREKHA

Source:
Journal of Theoretical and Applied Information Technology
pp. 225-238, Vol. 39 No. 2, 2012

Abstract
A fundus camera provides digitised data in the form of a fundus image that can be
used effectively for the computerised automated detection of diabetic
retinopathy. A completely automated screening system for the disease can largely
reduce the burden on the specialist and save cost. Noise and other disturbances
that occur during image acquisition may lead to false detection of the disease,
and this is overcome by various image processing techniques. Following this,
different features are extracted, and these serve as the guideline for
identifying and grading the severity of the disease. Based on the extracted
features, the retinal image is classified as normal or abnormal. In the
literature, various techniques for feature extraction and different types of
classifiers have been used to improve sensitivity and specificity. FROC analysis
and the confusion matrix are used to evaluate system performance. In this paper a
critical analysis of the various algorithms and classifiers that have been used
for automated diagnosis is presented.
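The two evaluation measures cited above are read directly off the confusion
matrix; a minimal helper with hypothetical counts:

```python
def sensitivity_specificity(tp: int, fp: int, tn: int, fn: int):
    """Sensitivity = TP/(TP+FN), diseased images caught; Specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical screening outcome on 200 fundus images
sens, spec = sensitivity_specificity(tp=85, fp=10, tn=90, fn=15)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```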
Keywords
Diabetic Retinopathy, Fundus Image, Classifier, Sensitivity, Specificity

Full Text