|
Submit Paper / Call for Papers
The journal receives papers in a continuous flow and will consider articles
from a wide range of Information Technology disciplines, encompassing the most
basic research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please remember to include all your personally identifiable information in the
manuscript before submitting it for review; we will remove the necessary
information on our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
|
Journal of Theoretical and Applied Information Technology
February 2014 | Vol. 60 No.1 |
Title: |
MAXIMIZED RESULT RATE JOIN ALGORITHM |
Author: |
HEMALATHA GUNASEKARAN, THANUSHKODI K |
Abstract: |
A large number of interactive queries are executed day by day. The user
expects an answer almost immediately after submitting a query. Even in
scientific workloads, the user needs the initial query results for analysis
without waiting for the entire process to complete. The state-of-the-art join
algorithms are not ideal for these settings, as most of them are hash- or
sort-based algorithms, which require some pre-work before they can produce
results. We propose a new join algorithm, the Maximized Result Rate (MRR) join
algorithm, which produces the first few results without much delay. It also
produces the maximum number of join query results during the early stage of the
join operation; this is achieved by exploiting the histogram that is available
in the database statistics. A histogram provides the frequency of each attribute
value in a table. The tuples with a high frequency of occurrence are joined
during the early stages of the join operation. Further, using the histogram, the
join operation can be terminated once the required matching tuples are obtained.
This improves the overall join performance. Experimental results show that the
new MRR join algorithm produces 60% more result tuples than the hash and
sort-merge join algorithms. It also produces results 30-35% earlier than the
traditional join algorithms. |
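The histogram-guided idea in the abstract can be illustrated with a short sketch. This is not the authors' implementation; it assumes a simple equi-join and uses the product of per-key group sizes as the "histogram" of expected output cardinality, so the most productive keys are probed first and the join can stop early once enough tuples are produced.

```python
# Illustrative sketch (not the paper's MRR algorithm itself): a hash join
# that visits high-frequency join keys first, maximizing early result rate.
from collections import Counter, defaultdict

def histogram_ordered_join(r, s, key=lambda t: t[0], limit=None):
    """Join tuples of r and s on `key`, visiting high-frequency keys first.

    `limit` optionally stops the join once enough result tuples exist,
    mirroring the early-termination idea described in the abstract.
    """
    # Build hash partitions of both inputs, grouped by join key.
    r_part, s_part = defaultdict(list), defaultdict(list)
    for t in r:
        r_part[key(t)].append(t)
    for t in s:
        s_part[key(t)].append(t)
    # "Histogram": expected output cardinality per key (|r_k| * |s_k|).
    hist = Counter({k: len(r_part[k]) * len(s_part[k])
                    for k in r_part if k in s_part})
    results = []
    for k, _ in hist.most_common():          # most productive keys first
        for t1 in r_part[k]:
            for t2 in s_part[k]:
                results.append(t1 + t2)
                if limit is not None and len(results) >= limit:
                    return results
    return results
```

For example, `histogram_ordered_join(r, s, limit=10)` returns the first ten joined tuples, drawn from the most frequent keys, without completing the full join.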
Keywords: |
Join Query Optimization; Early Result Rate; Maximized Query Result; Histogram;
Query Optimization |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
LAYER SEGMENTATION AND DETECTION OF GA AND DRUSEN FROM SD-OCT IMAGES |
Author: |
MOHANDASS G, R ANANDA NATARAJAN |
Abstract: |
A variety of methods have been developed in the literature to solve the problem
of retinal SD-OCT image segmentation, which is a significant stage in an
automatic diagnosis system. The methods proposed so far have met with only
limited success due to the overlapping intensity distributions of retinal
areas. To address this, we propose a novel technique to segment three retinal
layers and to detect geographic atrophy (GA) and drusen. We introduce a new
Region Enlarging Layer Detection (RELD) model for three-layer segmentation. For
detection, statistical features are extracted and a neural network is trained
on these features to detect the two pathologies, GA and drusen; a
scaled-conjugate-gradient-based neural network is used for the detection. The
proposed technique is analyzed in terms of sensitivity, specificity and
accuracy. In the performance evaluation, the proposed technique achieves a
better accuracy of 96.92%, compared with 76.21% for the existing technique. |
Keywords: |
Geographic Atrophy (GA), Drusen, layers, segmentation, RELD model |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
AUCTION BASED NETWORK SELECTION FOR MULTIMEDIA APPLICATIONS |
Author: |
L.RAJESH, K.BHOOPATHY BAGAN |
Abstract: |
Multimedia applications have emerged as some of the most important applications
of the future. The purpose of this paper is to select the best network in a
heterogeneous environment and to distribute bandwidth resources to users for
multimedia applications using game-theoretic techniques (auction games). The
paper presents a framework for characterizing multimedia applications and their
communication requirements. In the traditional method, network selection is
based on RSS (Received Signal Strength). However, RSS alone is not enough to
satisfy the varied demands of different users and different multimedia
applications, even though some methods have considered multiple criteria (e.g.
QoS, RSS, utility function, bandwidth, user preference) for selecting the best
network. When a user requests bandwidth for multimedia applications, there must
be mechanisms to decide which network is best suited for each user at each
moment in time, for every application the user requires. We not only consider
the utility of users but also model the network operators' utility, and discuss
the truth-telling behaviour of network operators in terms of offered prices and
service quality. One of the challenging problems is to choose the optimal
network depending on the type of application the user is going to use. Best
network selection considers factors such as cost, quality and energy. The
approach also distributes the bandwidth resource to the users according to
their multimedia application requests. |
Keywords: |
RSS, WLAN, RAN, WIMAX |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
IMPROVING PERFORMANCE IN FREE SPACE OPTICAL COMMUNICATION (FSOC) CHANNEL THROUGH
THE DUAL DIFFUSER MODULATION (DDM) DUE TO ATMOSPHERIC TURBULENCE |
Author: |
A.K RAHMAN, S.A ALJUNID, ANUAR M.S, FADHIL H.A |
Abstract: |
This paper focuses on reducing the effect of atmospheric turbulence on free
space optical communication using a robust modulation scheme, the dual diffuser
modulation (DDM) technique. This technique uses two transmitters and
differential-mode detection at the receiver. The combination of dual diffusers
with differential detection at the receiver produces superior performance
against turbulence, in particular reducing the scintillation index, overcoming
signal detection with a fixed zero threshold and improving the received power.
These three factors are important for improving the overall performance of a
free space optical system. The analysis results show that the received power of
DDM at a propagation distance of 3 km is 4.59 dBm, compared with -7.6 dB for
conventional OOK using a diffuser only, which equals a 3 dBm improvement, or
around 40 percent. In terms of BER performance, DDM can extend the propagation
distance with approximately 42 percent improvement. |
Keywords: |
Phase Screen Diffuser, Atmospheric Turbulence, Differential Mode Detection, Dual
Diffuser Modulation, Free Space Optic |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
MODELING AND SIMULATION OF INTERLINE DYNAMIC VOLTAGE RESTORER USING SVPWM
TECHNIQUE |
Author: |
Dr. P. USHA RANI |
Abstract: |
Voltage deviations, often in the form of voltage sags, can cause severe process
disruptions and result in substantial economic loss. The Dynamic Voltage
Restorer (DVR), a custom power device, has been used to protect sensitive loads
from the effect of voltage sags / swells on the distribution feeder. The DVR's
main function is to inject the voltage difference into the power line and thus
maintain the load-side voltage at the optimum value. The interline DVR (IDVR)
proposed in this work provides a way to replenish DC-link energy storage
dynamically. The IDVR consists of several DVRs connected to different
distribution feeders in the power system, sharing common energy storage. When
one of the DVRs compensates for a voltage sag appearing on its feeder, the
other DVRs replenish the energy in the common DC link dynamically. This paper
presents the modeling and closed-loop control aspects of the DVR and IDVR
systems using the Space Vector Pulse Width Modulation technique against voltage
sags, by simulation. The proposed DVR and IDVR systems are modeled and
simulated using MATLAB/SIMULINK software. The simulation results show that the
control approach performs very effectively and yields excellent compensation
for voltage sags. |
Keywords: |
Dynamic Voltage Restorer (DVR), Interline Dynamic Voltage restorer (IDVR), Space
Vector pulse Width modulation (SVPWM) |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
PERFORMANCE ANALYSIS OF AN UPLINK MISO-CDMA SYSTEM USING MULTISTAGE MULTI-USER
DETECTION SCHEME WITH V-BLAST SIGNAL DETECTION ALGORITHMS |
Author: |
G.VAIRAVEL, K.R.SHANKAR KUMAR |
Abstract: |
This paper investigates the Weighted Linear Parallel Interference Cancellation
(WLPIC) Multiuser Detection (MUD) scheme with Vertical Bell Laboratories
Layered Space-Time (V-BLAST) signal detection algorithms in an uplink Code
Division Multiple Access based Multiple Input Single Output (MISO-CDMA) system.
The bit error rate performance of the WLPIC scheme with V-BLAST signal
detection and of the Conventional Linear Parallel Interference Cancellation
(CLPIC) scheme with V-BLAST signal detection is compared in both correlated and
uncorrelated Rayleigh fading channels. The simulation results show that the
uplink MISO-CDMA system performs well with the WLPIC MUD scheme and MMSE-SIC
signal detection when compared with the CLPIC MUD scheme and ZF-SIC signal
detection under different channel conditions. |
Keywords: |
MISO, CDMA, ZF-SIC, MMSE-SIC, ML Detection, CLPIC, WLPIC |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
BLOCK IDENTIFICATION METHODOLOGY: CASE STUDY ON BUSINESS DOMAIN |
Author: |
MUSTAFA ALMATARY, MARINI ABU BAKAR AND ABDULLAH MOHD ZIN |
Abstract: |
Block-Based Software Development (BBSD) is a software development approach
that enables end users to develop applications by integrating blocks. For the
block-based programming approach to be successful, a large number of blocks
must be developed in various application domains. The BBSD life cycle is
divided into two parts: block development for a specific domain (carried out by
project initiators and block developers) and block integration (carried out by
end users). Block development consists of two stages: block identification and
block creation. This paper describes a methodology that can be used for block
identification. Through this methodology, the blocks needed for a given domain
can be properly determined and specified, which will help block developers
develop the right blocks for the domain. The feasibility of the proposed
methodology is shown through a case study. |
Keywords: |
End User Software Development, UML, Block-Based Software Development,
Component-Based Software Development |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
EFFICIENT INTRUSION DETECTION SYSTEM BASED ON SUPPORT VECTOR MACHINES
USING OPTIMIZED KERNEL FUNCTION |
Author: |
NOREEN KAUSAR, BRAHIM BELHAOUARI SAMIR, IFTIKHAR AHMAD, MUHAMMAD HUSSAIN |
Abstract: |
An efficient intrusion detection system requires fast processing and optimized
performance. The architectural complexity of the classifier increases with the
processing of raw features in the datasets, which causes a heavy load and calls
for proper transformation and representation. PCA is a traditional approach for
dimension reduction that maps the original features onto a smaller number of
linear combinations. Support vector machines perform well with different kernel
functions, which classify in a higher-dimensional space at optimized
parameters. The performance of these kernels can be examined by using different
feature subsets at the respective parameter values. In this paper, SVM-based
intrusion detection using PCA-transformed features with different kernel
functions is proposed. This yields the optimal SVM kernel for each feature
subset, with fewer false alarms and an increased detection rate. |
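The PCA step described above can be sketched in a few lines. This is a generic NumPy-based sketch, not the paper's pipeline; the dataset, the number of retained components and the downstream SVM kernels are all assumptions for illustration.

```python
# Minimal PCA sketch (assumed details; the paper's exact feature set and
# classifier are not shown here): project raw feature vectors onto the top
# principal components before feeding them to a classifier.
import numpy as np

def pca_transform(X, n_components):
    """Return X projected onto its top `n_components` principal axes."""
    Xc = X - X.mean(axis=0)                  # center the features
    cov = np.cov(Xc, rowvar=False)           # feature covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]        # sort axes by explained variance
    components = eigvecs[:, order[:n_components]]
    return Xc @ components

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))                # stand-in for raw IDS features
Z = pca_transform(X, 3)                      # reduced feature subset
print(Z.shape)                               # (100, 3)
```

The reduced matrix `Z` would then be the input to an SVM trained with each candidate kernel in turn.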
Keywords: |
Intrusion Detection System (IDS), Support Vector Machines (SVM), Principal
Component Analysis (PCA), Polynomial Kernel, Sigmoid Kernel |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
KERNEL OPTIMIZATION FOR IMPROVED NON-FUNCTIONAL REQUIREMENTS CLASSIFICATION |
Author: |
K. MAHALAKSHMI, Dr. R.PRABHAKAR, Dr. V. BALAKRISHNAN |
Abstract: |
Requirements Engineering (RE) is a key activity in the development of
software-intensive systems. There are various kinds of software-intensive
systems, and RE practice varies across this range. System requirements are
descriptions of the services provided by a system and its constraints.
Functional requirements capture the system's intended behavior, which may be
expressed as services, tasks or functions the system needs to perform. Unlike
functional requirements, Non-Functional Requirements (NFR) state constraints on
the system as well as particular notions of qualities a system might have. NFR
analysis is an important RE activity. This paper uses a Support Vector Machine
(SVM) for classification of NFR; the Radial Basis Function kernel is optimized
using a hybrid Artificial Bee Colony (ABC) algorithm that incorporates
differential evolution. |
Keywords: |
Requirement Engineering (RE), Non-Functional Requirements (NFR), Support Vector
Machine (SVM), Artificial Bee Colony (ABC), Differential Evolution |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
A NOVEL APPROACH FOR TEXT CLUSTERING USING MUST LINK AND CANNOT LINK ALGORITHM |
Author: |
J.DAFNI ROSE, DIVYA D. DEV, C.R.RENE ROBIN |
Abstract: |
Text clustering is used to group documents with high levels of similarity. It
has found applications in different areas of text mining and information
retrieval. The digital data available nowadays has grown to huge volumes, and
retrieving useful information from it is a big challenge. Text clustering is an
important tool for organizing such data and extracting useful information from
the available corpus. In this paper, we propose a novel method for clustering
text documents. In the first phase, features are selected using a genetic-based
method. In the next phase, the extracted keywords are clustered using a hybrid
algorithm, and the clusters are classed under meaningful topics. The MLCL
algorithm works in three phases. First, the linked keywords from the
genetic-based extraction method are identified with a Must Link and Cannot Link
(MLCL) algorithm. Second, the MLCL algorithm forms the initial clusters.
Finally, the clusters are optimized using Gaussian parameters. The proposed
method is tested on datasets such as Reuters-21578 and the Brown Corpus. The
experimental results show that our proposed method performs better than the
fuzzy self-constructing feature clustering algorithm. |
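The must-link / cannot-link constraint idea can be sketched as follows. This is a hypothetical illustration, not the paper's MLCL algorithm: must-link pairs are merged with a union-find structure, and cannot-link pairs are then checked for consistency before clustering proceeds.

```python
# Sketch of constraint handling for must-link / cannot-link clustering
# (assumed mechanics; the paper's Gaussian optimization step is omitted).
class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def initial_clusters(n, must_link, cannot_link):
    """Group n keywords into initial clusters honoring the constraints."""
    uf = UnionFind(n)
    for a, b in must_link:
        uf.union(a, b)                        # must-link: force same cluster
    for a, b in cannot_link:
        if uf.find(a) == uf.find(b):          # cannot-link inside a cluster
            raise ValueError(f"contradictory constraints on {a}, {b}")
    clusters = {}
    for i in range(n):
        clusters.setdefault(uf.find(i), []).append(i)
    return list(clusters.values())

print(initial_clusters(5, must_link=[(0, 1), (3, 4)], cannot_link=[(1, 3)]))
```

With five keywords, the constraints above yield three initial clusters, which a later phase could then refine.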
Keywords: |
Genetic Algorithm, Keyword Extraction, Text Clustering, MLCL Algorithm. |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
AN EFFICIENT ALGORITHM FOR FAULT CLASSIFICATION AND IDENTIFICATION IN ONLINE
TRANSACTION MANAGEMENT |
Author: |
JAVID ALI, ANANDHAMALA |
Abstract: |
Nowadays the customer demand for accessing web-based applications has grown
enormously, as everything is available on the Internet. Providers of sensitive
applications keep their resources safe from unauthorized access by using a
single sign-on technique. With this technique, if a user supplies irrelevant
information in a particular session, he may be asked to sign on once again to
continue the session, irrespective of whether the user is a sensitive user.
This paper proposes a new strategy that classifies the sensitive user and
allows him to continue the session even if the user makes a mistake (fault)
that can be tolerated to some level. The proposed method focuses on fault
identification and classification in order to assign the sensitive user a
tolerance level for accessing the web application. The users are classified
based on their access level by setting the tolerance level for the type of
fault identified. An efficient algorithm is proposed for handling fault
tolerance in web applications more efficiently. In the future, fault tolerance
can be extended with AI-based techniques and multi-level security to improve
the performance of online transactions. |
Keywords: |
Fault Tolerance, Web Application, Fault Tolerant Techniques, Classification,
Access Policy |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
DYNAMIC WORKLOAD-AWARE PARTITIONING IN OLTP CLOUD DATA STORES |
Author: |
SWATI AHIRRAO, RAJESH INGLE |
Abstract: |
Cloud computing is one of the emerging fields for deploying scalable
applications at low cost. This paper presents the challenges faced by
application developers in building scalable web applications and describes two
ways in which scalability can be achieved. The first uses data partitioning,
which plays an important role in optimizing performance and improving the
scalability of data stores. The second approach works without explicit data
partitioning. We survey the design choices of various cloud data stores and
analyse the requirements of applications and data access patterns, and how
these requirements are fulfilled by scalable database management systems. We
also present a design model for dynamic workload-aware partitioning. In dynamic
partitioning, the workload is analyzed from transaction logs and frequent item
sets are found. These frequent item sets are grouped together and collocated on
one partition to improve scalability. We provide an implementation of our
partitioning scheme in SimpleDB running in the Amazon cloud and in HBase, and
use the industry-standard TPC-C benchmark for its evaluation. We present the
experimental results of our partitioning scheme obtained by executing the
TPC-C benchmark. |
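The "mine frequent item sets from the log, then collocate them" step can be sketched briefly. This is a simplified illustration under assumed mechanics (pairwise co-access counts, a fixed support threshold); the paper's SimpleDB/HBase integration and TPC-C workload are not reproduced.

```python
# Simplified workload-aware partitioning sketch: items that appear together
# in many logged transactions are assigned to the same partition, so those
# transactions touch fewer partitions.
from collections import Counter
from itertools import combinations

def collocate(transaction_log, min_support, n_partitions):
    """Map each item to a partition, grouping frequently co-accessed pairs."""
    pair_counts = Counter()
    for txn in transaction_log:              # each txn = set of item ids
        for pair in combinations(sorted(txn), 2):
            pair_counts[pair] += 1
    assignment, next_part = {}, 0
    for (a, b), cnt in pair_counts.most_common():
        if cnt < min_support:                # below support: stop grouping
            break
        part = assignment.get(a, assignment.get(b))
        if part is None:                     # neither item placed yet
            part, next_part = next_part % n_partitions, next_part + 1
        assignment.setdefault(a, part)       # collocate the frequent pair
        assignment.setdefault(b, part)
    return assignment

log = [{"w1", "d1", "c5"}, {"w1", "d1"}, {"w2", "d9"}, {"w1", "d1", "c7"}]
print(collocate(log, min_support=2, n_partitions=4))
```

Here the warehouse/district-style pair `w1`/`d1` co-occurs three times, so both land on the same partition; infrequent items fall through to a default placement policy.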
Keywords: |
Cloud, Scalability, Oltp, Data Partitioning |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
A RECONFIGURABLE SOC ARCHITECTURE FOR SHIP INTRUSION DETECTION |
Author: |
P. LATHA, DR. M. A. BHAGYAVENI, STEFFI LIONEL |
Abstract: |
Monitoring of the marine environment has become a field of research in the last
ten years, and wireless sensor networks have been one of the most promising
techniques for surveillance applications. Wireless sensor nodes are highly
attractive for oceanography because they are inexpensive and easy to deploy and
operate, whereas currently used methods for ship intrusion detection are
costly. In this paper, we emphasize the advantages of Wireless Sensor Networks
(WSN) in oceanography and propose a Reconfigurable SoC (RSoC) architecture for
ship intrusion detection. The proposed FPGA-based wireless sensor node is
interfaced with a tri-axis digital accelerometer. Signal processing techniques
are used to detect unauthorized ships by distinguishing ship-generated waves
from ocean waves. A three-level intrusion detection system has been designed,
with which we can detect the intrusion of ships irrespective of their size. The
proposed system is simulated using the Xilinx ISE simulator. |
Keywords: |
Wireless sensor network, Surveillance, Oceanography, Accelerometer sensor,
Reconfigurable SoC |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
QUAD-BAND MICROSTRIP ANTENNA FOR MOBILE HANDSETS |
Author: |
ASEM S. AL-ZOUBI, MOHAMED A. MOHARRAM |
Abstract: |
In this paper, a new quad-band, small-size microstrip handset antenna is
presented, covering the global system for mobile communication (GSM900),
global positioning system (GPS1500), digital communication system (DCS1800),
and wireless local area network (WLAN2450) bands. The antenna has a single feed
and a shorting pin to reduce its size. The design is simulated and optimized
for two different dielectric substrates. Details of the antenna are discussed
along with simulated results. Simulation results are obtained using the
commercial HFSS software, which is based on the finite element method, and are
compared with measured results, with good agreement. |
Keywords: |
Microstrip Antenna, Multi-Band Antenna, Mobile Handset, Small Antenna. |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
PERFORMANCE ENHANCEMENT OF ENERGY EFFICIENT PROTOCOLS FOR APPLICATION IN WSN |
Author: |
R. SITTALATCHOUMY, L. SIVAKUMAR |
Abstract: |
This paper proposes a modified version of PEGASIS to reduce energy consumption
in wireless sensor networks. The standard PEGASIS protocol was modified and the
performance of the routing was evaluated. It is shown that the routing protocol
discussed in this paper requires less energy than standard PEGASIS. Two
possible routing algorithms were implemented and the results are discussed. |
Keywords: |
Wireless Sensor, Energy Efficient Algorithm, Double Cluster Head Algorithm,
Battery life |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
A SYSTEMATIC LITERATURE REVIEW OF END-USER PROGRAMMING FOR THE WEB MASHUP |
Author: |
RODZIAH LATIH , AHMED PATEL , ABDULLAH MOHD. ZIN |
Abstract: |
End-user programming (EUP) for the web is currently of interest because Web 2.0
technologies have resulted in a vast array of tools available for mashup
making. This paper presents a systematic literature review of EUP for web
mashups. Its objective is to provide a comprehensive review and synthesis of
the literature related to EUP for web mashups. A systematic literature review
was performed of peer-reviewed published studies that focused on research in
EUP for web mashups. The review covered 21 relevant articles, mostly recent
(published between January 1st 2000 and December 31st 2012) and published in
English. Several EUP approaches for web mashups were identified from the
studies: browsing, programming by demonstration or example, spreadsheet,
widget, data-flow and block-based approaches. Other research topics regarding
EUP for web mashups were also identified, such as ubiquitous platform mashups,
user support functions, data extraction techniques, and process-oriented
mashups. |
Keywords: |
SLR, End User Programming, Web 2.0, Web Mashup |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
WEB DOCUMENT CLUSTERING THROUGH METAFILE GENERATION FOR DIGRAPH STRUCTURE USING
DOCUMENT INDEX GRAPH |
Author: |
BUDI, SRI NURDIATI, BIB PARUHUM SILALAHI |
Abstract: |
Clustering techniques are often used to group text documents. Modeling and
graph-based representation of the document clustering process can be done
using the Document Index Graph (DIG) algorithm. This study aims to implement
the DIG algorithm to design the digraph structure used for the graphical
representation of the web document clustering process. The data used are the
REUTERS-21578 documents. Testing is done by setting parameter values for the
number of document groups to be processed and by determining the limit on the
frequency of word occurrences. Analysis is performed at the stage of
determining the frequency limit for the occurrence of relevant words
(inter-cluster) and of irrelevant words (intra-cluster) in the document
clustering process. The digraph structure representing the best graph for the
document clustering process is achieved at an inter-cluster frequency value of
5 and an intra-cluster frequency value of 3 over 25 documents. |
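The core of a Document Index Graph can be sketched in a few lines. This is a rough, assumed rendering of the idea (not the study's implementation): each document contributes edges between consecutive words to one shared digraph, and overlapping edge sets give a phrase-based similarity signal between documents.

```python
# Rough sketch of a Document Index Graph: a shared digraph of
# consecutive-word edges, each labeled with the documents containing it.
from collections import defaultdict

def build_dig(docs):
    """Return digraph: edge (w1, w2) -> set of document ids containing it."""
    graph = defaultdict(set)
    for doc_id, text in docs.items():
        words = text.lower().split()
        for w1, w2 in zip(words, words[1:]):   # consecutive word pairs
            graph[(w1, w2)].add(doc_id)
    return graph

def shared_edges(graph, d1, d2):
    """Number of consecutive-word pairs the two documents share."""
    return sum(1 for ids in graph.values() if d1 in ids and d2 in ids)

docs = {1: "river bank flooding", 2: "the river bank", 3: "bank interest rate"}
g = build_dig(docs)
print(shared_edges(g, 1, 2))   # docs 1 and 2 share the edge (river, bank)
```

A clustering pass would then use such shared-edge counts, thresholded by the inter- and intra-cluster frequency limits the abstract describes.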
Keywords: |
Algorithm, Clustering, Digraph, Document Index Graph, Reuters Document |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
HIGH THROUGHPUT HARDWARE IMPLEMENTATION FOR RC4 STREAM CIPHER |
Author: |
M.RAMKUMAR RAJA, DR. K. THANUSHKODI, S.ARUL JOTHI |
Abstract: |
RC4 is the most popular stream cipher in the domain of cryptology. In this
paper, we present a systematic study of the hardware implementation of RC4 and
propose the fastest known architecture for the cipher. We combine the ideas of
hardware pipelining and loop unrolling to design an architecture that produces
two RC4 keystream bytes per clock cycle. We have optimized and implemented our
proposed design using a Verilog description, synthesized with 45 nm technology.
The proposed design has a total area of 138459 um2 and shows a power
consumption of 382.0935 mW. The proposed circuit has an operating frequency of
1.387 GHz, compared to 1.22 GHz for the conventional RC4 circuit, which is
8.37% higher. The throughput of the proposed RC4 circuit is found to be
22.192 Gbps. |
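For reference, the algorithm the hardware unrolls is the standard RC4 keystream generator shown below; the two-bytes-per-cycle pipelined design itself is hardware-specific and not reproduced here.

```python
# Standard RC4 keystream generator (software reference, not the paper's
# pipelined Verilog design).
def rc4_keystream(key, n):
    """Return the first n keystream bytes for the given key (bytes)."""
    # Key-scheduling algorithm (KSA): permute S under the key.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA): emit keystream bytes.
    out, i, j = [], 0, 0
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

# Classic test vector: key "Key" yields a keystream starting EB 9F 77 81.
print(rc4_keystream(b"Key", 4).hex())   # eb9f7781
```

Each PRGA iteration produces one byte; the proposed architecture computes two such iterations per clock cycle via unrolling.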
Keywords: |
High Throughput, Cipher, RC4 Stream, 45nm, 1.387GHz |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
DESIGN OF ANALOGUE FILTERS USING CYPRESS PSOC |
Author: |
DHANABAL R , BHARATHI V ,POLA SAI KUMAR ,PRANEETH MADHIRA SASI RAMA |
Abstract: |
In this paper, we present a new technique to implement analogue filters using
the PSoC, a mixed-signal array IC called Programmable System-on-Chip from
Cypress Semiconductor. Previously, to design a filter one needed to implement
the circuit using software such as Cadence or Mentor Graphics and then verify
the functionality of that circuit. Instead of this long process, we can use the
PSoC software, which is a much simpler way to implement filters. PSoC uses
switched-capacitor technology to build second-order filter topologies. Using
this technique we can save design time, board space and power consumption. A
microcontroller and analog and digital components are integrated on the chip;
PSoC is also known as Software Configurable Silicon. This paper covers filters,
the types of filters and their functions, and the new method to implement
analog low-pass and band-pass filters. |
Keywords: |
Bandpass Filter, Butterworth Filter, Chebyshev Filter,Bessel Filter |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
GEOMETRIC TRANSFORMATIONS AND ITS APPLICATION IN DIGITAL IMAGES |
Author: |
SILVESTRE ASCENCIÓN GARCÍA SÁNCHEZ, CARLOS AQUINO RUIZ, CELEDONIO ENRIQUE
AGUILAR MEZA |
Abstract: |
Digital images represent a wide range of phenomena. The area of image
processing has been developed through the theoretical study of different
transformations, manifested in the creation of algorithms that address
real-life problems. This paper establishes the relevant theoretical aspects of
linear algebra: linear transformations and related concepts. We present some of
the most commonly used transformations of both digital images and their pixel
intensity values, implemented using Matlab software. Finally, we study some
aspects of numerical interpolation on images. |
Keywords: |
Linear Transformation, Affine Transformation, Processing Spatial Interpolation |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
UTILITY, IMPORTANCE AND FREQUENCY DEPENDENT ALGORITHM FOR WEB PATH TRAVERSAL
USING PREFIX TREE DATA STRUCTURE |
Author: |
L.K.JOSHILA GRACE, Dr. V. MAHESWARI |
Abstract: |
Understanding the navigational behaviour of website visitors is an essential
factor for success in the emerging systems of electronic commerce and mobile
commerce. In this paper, we propose a utility- and frequency-dependent
algorithm for web path traversal using a prefix tree structure. We take the web
log files and extract details such as the time duration that the user spent on
each web page and the bookmark information about the web page. The data
extracted from the web log files of different users are used as input to
construct the prefix tree. Using this prefix tree, we mine continuous and
discontinuous sequential patterns and calculate a weight value for each mined
pattern. The prefix tree constructed from existing users can be updated further
with the details of new users. We compare the path traversal efficiency of our
proposed technique with the existing frequency-dependent PrefixSpan
technique. |
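The prefix-tree construction from user sessions can be sketched as follows. The weighting is an assumption for illustration (per-node visit counts plus accumulated dwell time); the paper's exact utility formula combining time duration and bookmarks is not given here.

```python
# Illustrative prefix tree over web sessions: each node records how many
# sessions passed through a page at that position and the total dwell time.
class PrefixNode:
    def __init__(self):
        self.children = {}
        self.count = 0          # sessions that reached this node
        self.dwell = 0.0        # accumulated seconds spent on this page

def insert_session(root, session):
    """session: list of (page, seconds) pairs for one user's traversal."""
    node = root
    for page, seconds in session:
        node = node.children.setdefault(page, PrefixNode())
        node.count += 1
        node.dwell += seconds
    return root

root = PrefixNode()
insert_session(root, [("home", 5), ("products", 40), ("cart", 12)])
insert_session(root, [("home", 3), ("products", 25)])
products = root.children["home"].children["products"]
print(products.count, products.dwell)    # 2 sessions, 65.0 seconds total
```

Sequential patterns (continuous or with gaps) can then be mined by walking this tree and scoring each path by frequency and accumulated utility.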
Keywords: |
Continuous Sequential Pattern, Discontinuous Sequential Pattern, Time Duration,
Bookmark, Prefix Tree |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
A SURVEY ON TRANSACTIONS BASED SECURE ROUTING IN WIRELESS NETWORKS |
Author: |
KOUSALYA G, KUMAR R |
Abstract: |
A number of transaction protocols have been proposed in recent years for the
possible use of wireless networks in various application areas such as
agriculture and medicine. In this paper we present a wide-ranging review of
these protocols, with a particular focus on their transaction models. Further,
we present a comparison of some of the existing transaction-based secure
routing protocols for wireless networks. The base criteria for comparison are
the routing methodologies and the information used to make routing decisions.
All the protocols have to withstand different security attacks, and the secure
versions of the proposed protocols are analysed with respect to these
attacks. |
Keywords: |
AODV, SEAD, MANET |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
DETECTION MECHANISM FOR DISTRIBUTED DENIAL OF SERVICE (DDOS) ATTACKS FOR ANOMALY
DETECTION SYSTEM |
Author: |
K. SARAVANAN, Dr.R. ASOKAN, Dr.K. VENKATACHALAM |
Abstract: |
DDoS stands for Distributed Denial of Service, an attack that pollutes the
network: the attacker directs a large volume of packets at a particular system.
DoS attacks synchronized by a group of attackers constitute a distributed DoS.
The packets are sent using compromised computers, and the targeted systems
cannot respond to all the packets because of the DDoS attack. In this paper, we
propose a novel statistical method for detecting DDoS attacks based on anomaly
detection. Statistical techniques are used to classify packets, followed by a
rule-set model that helps improve the proposed system's performance. |
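The statistical anomaly idea can be illustrated with a toy detector. The window size, the k-sigma threshold and the packets-per-second feature are all assumptions for illustration; the paper's actual statistical classifier and rule set are not reproduced.

```python
# Toy statistical anomaly check: flag a source whose packet rate deviates
# from the baseline mean by more than k standard deviations.
from statistics import mean, stdev

def is_anomalous(baseline_rates, observed_rate, k=3.0):
    """True if observed packets/sec is a k-sigma outlier vs. the baseline."""
    mu, sigma = mean(baseline_rates), stdev(baseline_rates)
    if sigma == 0:
        return observed_rate != mu
    return abs(observed_rate - mu) > k * sigma

baseline = [95, 102, 98, 101, 99, 104, 97, 100]   # normal packets/sec
print(is_anomalous(baseline, 100))   # within range -> False
print(is_anomalous(baseline, 900))   # flood-like burst -> True
```

A rule-set layer, as in the abstract, would then refine such flags (e.g. whitelisting known bursty services) to reduce false alarms.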
Keywords: |
Distributed Denial of Service Attack (DDoS), Denial of Service Attack (DoS),
Fast Fourier Transform (FFT), Discrete Wavelet Transform (DWT) |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
PREDICTION OF SURVIVAL IN PATIENTS WITH BREAST CANCER USING THREE ARTIFICIAL
INTELLIGENCE TECHNIQUES |
Author: |
CHENG-TAO YU, CHENG-MIN CHAO, BOR-WEN CHENG |
Abstract: |
As medical technology advances, a large amount of health-related data has
accumulated. Faced with increasingly complex analytical requirements,
predictive data mining has become an essential instrument for hospital
management and medical research. In this study, a breast cancer dataset was
collected from a regional teaching hospital in central Taiwan between 2002 and
2009. The dataset comprises 967 subjects with 8 prognostic attributes, of which
861 survived after treatment. Three techniques, artificial neural networks
(ANNs), support vector machines (SVM) and a Bayesian classifier, are
investigated and evaluated for predicting breast cancer survival. The
prediction accuracies under 10-fold cross validation are 90.31%, 89.79% and
88.64%, respectively. The classification results of SVM are slightly better
than those of the ANN and Bayesian classifier, and, given its relatively low
variance, the results show that SVM would be the best prognostic tool in
clinical practice. |
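The 10-fold cross-validation protocol behind the quoted accuracies can be sketched generically. The hospital dataset and the three classifiers are not available, so a trivial majority-class predictor stands in for them; everything in this sketch is illustrative.

```python
# Generic k-fold cross-validation sketch: average accuracy of a predict
# function over k train/test splits (here with a stand-in majority model).
def k_fold_accuracy(features, labels, predict, k=10):
    """Average accuracy of `predict(train, test_x)` over k folds."""
    n = len(labels)
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    accs, start = [], 0
    for size in fold_sizes:
        test_idx = range(start, start + size)
        train = [(features[i], labels[i]) for i in range(n) if i not in test_idx]
        correct = sum(predict(train, features[i]) == labels[i] for i in test_idx)
        accs.append(correct / size)
        start += size
    return sum(accs) / k

def majority_vote(train, _x):          # stand-in for ANN / SVM / Bayes
    ys = [y for _, y in train]
    return max(set(ys), key=ys.count)

labels = [1] * 90 + [0] * 10           # class balance roughly like 861/967
acc = k_fold_accuracy(list(range(100)), labels, majority_vote)
print(round(acc, 2))                   # 0.9
```

Replacing `majority_vote` with each trained classifier in turn yields the per-model accuracies the abstract compares.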
Keywords: |
Breast Cancer, Bayesian classifier, Support vector machines (SVM), Artificial
Neural Networks (ANNs) |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
Title: |
CLASSIFICATION OF COTTON DISEASES USING CROSS INFORMATION GAIN_MINIMAL RESOURCE
ALLOCATION NETWORK CLASSIFIER WITH PARTICLE SWARM OPTIMIZATION |
Author: |
P.REVATHI , M.HEMALATHA |
Abstract: |
This paper develops a machine vision system with data mining techniques to
identify cotton leaf spot diseases. The leaves are most commonly affected by
fungal, viral and bacterial diseases in the leaf spot areas, which play a vital
role in the condition of the crop. The paper addresses six types of diseases in
the cotton plant. The significance of this research design lies in using
advanced computational techniques to reduce complexity, cost and time. The
proposed techniques correctly identify the diseases. In preprocessing, the
image resolution is resized to 150 x 150 pixels. The paper uses Enhanced
Particle Swarm Optimization (EPSO) for feature selection to identify the
affected region of a leaf. The proposed skew divergence (a statistical method)
is based on calculating edge, color and texture variance features for the
analysis of the affected part of a cotton leaf. The proposed Cross Information
Gain Minimal Resource Allocation Network (CIG-MRAN) classifier is used to
classify the six types of diseases and increases the accuracy of the
classification system. |
Keywords: |
CIG-MRAN Classifier, Skew divergence Edge, Color, Texture Variance Features,
Cotton leaf, Feature Selection EPSO, Classification. |
Source: |
Journal of Theoretical and Applied Information Technology
10 February 2014 -- Vol. 60. No. 1 -- 2014 |
Full
Text |
|
|