Submit Paper / Call for Papers
The journal receives papers in a continuous flow and will consider articles
from a wide range of Information Technology disciplines, from the most basic
research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF, or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please include all of your personally identifiable information in the
manuscript before submitting it for review, and we will remove the necessary
information on our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
Journal of Theoretical and Applied Information Technology
January 2014 | Vol. 59 No.1 |
Title: |
PERFORMANCE OF SEPARATED RANDOM USER SCHEDULING (SRUS) AND JOINT USER SCHEDULING
(JUS) IN LONG-TERM EVOLUTION-ADVANCED |
Author: |
AHMED A ALI, Ir. Dr. ROSDIADEE NORDIN, Assoc. Prof. Dr. HUD BINTI ABDULLAH |
Abstract: |
Carrier aggregation (CA) is one of the main features of Long Term Evolution-
Advanced (LTE-A). CA allows target peak data rates in excess of 1 Gbps in the
downlink and 500 Mbps in the uplink to be achieved, and gives users access to a
total bandwidth of up to 100 MHz. The aggregated bandwidth may be contiguous,
or it may consist of several non-contiguous parts. This paper provides a
summary of the supported CA scenarios as well as an overview of the advanced
functionality of CA in LTE-A, with particular emphasis on the basic concept,
control mechanisms, and performance aspects of CA. This paper also demonstrates
how CA can be used as an enabler for simple yet effective frequency-domain
interference management schemes. In particular, interference management
provides significant gains in heterogeneous networks, envisioning intrinsically
uncoordinated deployments of home base stations. We then compare the quality of
service (QoS) performance of two different multi-user scheduling schemes in
CA-based LTE-A systems: separated random user scheduling (SRUS) and joint user
scheduling (JUS). The former is simpler but less efficient, whereas the latter
is optimal but carries higher signaling overhead. Moreover, user equipment (UE)
needs to access only a single component carrier (CC) in the case of SRUS, while
all the CCs must be connected in the case of JUS. Some technical challenges in
implementing carrier scheduling schemes in LTE-A systems are discussed and
highlighted. |
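The contrast between the two schedulers can be sketched with a toy model (a hypothetical illustration, not the authors' simulator): SRUS pins each UE to one randomly chosen CC, while JUS may schedule any UE on any CC, so JUS can never do worse on a per-CC best-gain objective.

```python
import random

def srus(gains, rng):
    """SRUS: each UE is assigned one random CC; on each CC the
    scheduler picks the best UE among those assigned to it."""
    n_ue, n_cc = len(gains), len(gains[0])
    assignment = {ue: rng.randrange(n_cc) for ue in range(n_ue)}
    total = 0.0
    for cc in range(n_cc):
        candidates = [gains[ue][cc] for ue in range(n_ue) if assignment[ue] == cc]
        if candidates:
            total += max(candidates)
    return total

def jus(gains):
    """JUS: every UE is connected to all CCs; the scheduler picks
    the globally best UE on each CC (optimal, more signaling)."""
    n_cc = len(gains[0])
    return sum(max(ue_gains[cc] for ue_gains in gains) for cc in range(n_cc))

rng = random.Random(42)
gains = [[rng.random() for _ in range(4)] for _ in range(10)]  # 10 UEs, 4 CCs
assert jus(gains) >= srus(gains, rng)  # JUS is never worse than SRUS
```

The assertion holds structurally: the per-CC maximum over all UEs always dominates the maximum over the random subset that SRUS allows.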
Keywords: |
Long Term Evolution-Advanced; Carrier Aggregation; Separated Random User
Scheduling; Joint User Scheduling. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
EFFECT OF ROUTING PROTOCOLS OVER RENOVATED CONGESTION CONTROL MECHANISMS IN
SINGLE-HOP WIRELESS |
Author: |
TANMAYA SWAIN, PRASANT KUMAR PATTNAIK |
Abstract: |
A wireless network uses wireless media to send and receive data over the air.
It provides data transmission services that may not be feasible with wires, and
sometimes the situation demands coverage across regions that are beyond the
capabilities of a typical cabling system. Traditional TCP is the de facto
standard for reliable transmission, but its congestion control mechanism is
itself a challenge. Several renovated protocols, namely TCP Newreno and TCP
Vegas, are cited in the literature to overcome the shortcomings of traditional
TCP. On the other hand, two popular routing protocols, namely DSR and AODV, are
widely used during the movement of the node(s). This paper experiments by
simulating an environment with TCP Newreno and TCP Vegas as transport layer
protocols and DSR and AODV as routing protocols in order to identify the better
compatibility between network layer and transport layer protocols. |
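The congestion window behaviour that separates NewReno-style TCP from plain TCP can be sketched with a toy additive-increase/multiplicative-decrease trace (an illustrative model, not the paper's simulation setup; the loss rounds and ssthresh value are arbitrary):

```python
def aimd_cwnd(losses, rounds=20, ssthresh=8):
    """Toy NewReno-style congestion window trace: slow start up to
    ssthresh, additive increase afterwards, halving on loss."""
    cwnd, trace = 1.0, []
    for r in range(rounds):
        if r in losses:                 # fast recovery: halve the window
            ssthresh = max(cwnd / 2, 1.0)
            cwnd = ssthresh
        elif cwnd < ssthresh:           # slow start: exponential growth
            cwnd *= 2
        else:                           # congestion avoidance: +1 MSS per RTT
            cwnd += 1
        trace.append(cwnd)
    return trace

trace = aimd_cwnd(losses={10, 15})
assert trace[0] == 2.0 and trace[10] < trace[9]  # window halves on loss
```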
Keywords: |
TCP Newreno, TCP Vegas, DSR, AODV, Congestion Control |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
RECOVERING DESIGN-CODE TRACES: AN ACO BASED APPROACH |
Author: |
IMAD BOUTERAA, NORA BOUNOUR |
Abstract: |
Traceability is a key issue in promoting software development quality and
productivity. It is recognized as crucial for several software development and
maintenance activities. Despite their importance, traceability links are often
sacrificed during software evolution due to market pressure. In this work we
present an approach to recover them between the design and the implementation.
Our approach recovers traces based on property similarities; it exploits string
edit distance, maximum matching, and Ant Colony Optimization. Evaluations show
promising results for this work. |
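The string-edit-distance ingredient of the approach can be illustrated with a standard Levenshtein similarity between a design-model identifier and a code identifier (a generic sketch; the paper's exact similarity formula and its ACO pheromone model are not reproduced here):

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def name_similarity(design_name: str, code_name: str) -> float:
    """Normalised similarity in [0, 1] between two identifiers."""
    d = levenshtein(design_name.lower(), code_name.lower())
    return 1.0 - d / max(len(design_name), len(code_name), 1)

assert levenshtein("kitten", "sitting") == 3
assert name_similarity("OrderManager", "OrderMgr") > name_similarity("OrderManager", "Logger")
```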
Keywords: |
Traceability, Object-Oriented Programming, Software Evolution, Program
Understanding, Software Maintenance, UML, Ant Colony Optimization |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
QOS-AWARE EVALUATION CRITERIA FOR WEB SERVICE COMPOSITION |
Author: |
HOMA MOVAHEDNEJAD, SUHAIMI BIN IBRAHIM, MAHDI SHARIFI, HARIHODIN BIN SELAMAT,
ARASH HABIBI LASHKARI, SAYED GHOLAM HASSAN TABATABAEI |
Abstract: |
Service composition is becoming increasingly pervasive, affecting the way
service computing is utilized. Service composition has become an essential
element of service deployment due to the fact that single services are unable to
fulfill user requirements. Owing to the dramatic growth of services claiming
similar functionalities, creating a value-added composite service from a number
of candidate services to address the desired goals is a challenging task. To
overcome this challenge, various Quality of Service (QoS) aware Web Service
Composition (WSC) approaches have been implemented and have a significant impact
on composition efficiency. However, there is a lack of knowledge on the impact
of such approaches on service composition processes. Hence, this study aims to
evaluate existing approaches based on QoS aspects. A mathematically grounded
QoS-aware evaluation framework is proposed and tested on state-of-the-art
approaches. The criteria used for evaluation are first identified from a
comprehensive review of the related literature. A Multi-Criteria Decision
Making technique is then applied to formulate a new QoS-aware evaluation method
for Web Service Composition approaches based on the identified criteria. The
approaches are evaluated using the proposed method to demonstrate its
applicability and correctness. The results show how a service composition
approach addresses QoS aspects and help researchers improve their service
composition solutions.
The statistical results show the efficiency and correctness of the evaluation
method. |
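A minimal sketch of a Multi-Criteria Decision Making step, using the simple additive weighting (SAW) variant with hypothetical criteria and scores (the study's actual criteria, weights, and MCDM technique may differ):

```python
def saw_rank(scores, weights, benefit):
    """Simple Additive Weighting: min-max normalise each criterion,
    invert cost criteria, then score alternatives by weighted sum."""
    cols = list(zip(*scores))
    totals = []
    for row in scores:
        total = 0.0
        for j, w in enumerate(weights):
            lo, hi = min(cols[j]), max(cols[j])
            norm = 0.0 if hi == lo else (row[j] - lo) / (hi - lo)
            if not benefit[j]:               # cost criterion: lower is better
                norm = 1.0 - norm
            total += w * norm
        totals.append(total)
    return totals

# Hypothetical WSC approaches scored on (reliability, response time in ms):
scores = [(0.90, 120), (0.70, 80), (0.95, 200)]
weights = (0.6, 0.4)
benefit = (True, False)                      # response time is a cost
totals = saw_rank(scores, weights, benefit)
assert totals.index(max(totals)) == 0        # approach 0 balances both best
```

SAW is only one of many MCDM variants; aggregation by weighted product or outranking would rank the same data differently.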
Keywords: |
Web Service Composition, Quality of Service, Comparative Evaluation |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
AN ADAPTIVE METHOD FOR ANALYZING AND PREDICTING THE CRIME LOCATIONS BY MEANS OF
AMABC AND ARM |
Author: |
R.SUJATHA, DR. D.EZHILMARAN |
Abstract: |
Since the birth of civilization, offenses of various sorts have been on an
uptrend. Crime investigation, which is emerging as the supreme law enforcement
procedure of governments to check such menaces to society at large, finds
itself engrossed in methodical scrutiny to spot and evaluate the patterns and
tendencies shown by law-breaking anti-socials indulging in offense and chaos,
so that appropriate and accurate preventive actions can be found in due time.
Various techniques have been introduced for crime analysis and prevention, but
the existing methods have drawbacks: they do not consider the precise features
needed to analyze and predict high-volume crime areas. Hence, to reduce the
drawbacks of the existing methods, a new crime location prediction technique is
proposed in this paper. The proposed technique predicts crime locations by
analyzing crime data with an Adaptive Mutation based Artificial Bee Colony
(AMABC) algorithm. The AMABC algorithm uses socio-economic factors and
clustering results in the crime location analysis process. Among the crime
locations predicted by the AMABC algorithm, high-crime locations are computed
by mining patterns with the Association Rule Mining (ARM) technique. Thus, the
proposed technique predicts the locations via the AMABC and ARM techniques. The
UCI Machine Learning Repository Communities and Crime Data Set is used for the
crime analysis, and the proposed technique is compared with existing
optimization methods such as GA, PSO, and conventional ABC. |
Keywords: |
Crime locations prediction, Artificial Bee Colony (ABC), Adaptive Mutation based
Artificial Bee Colony (AMABC), Association Rule Mining (ARM), GA, PSO, UCI data |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
THE STATE OF THE ART INFORMATION SHARING TECHNOLOGIES FOR SUPPLY CHAIN
MANAGEMENT |
Author: |
MARYAM MOFARRAHI, MASOUD RAHIMINEZHAD GALANKASHI, MAHDIEH MOHAMMADI, ALI
SHAHIDINEJAD |
Abstract: |
Information sharing systems help supply chain management (SCM) by tracking and
identifying an object’s or human’s location in real time. However, prior to
implementing an information system, the various technologies and their related
advantages and disadvantages must be considered. This paper investigates the
state-of-the-art technologies applied in SCM and presents a study of their
benefits and drawbacks. RFID, GPS, NFC, ZigBee, ultrasonic, UWB, and infrared
systems are discussed, and finally Visible Light Communication is proposed for
the different stages of an SCM. |
Keywords: |
Supply Chain Management, Visible Light Communication, Information sharing. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
A REVIEW ON ABSTRACTIVE SUMMARIZATION METHODS |
Author: |
ATIF KHAN, NAOMIE SALIM |
Abstract: |
Text summarization is the process of extracting salient information from a
source text and presenting that information to the user in the form of a
summary. It is very difficult for human beings to manually summarize large text
documents. Automatic abstractive summarization provides the required solution,
but it is a challenging task because it requires deeper analysis of the text.
In this paper, a survey of abstractive text summarization methods is presented.
Abstractive summarization methods are classified into two categories, i.e.,
structure-based approaches and semantic-based approaches. The main idea behind
these methods is discussed, and the strengths and weaknesses of each method are
also highlighted. Some open research issues in abstractive summarization are
identified and will be addressed in future research. Finally, it is concluded
from the literature that most abstractive summarization methods produce highly
coherent, cohesive, information-rich, and less redundant summaries. |
Keywords: |
Abstractive Summary, Sentence Fusion, Semantic Graph, Abstraction Scheme,
Sentence Revision |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
TRANSACTION AWARE VERTICAL PARTITIONING OF DATABASE (TAVPD) FOR RESPONSIVE OLTP
APPLICATIONS IN CLOUD DATA STORES |
Author: |
SHRADDHA PHANSALKAR, Dr. A.R. DANI |
Abstract: |
Online Transaction Processing (OLTP) applications are business applications
characterized by high-frequency, short-lived data transactions. In the cloud
domain, applications are expected to be highly responsive and low cost, with
optimized levels of consistency. Cloud data stores rely on an appropriate data
partitioning scheme to achieve promising levels of responsiveness and
scalability. This work presents a novel, transaction-aware, static, vertical
data partitioning scheme based on denormalization which performs well for OLTP
applications in the cloud domain. The scheme is implemented and tested on
contemporary cloud data stores, i.e., Amazon SimpleDB and Hadoop HBase. Our work
also proposes a mathematical specification model for TAVPD based data
partitioning and suggests appropriate evaluation factors for a data partitioning
scheme in cloud database. |
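The transaction-aware idea behind vertical partitioning can be illustrated with a classic attribute-affinity sketch over a hypothetical workload: attributes that the same transactions access frequently end up in the same vertical fragment (an illustrative toy; the paper's TAVPD scheme and cost model are not reproduced here):

```python
from itertools import combinations
from collections import Counter

def affinity(workload):
    """Count how often each attribute pair is accessed by the same
    transaction (the classic attribute-affinity measure)."""
    aff = Counter()
    for attrs, freq in workload:
        for a, b in combinations(sorted(attrs), 2):
            aff[(a, b)] += freq
    return aff

def greedy_partition(attrs, aff, threshold):
    """Greedily merge attributes whose affinity exceeds a threshold
    into the same vertical fragment."""
    frag_of = {a: {a} for a in attrs}
    for (a, b), count in sorted(aff.items(), key=lambda kv: -kv[1]):
        if count >= threshold and frag_of[a] is not frag_of[b]:
            merged = frag_of[a] | frag_of[b]
            for x in merged:
                frag_of[x] = merged
    return {frozenset(f) for f in frag_of.values()}

# Hypothetical OLTP workload: (attributes read, transaction frequency)
workload = [({"id", "name"}, 100), ({"id", "balance"}, 5),
            ({"name", "email"}, 80)]
frags = greedy_partition({"id", "name", "email", "balance"},
                         affinity(workload), threshold=50)
```

Here `id`, `name`, and `email` land in one fragment while the rarely co-accessed `balance` is split off, so the hot transactions touch a single fragment.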
Keywords: |
Partitioning, Selective Consistency, Responsiveness, Consistency Index, Poisson
Distribution |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
IMPROVEMENTS IN NEURAL NETWORK FOR CLASSIFICATION OF WEB PAGES |
Author: |
J. B. LEELA DEVI, Dr. A. SANKAR |
Abstract: |
Web page classification differs from traditional text classification due to the
additional information provided by the Hyper Text Markup Language (HTML)
structure and the presence of hyperlinks. While effort has been made to exploit
hyperlinks for classification, the structured nature of web pages is rarely
considered. A noticeable feature of HTML documents is the HTML tags and their
attributes, which ensure that HTML documents can be viewed in browsers and
other user agents. This paper proposes a semantic-based feature selection to
improve web page search and retrieval over large document repositories. Web
page classification using HTML tags is evaluated using the 4 Universities
Dataset. The features are classified using the proposed neural network. The
experimental results show improved precision and recall with the presented
method. |
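A small standard-library sketch of the kind of tag-based features such a classifier could consume: tag frequencies plus the text found inside heading and emphasis tags (the paper's actual feature set and neural network are not reproduced here):

```python
from html.parser import HTMLParser
from collections import Counter

class TagFeatures(HTMLParser):
    """Collect tag frequencies and the text inside emphasis/heading
    tags, a simple structured-feature view of an HTML page."""
    WEIGHTED = {"title", "h1", "h2", "b", "strong", "em"}

    def __init__(self):
        super().__init__()
        self.tag_counts = Counter()
        self.weighted_text = []
        self._stack = []

    def handle_starttag(self, tag, attrs):
        self.tag_counts[tag] += 1
        self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        # keep only text that appears inside a "weighted" tag
        if self._stack and self._stack[-1] in self.WEIGHTED:
            self.weighted_text.append(data.strip())

page = "<html><title>CS Faculty</title><body><h1>Courses</h1><p>intro</p></body></html>"
f = TagFeatures()
f.feed(page)
assert f.tag_counts["h1"] == 1 and "Courses" in f.weighted_text
```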
Keywords: |
Hyper Text Markup Language (HTML), Web page classification, HTML tags, Neural
Network |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
FUZZY LOGIC BASED PV ENERGY SYSTEM WITH SEPIC CONVERTER |
Author: |
S.VENKATANARAYANAN, M.SARAVANAN |
Abstract: |
This paper presents a fuzzy-logic-based stand-alone solar photovoltaic (PV)
system with an isolated SEPIC converter. The system is designed for a 750 W
solar PV panel and feeds an average demand of 250 W. The system includes solar
panels, an MPPT (Maximum Power Point Tracking) controller, a SEPIC converter,
an energy storage system, and a single-phase VSI (voltage source inverter). The
SEPIC converter provides a constant DC bus voltage, and its duty cycle is
controlled by the MPPT controller, which is needed to improve the PV panel’s
utilization efficiency. A fuzzy logic controller (FLC) is used to generate the
PWM signal for the SEPIC converter to extract maximum power. The system is
simulated in MATLAB SIMULINK, and results are presented to demonstrate the
performance of the MPPT controller under various load conditions. |
Keywords: |
PV Energy System, Isolated SEPIC Converter, Fuzzy Logic Controller, Voltage
Source Inverter |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
DEVELOPING ARCHITECTURE FOR MONITORING OF PATIENTS USING INTELLIGENT AGENT-BASED
SYSTEM IN WIRELESS SENSOR NETWORK |
Author: |
R.RAJAVIGNESH, DR.G.THOLKAPPIA ARASU |
Abstract: |
Monitoring and diagnosing patients without directly accessing them is very
important in medical applications of wireless sensor networks. However,
collecting, transferring, and maintaining the data is a challenging task, so
there is an immense need to handle all this information quickly and
effectively. In this paper, we utilize agents in different processes such as
collecting data from the different sensors, transferring and storing it on the
appropriate servers, classifying the data, and giving prescriptions to the
patients. We organize the proposed architecture in three layers: the first is
the body area network layer, which stores the medical information obtained from
the body sensors of the patients; the second is the conveyer layer, which
transfers the medical information from one layer to another; and the final one
is the data analysis layer, which stores and analyzes the medical data and
gives the exact prescriptions for the patient’s medical information. Finally,
the remedy for the medical information is sent back through the conveyer layer.
The experimentation measures the rate of transferring medical data for a number
of patients, and our implementation is done with the help of JADE and JNS. |
Keywords: |
Architecture, Patients, Agent System, Wireless Sensor Network |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
AN ADAPTIVE THRESHOLD INTENSITY RANGE FILTER FOR REMOVAL OF RANDOM VALUE IMPULSE
NOISE IN DIGITAL IMAGES |
Author: |
S.SARAVANAKUMAR, A.EBENEZER JEYAKUMAR, K.N.VIJEYAKUMAR, NELSON KINGSLEY
JOEL |
Abstract: |
A novel approach for denoising digital images corrupted by impulse noise is
presented in this brief. The proposed approach uses an efficient technique to
identify pixels corrupted by random noise. This is done by setting an intensity
range for the center pixel of the selected window and checking whether the
number of pixels which fall within this range is above or below a specified
threshold. If the condition for an uncorrupted pixel fails in the selected
window, the window size is increased and the threshold is adaptively changed.
Experimental evaluation using MATLAB revealed that the proposed approach
demonstrates better Peak Signal to Noise Ratio (PSNR) improvement for higher
noise densities when compared to the best of the approaches used for comparison.
Visual interpretation of output images revealed that our approach preserved
edges and fine details when compared to the existing algorithms. |
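The detection rule described above can be sketched directly: set an intensity range around the centre pixel, count how many neighbours fall inside it, and grow the window before deciding (the range width, fraction threshold, and maximum window size here are illustrative, not the authors' tuned values):

```python
def is_corrupted(img, r, c, delta=30, min_frac=0.25, max_radius=2):
    """Flag (r, c) as impulse-corrupted if too few neighbours fall
    inside [p - delta, p + delta]; the window grows adaptively
    before the final decision is made."""
    p = img[r][c]
    radius = 1
    while radius <= max_radius:
        neighbours = [img[i][j]
                      for i in range(max(0, r - radius), min(len(img), r + radius + 1))
                      for j in range(max(0, c - radius), min(len(img[0]), c + radius + 1))
                      if (i, j) != (r, c)]
        inside = sum(abs(q - p) <= delta for q in neighbours)
        if inside / len(neighbours) >= min_frac:
            return False          # enough similar neighbours: clean pixel
        radius += 1               # adapt: enlarge window and re-check
    return True                   # consistently dissimilar: impulse

img = [[100, 102,  99, 101],
       [101, 255, 100,  98],     # 255 is a salt impulse
       [ 99, 100, 101, 100],
       [100,  99,  98, 102]]
assert is_corrupted(img, 1, 1)       # the impulse is detected
assert not is_corrupted(img, 2, 2)   # a clean pixel passes
```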
Keywords: |
Random Valued Impulse Noise, Intensity Range, Soft-switching, Rank order, Peak
Signal to Noise Ratio. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
DESIGN OF A MODIFIED FUZZY FILTERING FOR NOISE REDUCTION IN IMAGES |
Author: |
EHSAN AZIMIRAD, JAVAD HADDADNIA |
Abstract: |
Reducing noise in color images is a very active research area in image
processing. In this paper, a modified fuzzy-based image filtering algorithm is
proposed for reducing impulse noise; it presents a new fuzzy filter for the
removal of impulse noise in color images. To deal with the impulse noise, an
algorithm is developed to search for a set of uncorrupted pixels in the
neighborhood of the pixel of interest and to compute the median of this set. A
modified fuzzy filter consisting of two sub-filters with novel membership
functions is proposed to cancel out the impulse noise. The first sub-filter
detects the noisy pixel by utilizing three fuzzy membership functions defined
for this purpose. The corrupted pixels are then corrected using the median of
the noise-free pixels. The second sub-filter makes use of the relation between
the different color components of a pixel to remove the residual noise in the
color image. Simulation results show that the proposed fuzzy filter effectively
removes the noise while preserving fine details in the image. |
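The interplay of a fuzzy membership function and a median correction can be sketched on a single window (a generic illustration with a hypothetical triangular membership; the paper's two sub-filters and their actual membership functions are not reproduced here):

```python
def tri_membership(x, a, b):
    """Triangular fuzzy membership: 0 below a, 1 above b, linear between."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

def fuzzy_denoise_pixel(window, a=20, b=60):
    """One step of a fuzzy impulse filter: derive a fuzzy 'noisiness'
    degree for the centre pixel from its distance to the window
    median, then blend centre and median accordingly.  The bounds
    a and b are illustrative, not the paper's tuned values."""
    centre = window[len(window) // 2]
    med = sorted(window)[len(window) // 2]
    noisiness = tri_membership(abs(centre - med), a, b)
    return round((1 - noisiness) * centre + noisiness * med)

# A 3x3 window flattened row by row; the centre pixel is an impulse:
assert fuzzy_denoise_pixel([100, 98, 101, 99, 255, 100, 97, 103, 100]) == 100
# A clean centre pixel is left untouched:
assert fuzzy_denoise_pixel([100, 98, 101, 99, 99, 100, 97, 103, 100]) == 99
```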
Keywords: |
Impulse Noise, Fuzzy Filter, Reducing Noise, Median Filter, Membership Functions |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
A TECHNIQUE FOR ICI CANCELLATION USING BAYESIAN PROBABILITY AND PARALLEL
CANCELLATION (BPICI) |
Author: |
K. CHINNUSAMY, K. DURAISWAMY |
Abstract: |
In the past decade, the detection and mitigation of intercarrier interference
(ICI) in MIMO-OFDM systems has been studied by various researchers due to the
presence of frequency offset. The literature presents several works on ICI
cancellation, such as iterative receivers, parallel interference cancellation
(PIC), and so on. Among the various techniques, researchers have concentrated
on PIC detectors, a non-linear multi-user detection technique wherein the
transmitted signal of each user is detected in parallel over a number of
iterations. In this paper, a Bayesian Probability based Intercarrier
Interference Cancellation technique is proposed. The technique employs a
Bayesian probability computation process and parallel cancellation. The
transmitted signals are received by the receiver section, which consists of two
cancellation stages. In the first stage, hard-decision-based decoding is
carried out for the spatial dimensions, and in the second, Bayesian probability
is employed after parallel interference cancellation. For evaluation purposes,
BER curves are plotted. Various curves are obtained by varying the fading
channel, the number of user signals, and the number of antennas. A comparative
analysis is also made against other techniques such as PIC and MVDR. From the
results, we can see that our proposed technique achieves better performance
with a lower BER. |
Keywords: |
Multiple- Input Multiple-Output (MIMO), Orthogonal Frequency Division
Multiplexing (OFDM), Intercarrier Interference, PIC, Bayesian Probability, BER |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
EFFICIENT MULTIPATH LOCATION AWARE ROUTING PROTOCOL FOR MOBILE AD HOC NETWORKS |
Author: |
DR.A.RAJARAM, K.VINOTH KUMAR |
Abstract: |
A mobile ad hoc network consists of mobile nodes where no infrastructure
exists. If a node wants to communicate with another node, it needs to obtain
that node's location in order to forward the data packets. Due to high
mobility, communication degrades because of a lower packet delivery rate.
Previous location-aware routing protocols do not deploy multipath routing
together with location updating. In this paper, we propose the Efficient
Multipath Location Aware Routing Protocol (EMLARP) to improve the packet
delivery rate based on location updates of mobile nodes and multipath routing.
The proposed work consists of three parts. In the first part, cluster-enhanced
multipath routing is introduced to improve link quality and load balancing. In
the second part, multipath routes are predicted based on the given topology;
moreover, multipath routing messages are transmitted to obtain a set of
node-disjoint paths. In the third part, mobile node locations are updated with
the use of a network progression ratio to optimize the network cost. Based on
simulation results from the Network Simulator (NS 2.34), the proposed protocol
achieves a higher packet delivery rate, longer network lifetime, lower delay,
lower packet delivery delay, and lower communication overhead than existing
schemes. |
Keywords: |
Location Aware, Multipath Routing, Cluster Routing, Link Quality, Load
Balancing, Packet Delivery Rate, Network Cost And Delay |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
FALSE POSITIVE REDUCTION IN COMPUTER AIDED DETECTION OF MAMMOGRAPHIC MASSES
USING CANONICAL CORRELATION ANALYSIS |
Author: |
R. LAVANYA, N. NAGARAJAN, M. NIRMALA DEVI |
Abstract: |
X-ray mammography is the most widely used modality for screening breast cancer
in the early stages. Computer aided detection (CADe) systems intend to help
radiologists in improving the detection rate. However, the drawback of CADe
systems is that they result in a high false positive rate (FPR). In this paper,
a new feature-fusion-based system is proposed for classifying automatically
detected masses in a mammogram as true masses or false positive cases. In this
system, unilateral and bilateral information is fused using a multivariate
statistical technique called canonical correlation analysis (CCA). The proposed
system is validated using a public database called the mammographic image
analysis society (MIAS) database. When compared to unilateral, bilateral and
conventional-fusion based systems, the overall classification performance of the
proposed system is higher by a range of 8%-16%, 12%-16% and 14%-28% in terms of
accuracy, area under curve (AUC) and equal error rate (EER), respectively.
Further, the reduction in FPR for the proposed system is at least 39%, 35% and
33% at true positive rates (TPRs) of 60%, 65% and 70%, respectively. |
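The fusion step can be illustrated with a generic, NumPy-only canonical correlation computation between two feature sets, e.g. a unilateral and a bilateral feature matrix (an illustrative sketch; the paper's actual CADe pipeline, feature sets, and use of CCA are not reproduced here):

```python
import numpy as np

def cca_correlations(X, Y, reg=1e-6):
    """Canonical correlations between column-centred feature sets X
    and Y, via the SVD of the whitened cross-covariance matrix."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])   # regularised covariances
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n

    def inv_sqrt(C):
        w, V = np.linalg.eigh(C)
        return V @ np.diag(w ** -0.5) @ V.T

    M = inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy)
    return np.clip(np.linalg.svd(M, compute_uv=False), 0.0, 1.0)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                       # e.g. unilateral features
Y = X @ rng.normal(size=(3, 2)) + 0.01 * rng.normal(size=(200, 2))
rho = cca_correlations(X, Y)
assert rho[0] > 0.99                                # strongly correlated views
```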
Keywords: |
Biomedical Image Processing, Cancer Detection, Decision Support System, False
Positive Reduction, Mammography. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
SELECTION AND AGGREGATION OF INTERESTINGNESS MEASURES: A REVIEW |
Author: |
KOK KEONG BONG, MATTHIAS JOEST, CHRISTOPH QUIX |
Abstract: |
Association Rule Mining is the process of retrieving frequent patterns that
occur in a transaction database. Initially used as a market basket analysis
solution for retail businesses, it has grown to cover many other fields such as
medicine [1, 2], traffic estimation [3] and anomaly detection [4, 5]. An
association rule has two components (antecedent and consequent) which are
derived from a pattern (a set of items). However, when investigating a frequent
item set, it is not clear which items imply the others (i.e., which is the
antecedent and which is the consequent). Therefore, several combinations of
items as antecedent and consequent are generated. This leads to a huge number
of association rules being output by an Association Rule Mining algorithm.
Thus, data miners require some type of measure to evaluate the
“interestingness” of these rules. There exist in excess of 70 well-known
measures and countless other
manually crafted measures in the literature. In this survey, we systematically
discuss the methods which users could use to select or aggregate the
interestingness measures, applicability of such methods and evaluation of the
usage of such methods. |
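Three of the classic objective interestingness measures — support, confidence, and lift — can be computed directly from a transaction database (a standard textbook sketch on toy data, not one of the aggregation methods the survey discusses):

```python
def measures(transactions, antecedent, consequent):
    """Support, confidence and lift for the rule antecedent => consequent."""
    n = len(transactions)
    a, c = frozenset(antecedent), frozenset(consequent)
    n_a = sum(a <= t for t in transactions)          # transactions with A
    n_c = sum(c <= t for t in transactions)          # transactions with C
    n_ac = sum((a | c) <= t for t in transactions)   # transactions with both
    support = n_ac / n
    confidence = n_ac / n_a
    lift = confidence / (n_c / n)                    # >1 means positive association
    return support, confidence, lift

baskets = [frozenset(t) for t in
           [{"bread", "butter"}, {"bread", "butter", "milk"},
            {"bread"}, {"milk"}, {"bread", "milk"}]]
s, c, l = measures(baskets, {"bread"}, {"butter"})
assert s == 0.4 and c == 0.5 and abs(l - 1.25) < 1e-9
```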
Keywords: |
Association Rule Mining, Objective Interestingness Measures, Data Mining,
Clustering, Information Retrieval |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
SIGNATURE BASED MINING FRAMEWORK FOR EVENT SEQUENCES AND ITS APPLICATIONS IN
HEALTHCARE DATA USING LANGUAGE MODEL |
Author: |
D.VETRITHANGAM, Dr. N.UMA MAHESHWARI, Dr. R.VENKATESH |
Abstract: |
Temporal event signature mining for knowledge discovery is a difficult problem.
In this paper a framework is designed to discover temporal knowledge through
large-scale signature mining of longitudinal heterogeneous event data. This
framework mainly deals with mining the high-order latent event structure and
its relationships within single and multiple event sequences. Here, the
heterogeneous event sequences are mapped to geometric images by encoding the
structure as a spatial-temporal shape process. Probabilistic language modeling
is used to extract high-order events from large-scale data. We also present a
doubly constrained conventional sparse coding to learn interpretable and
shift-invariant latent temporal event signatures, exploiting sparsity in the
data and a latent factor model on the β-divergence. An optimization scheme is
also used to perform large-scale incremental learning of group-specific
temporal event signatures. |
Keywords: |
Data mining(DM), Signature mining(SM), knowledge representation(KR),
Probabilistic Model(PM) , Cluster (C) |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
GENETIC-FIREFLY ALGORITHM TO CONTROL LOAD FLOW OF POWER SYSTEM BY OPTIMAL
LOCATION AND CAPACITY OF UPFC |
Author: |
G.GOKULAKRISHNAN, DR.V. RAMESH |
Abstract: |
An improved firefly algorithm is investigated with the aid of a genetic
algorithm (GA). The proposed algorithm improves the loadability of a power
system with a unified power flow controller (UPFC). The random movement factor
of the firefly algorithm is improved by hybridizing the GA with the classical
firefly algorithm. In the firefly algorithm, the next movement of a firefly
depends on a movement factor that is determined randomly, so the best movement
may fail due to the distribution of the random numbers, and the best location
and capacity of the UPFC can then never be recognized accurately. In this
paper, a GA-based optimization algorithm is therefore used to determine the
optimal random movement factor of the fireflies. Thus, the optimal location and
capacity of the UPFC are determined more efficiently than with the traditional
firefly algorithm. The proposed method is implemented in MATLAB, and the
optimal location and capacity of the UPFC are examined with respect to the
variation of voltage, power loss, and power balance of the network. The load
power control performance of the proposed method is compared with that of the
classical firefly algorithm. |
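The random movement factor that the paper proposes to tune with a GA appears as the `alpha` term in the classical firefly update, sketched here on a toy 1-D minimisation problem (a generic illustration of the baseline algorithm, not the authors' GA hybrid or their UPFC placement model):

```python
import math
import random

def firefly_step(xs, f, beta0=1.0, gamma=1.0, alpha=0.2, rng=random):
    """One iteration of the classical firefly algorithm: each firefly
    moves toward every brighter (better) one; alpha*(rand - 0.5) is
    the random movement term."""
    new = list(xs)
    for i, xi in enumerate(xs):
        for j, xj in enumerate(xs):
            if f(xj) < f(xi):                         # firefly j is brighter
                beta = beta0 * math.exp(-gamma * (xi - xj) ** 2)
                new[i] += beta * (xj - new[i]) + alpha * (rng.random() - 0.5)
    return new

rng = random.Random(1)
f = lambda x: x * x                                   # toy objective
xs = [rng.uniform(-5, 5) for _ in range(8)]
best0 = min(f(x) for x in xs)
for _ in range(50):
    xs = firefly_step(xs, f, rng=rng)
assert min(f(x) for x in xs) <= best0   # the brightest firefly never worsens
```

Note that the globally brightest firefly sees no brighter neighbour and so does not move; everything else drifts toward it, perturbed by the `alpha` term, which is exactly the factor whose distribution the paper seeks to optimise.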
Keywords: |
Improved Firefly, GA, Load Ability, UPFC, Location, Capacity, And Load Variation |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
EVALUATION OF LINK QUALITY FOR ROUTING IN DSR |
Author: |
N.PRASATH, P.SENGOTTUVELAN |
Abstract: |
MANETs are networks comprising nodes that communicate with each other without a
network infrastructure; their advantage is that they can operate alone or in
coordination with a wired infrastructure, usually through gateway nodes that
relay traffic between the two networks. Application areas include battlefield
deployment, rescue work, and civilian applications like outdoor meetings or
ad-hoc classrooms. This paper presents a process to evaluate link quality in
order to improve routing using the Dynamic Source Routing protocol. By
considering wireless link quality, a routing algorithm can choose better paths.
In this paper, the effect of link-quality metrics on performance is evaluated,
and the metrics are studied using a DSR-based routing protocol. |
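One widely used link-quality metric that can drive route selection in a DSR-style protocol is ETX, the expected transmission count, computed from forward and reverse packet delivery ratios (a generic metric sketch with hypothetical link values; the paper evaluates its own set of metrics):

```python
def etx(forward_ratio, reverse_ratio):
    """Expected transmission count of a link: 1 / (df * dr), where df
    and dr are the forward and reverse packet delivery ratios."""
    return 1.0 / (forward_ratio * reverse_ratio)

def best_route(routes, link_ratios):
    """Pick the route whose summed per-link ETX is smallest."""
    def cost(route):
        return sum(etx(*link_ratios[link]) for link in route)
    return min(routes, key=cost)

# Hypothetical links: (forward delivery ratio, reverse delivery ratio)
link_ratios = {"AB": (0.9, 0.9), "BC": (0.8, 1.0), "AC": (0.5, 0.6)}
route = best_route([["AB", "BC"], ["AC"]], link_ratios)
assert route == ["AB", "BC"]   # two good hops beat one lossy hop
```

A hop-count metric would have preferred the single lossy link AC; a quality-aware metric prefers the two reliable hops.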
Keywords: |
MANET (Mobile ad hoc network), DSR (Dynamic source routing protocol), Link
quality |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
SEMANTIC-WEB-BASED SEARCHING APPLICATION FOR DOCTORS SCHEDULE AND FACILITIES IN
HOSPITAL |
Author: |
A.B. MUTIARA, T. PUTRI, W. SILFIANTI, A. MUSLIM, T. OSWARI |
Abstract: |
A hospital, as a health supporting medium, holds key information such as the
praxis schedules of its doctors and the availability of its facilities. Each
hospital has its own format for delivering this information. Given the
diversity of the available information, a technology is needed that can combine
and unify nearly identical information and then present it to the user in a
form that is relevant to the intended context. The technology used here is an
ontology-based semantic-web search engine. The semantic web method with an
ontology approach is capable of understanding not only the meaning of a word
and a concept but also the logical relationships between them. This paper
explains the development of a search engine with two kinds of data: data
grabbed directly (live data) using the concept of the Web Ontology Language
(OWL), and data entered manually (dummy data) using the concept of the Resource
Description Framework (RDF). |
Keywords: |
Doctor, Ontology, OWL, RDF, Semantic Web |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
|
Title: |
CLASSIFICATION OF DIABETIC RETINOPATHY PATIENTS USING SUPPORT VECTOR MACHINES (SVM)
BASED ON DIGITAL RETINAL IMAGE |
Author: |
MUHAMMAD FAISAL, DJOKO WAHONO, I KETUT EDDY PURNAMA, MOCHAMMAD HARIADI, MAURIDHI
HERY PURNOMO |
Abstract: |
Diabetic retinopathy is a microvascular complication characterized by several
changes in the retina: changes in blood-vessel diameter, microaneurysms,
hemorrhages, exudates, and the growth of new blood vessels. These changes need
to be detected early so that steps for further handling and treatment can be
determined.
Laser therapy is one of the common therapies for patients with diabetic
retinopathy. It relies on manual examination of the scanned retinal fundus
image, and manual examinations depend on the ophthalmologist's eyesight and
therefore differ from one examiner to another. To overcome this problem, a
dedicated program is needed to analyze the fundus image of the eye.
Creating such a program requires several stages of research. The study begins
by preprocessing eye fundus images, removing the optic disc from the fundus of
the eye, and then separating the vascular tissue from the damaged area of the
retina. Damaged areas of the retina consist of dark and bright lesions.
Mathematical morphology methods are used to detect the presence of dark
lesions. To detect bright lesions, a combination of mathematical morphology,
background estimation, color analysis, max-tree representation, and attribute
filters is used via a branch-filtering approach. Fundus image segmentation
results are extracted and classified using Support Vector Machines (SVM) based
on microaneurysm and exudate features. Eye fundus images are classified into
Mild Non-Proliferative Diabetic Retinopathy, Moderate Non-Proliferative
Diabetic Retinopathy, and Severe Non-Proliferative Diabetic Retinopathy.
The novelty of this research lies in using max-tree representation and
attribute filtering to enhance image quality for exudate segmentation.
Classification experiments on patients with diabetic retinopathy yielded
sensitivity, specificity, and AUC above 90%, indicating that the research could
help ophthalmologists analyze a retina affected by diabetic retinopathy. The
results showed 96.9% sensitivity, 100% specificity, 100% positive predictive
value (PPV), 88.19% negative predictive value (NPV), and an AUC of 0.985. |
Keywords: |
Fundus Features, Diabetic Retinopathy, Classification, SVM |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
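The final grading step described above can be sketched as a linear one-vs-rest decision over extracted feature counts. The weights, biases, and feature vector below are hypothetical placeholders, not the trained SVM from the paper:

```python
# Toy one-vs-rest linear grading on two features:
# x = [microaneurysm count, exudate count]. Score = w . x + b per grade.
GRADES = ["Mild NPDR", "Moderate NPDR", "Severe NPDR"]

WEIGHTS = {                      # (w, b), hand-set for illustration
    "Mild NPDR":     ([1.0, -0.5], 2.0),
    "Moderate NPDR": ([0.5,  0.5], 0.0),
    "Severe NPDR":   ([-0.2, 1.0], -3.0),
}

def classify(features):
    """Pick the grade whose linear decision function scores highest."""
    def score(grade):
        w, b = WEIGHTS[grade]
        return sum(wi * xi for wi, xi in zip(w, features)) + b
    return max(GRADES, key=score)

mild = classify([1, 0])     # few lesions -> "Mild NPDR"
severe = classify([2, 10])  # many exudates -> "Severe NPDR"
```

A real SVM would learn these parameters from labeled fundus images (and typically use a kernel), but the decision rule at inference time has this argmax-over-scores shape.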
|
Title: |
AN ANFIS BASED PATTERN RECOGNITION SCHEME USING RETINAL VASCULAR TREE – A
COMPARISON APPROACH WITH RED-GREEN CHANNELS |
Author: |
G. LALLI, Dr. D. KALAMANI, N. MANIKANDAPRABU |
Abstract: |
Research Work: This article focuses on the classification and detection of
retinal blood vessels in digitized retinal images. Extraction of the red and
green color channels leads to enhanced identification of retinal blood vessels,
assessed through Receiver Operating Characteristic (ROC) performance measures.
Processes with Methodologies: The proposed system uses digital images from the
diaretdb0 and diaretdb1 databases, whose retinal vascular structures are very
precise. Each resized image undergoes two separate color-channel conversions
(the red and green channels in particular). The extracted features are enhanced
with the CLAHE (Contrast Limited Adaptive Histogram Equalization) technique,
and morphological operations are performed for optic disc, noise, and border
removal. The difference between the MATLAB-based erosion and dilation results
is computed on the basis of Canny edge detection with a threshold value. The
performance of the system is measured through ROC analysis, with sensitivity
(Se) and specificity (Sp) computed from true positives, true negatives, false
positives, and false negatives, and used to determine 'authorization' or
'unauthorization'. The count of pixels indicates the area of the blood vessels;
features based on the red and green channels are calculated to gauge the
accuracy of the retinal vascular structure of an image.
Result: ROC-based performance measurement is applied to retinal blood vessel
identification. This systematic process may offer a new perspective on the
classification, detection, and identification of retinal blood vessels. |
Keywords: |
Retinal Vascular Structure, ROC, diaretdb0, diaretdb1, Threshold Value, CLAHE,
Morphological Operations |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
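The ROC quantities the abstract evaluates reduce to two ratios over the confusion counts. A minimal sketch, with illustrative counts rather than the paper's results:

```python
# Sensitivity and specificity from confusion-matrix counts.
def sensitivity(tp, fn):
    """Fraction of actual positives correctly detected (Se = TP/(TP+FN))."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of actual negatives correctly rejected (Sp = TN/(TN+FP))."""
    return tn / (tn + fp)

se = sensitivity(tp=45, fn=5)   # 45 of 50 positives found
sp = specificity(tn=38, fp=2)   # 38 of 40 negatives rejected
```

Sweeping the detector's threshold and plotting (1 - Sp, Se) at each setting traces the ROC curve used in the evaluation.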
|
Title: |
OPTIMIZED MULTICAST ROUTING SCHEME FOR MOBILE AD HOC NETWORKS |
Author: |
DR.A.RAJARAM, S.GOPINATH |
Abstract: |
Mobile ad hoc networks (MANETs) are an integral part of wireless networking,
and their popularity has grown enormously in the past few years. Because of
node mobility and the absence of fixed infrastructure, ad hoc networks are
inherently vulnerable: link, path, and node failures may occur and disrupt
communication between users, and malicious attackers can further damage network
connectivity and spread false information among the mobile nodes. Several
approaches have been developed for more efficient routing, but none addresses
node, link, and path failures together with malicious activity at the same
time. We propose an Optimized Multicast Routing Scheme (OMRS) to strike a
balance between these issues. In the first phase, detection and avoidance of
malicious attacks is implemented using a predetermined trust value based on
node characteristics; we also characterize malicious attacks in linear network
systems. In the second phase, the stability ratio of links, paths, and nodes is
determined and held against a threshold value, which ensures resilience to path
failures. With these solutions, we achieve better stability and node
connectivity toward the ultimate goal of a multicast routing scheme. We
implement the proposed scheme in the Network Simulator (NS-2.34) environment.
Extensive simulation results show that the proposed scheme achieves a better
delivery ratio, detection ratio, and probability of failure occurrence, with
lower communication overhead and end-to-end delay, than ODMRP and BDP. |
Keywords: |
Multicast, Stability Ratio, Malicious Attackers, Detection Ratio, Delivery
Ratio, Communication Overhead, End To End Delay, Threshold Value, Node
Familiarity And Node Proposal. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
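The two phases the abstract outlines can be sketched as simple threshold checks. The trust values, stability values, and thresholds below are hypothetical, standing in for whatever the scheme actually computes:

```python
# Phase 1: exclude nodes whose trust value falls below a threshold.
# Phase 2: accept a path only if every link meets a stability threshold.
TRUST_THRESHOLD = 0.5
STABILITY_THRESHOLD = 0.7

def trusted_nodes(trust, threshold=TRUST_THRESHOLD):
    """Keep only nodes regarded as non-malicious."""
    return {n for n, t in trust.items() if t >= threshold}

def path_stable(path, link_stability, threshold=STABILITY_THRESHOLD):
    """A path is usable if all its consecutive links are stable enough."""
    links = zip(path, path[1:])
    return all(link_stability.get(link, 0.0) >= threshold for link in links)

trust = {"A": 0.9, "B": 0.3, "C": 0.8, "D": 0.75}
ok_nodes = trusted_nodes(trust)                    # B is excluded

stability = {("A", "C"): 0.9, ("C", "D"): 0.8}
usable = path_stable(["A", "C", "D"], stability)   # both links stable
```

In the real scheme these checks would feed the multicast tree construction, so that forwarding only ever uses trusted nodes over stable links.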
|
Title: |
SECURE AUTHENTICATION BASED MULTIPATH ROUTING PROTOCOL FOR WSNS |
Author: |
N.SUMA, DR.T.PURUSOTHAMAN |
Abstract: |
Wireless sensor networks consist of nodes with limited processing capability,
small memory, and a low-energy power source. These nodes are deployed randomly,
and often densely, in the environment. To prevent malicious activity, a secure
authentication scheme is required. Here we propose the Secure Authentication
based Multipath Routing Protocol (SAMRP) for improving network lifetime and
providing data integrity in WSNs. It consists of three phases. In the first
phase, multipath routing is integrated to ensure load balancing and avoid
isolated failures. In the second phase, an encryption and decryption scheme is
implemented to provide better authentication; three types of iterations are
used during this authentication phase. In the third phase, a packet format is
proposed for monitoring integrity and authentication status, so that an
efficient secure multipath route can be chosen to improve network performance.
Simulation results show that the proposed SAMRP achieves a better data delivery
rate, improved network lifetime, a higher packet integrity rate, and lower
end-to-end delay and overhead, in terms of mobility, pause time, throughput,
and number of nodes, than our previous scheme EMRTEM and the existing scheme
ADAPT. |
Keywords: |
WSN, SAMRP, Encryption And Decryption, Multipath Routing, Packet Integrity Rate,
Network Lifetime, End To End Delay, Communication Overhead, Throughput And Data
Delivery Rate. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
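The integrity-and-authentication idea in the second and third phases can be sketched with a standard HMAC tag attached to each packet. This uses Python's stdlib `hmac` as a stand-in for the paper's own encryption scheme; the key and packet fields are illustrative:

```python
# Attach an HMAC-SHA256 tag to each packet; receivers recompute and
# compare it to detect tampering (the "authentication status" field).
import hashlib
import hmac

KEY = b"shared-secret"  # hypothetical pre-shared key

def make_packet(payload: bytes) -> dict:
    tag = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload, "auth": tag}

def verify_packet(packet: dict) -> bool:
    expected = hmac.new(KEY, packet["payload"], hashlib.sha256).hexdigest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, packet["auth"])

pkt = make_packet(b"sensor reading 42")
tampered = {"payload": b"sensor reading 99", "auth": pkt["auth"]}
```

A failed check on any path lets the source fall back to another of the multipath routes, which is the interaction between the protocol's phases.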
|
Title: |
GLOVE BASED COMMUNICATION SYSTEM FOR HEARING AND SPEECH IMPAIRED COMMUNITY USING
SIGN LANGUAGE INTERPRETATION |
Author: |
S. SAYEED, J. HOSSEN, A. HUDAYA, M. F. A. ABDULLAH, N. MUHAIMAN, I. YUSOF |
Abstract: |
This paper highlights the diversified utility of the data glove as a
rehabilitative communication tool and as a novel means of secure authentication
for the hearing- and speech-impaired community. The system can also serve as a
robotic-control interface for aged and disabled people, enabling robots to act
according to the given signs. In this novel mode of communication, the
conventional sign language used by the deaf community is combined with an
electronic glove as the medium of communication, from which signals are
captured and the intended message is interpreted. Experiments using the
Singular Value Decomposition (SVD) method of feature selection showed evidence
of the potential of this idea. Data-glove experiments on identifying
sign-language communication and identifying individual people produced
significant, distinguishable models suitable for communication and
authentication systems. |
Keywords: |
SVD, American Sign Language, Data Glove, Euclidean Distance, Chebyshev
Distance, Mahalanobis Distance, Minkowski Distance |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 1 -- 2014 |
Full
Text |
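The distance metrics listed in the keywords suggest that recognition boils down to nearest-template matching on glove sensor vectors. A minimal sketch, with hypothetical sign templates and readings (the paper's actual features come from SVD-based selection):

```python
# Match a glove sensor reading to the nearest sign template under a
# chosen distance metric.
def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def chebyshev(a, b):
    return max(abs(x - y) for x, y in zip(a, b))

def minkowski(a, b, p=3):
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)

TEMPLATES = {  # mean flex-sensor vector per sign (hypothetical values)
    "hello":     [0.1, 0.9, 0.8, 0.2, 0.1],
    "thank_you": [0.7, 0.2, 0.1, 0.6, 0.8],
}

def recognise(reading, metric=euclidean):
    """Assign the sign whose template is nearest under the metric."""
    return min(TEMPLATES, key=lambda s: metric(reading, TEMPLATES[s]))

sign = recognise([0.15, 0.85, 0.75, 0.25, 0.1])
```

The same nearest-template comparison, applied to per-user templates instead of per-sign ones, is how the glove could serve the authentication role the abstract describes.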
|