Submit Paper / Call for Papers
The journal receives papers in a continuous flow and considers articles from a
wide range of Information Technology disciplines, from the most basic research
to the most innovative technologies. Please submit your papers electronically
through our submission system at http://jatit.org/submit_paper.php in MS Word,
PDF or a compatible format so that they may be evaluated for publication in the
upcoming issue. This journal uses a blinded review process; please remember to
include all your personally identifiable information in the manuscript before
submitting it for review, and we will edit out the necessary information on our
side. Submissions to JATIT should be full research / review papers (properly
indicated below the main title).
Journal of Theoretical and Applied Information Technology
December 2013 | Vol. 58 No. 1 |
Title: |
MODELING OF BROADBAND LIGHT SOURCE FOR OPTICAL NETWORK APPLICATIONS USING FIBER
NON-LINEAR EFFECT |
Author: |
G GEETHA, I LAKSHMI PRIYA, M MEENAKSHI |
Abstract: |
The vision of establishing an all-fiber configuration is the motivation behind
this work. With the increasing need for high-capacity communication systems,
broadband sources have become a necessity. Broadband optical sources are an integral
integral part of multichannel high speed fiber optical communication networks
based on all-optical WDM and OCDM. FWM (Four-Wave Mixing) effects and SC (Super
Continuum) phenomenon in fibers are used in the design of broadband optical
sources. The spectral slicing of the broadband spectra has been proposed in
literature as a simple technique to create multi-wavelength optical sources for
wavelength division multiplexing applications. The objective of this work is to
develop an accurate model for simulating FWM and SC based broadband optical
spectra and compare their performances. The modeling work is carried out using
SIMULINK in MATLAB 7.10.0 (R2010a). |
Keywords: |
Four Wave Mixing, Super Continuum, Non Linearity, SMF, DSF, PCF |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
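As background on the four-wave mixing process that underlies such broadband sources, the short sketch below enumerates the new optical frequencies generated by three co-propagating pumps (f = f_i + f_j - f_k). The pump wavelength grid and the use of Python are illustrative assumptions only; this is not the authors' SIMULINK model.

    # Minimal FWM sketch (not the authors' SIMULINK model): list the new optical
    # frequencies created by four-wave mixing among three pump waves.
    # New products obey f = f_i + f_j - f_k.
    C = 299792458.0  # speed of light, m/s

    pump_wavelengths_nm = [1549.2, 1550.0, 1550.8]  # illustrative pump grid
    pump_freqs = [C / (lam * 1e-9) for lam in pump_wavelengths_nm]

    products = set()
    for i, fi in enumerate(pump_freqs):
        for j, fj in enumerate(pump_freqs):
            for k, fk in enumerate(pump_freqs):
                if k != i and k != j:  # skip terms that simply reproduce a pump
                    products.add(fi + fj - fk)

    for f in sorted(products - set(pump_freqs)):
        print(f"FWM product at {C / f * 1e9:.2f} nm")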
|
Title: |
OPTIMIZED RESOURCE BLOCK ALLOCATION AND SCHEDULING FOR REAL-TIME SERVICES IN LTE
NETWORKS |
Author: |
KANAGASUNDARAM.K, G.M. KADHAR NAWAZ |
Abstract: |
In recent times, there has been a huge increase in the demand for real-time
multimedia applications from mobile users. In this paper, we propose an
Optimized Resource Block Allocation and Scheduling technique for real-time
Services in LTE networks. This technique considers both the resource block
allocation and the scheduling process. The resource block allocation considers
the instantaneous data rate and the average data rate. It will allocate the
resources that are required to perform the real-time connection. If the
resources are busy, then the user connection is scheduled using the lower level
of the scheduler. The scheduler has a timer based on which the user connections
are updated. In the scheduling period, the available resources are assigned to
the user. The advantage of this approach is that it is possible to assign the
reserved blocks to real-time users so that average throughput is improved. By
simulation results, we show that the proposed technique improves received
bandwidth and fairness while reducing the delay and packet drops. |
Keywords: |
Long Term Evolution (LTE) Networks, Quality of Service (QoS), Call Admission
Control (CAC), Resource Block (RB) Allocation |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
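The abstract's use of both the instantaneous and the average data rate is reminiscent of proportional-fair scheduling. The sketch below is a generic proportional-fair resource block allocator given purely as background; the metric, the averaging constant and the function names are illustrative assumptions, not the authors' exact algorithm.

    # Generic proportional-fair RB allocation sketch (illustrative only; not the
    # authors' exact scheme). Each resource block goes to the user with the
    # highest ratio of instantaneous rate to exponentially averaged rate.
    def allocate_rbs(inst_rate, avg_rate, num_rbs, alpha=0.05):
        """inst_rate[u][rb]: achievable rate of user u on that RB;
        avg_rate[u]: running average throughput of user u."""
        allocation = {}
        for rb in range(num_rbs):
            best_user = max(range(len(avg_rate)),
                            key=lambda u: inst_rate[u][rb] / max(avg_rate[u], 1e-9))
            allocation[rb] = best_user
            # Update each user's average with the rate actually served this RB.
            for u in range(len(avg_rate)):
                served = inst_rate[u][rb] if u == best_user else 0.0
                avg_rate[u] = (1 - alpha) * avg_rate[u] + alpha * served
        return allocation

    # Two users, three resource blocks (rates in Mbit/s).
    inst = [[10.0, 4.0, 6.0], [8.0, 9.0, 2.0]]
    print(allocate_rbs(inst, avg_rate=[5.0, 5.0], num_rbs=3))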
|
Title: |
A NEURAL NETWORK-BASED SVPWM CONTROLLER FOR A TWO-LEVEL VOLTAGE-FED INVERTER
INDUCTION MOTOR DRIVE |
Author: |
AHMED A. HASSAN, SAMIR DEGHEDIE, MOHAMED EL HABROUK |
Abstract: |
In this paper, a detailed description of a neural network-based implementation
of the SVPWM algorithm for two-level voltage source inverters is proposed using
a modular approach which facilitates the expansion of the scheme to higher-level
SVM. Each step in the SVPWM algorithm is achieved using a simple
feed-forward artificial neural network that consists of one or more layers.
Simulation results are provided by employing the proposed scheme inside a closed
loop V/Hz drive system. |
Keywords: |
Space Vector Modulation (SVM), Artificial Neural Networks (ANNs), Two-Level
Inverter |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
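For readers unfamiliar with the conventional SVPWM steps that the paper maps onto neural sub-networks, the sketch below shows the usual first two steps, sector identification and dwell-time calculation, for a two-level inverter. It is a textbook formulation under an assumed DC-link voltage and switching period, not the authors' ANN implementation.

    import math

    # Textbook SVPWM helper steps (sector + dwell times) for a two-level inverter.
    # Illustrative only -- the paper realizes these steps with feed-forward ANNs.
    def svpwm_dwell_times(v_ref, theta, v_dc=400.0, t_s=100e-6):
        """v_ref: reference vector magnitude (V), theta: its angle in rad [0, 2*pi)."""
        sector = int(theta // (math.pi / 3)) + 1      # sectors 1..6
        alpha = theta - (sector - 1) * math.pi / 3    # angle inside the sector
        k = math.sqrt(3) * t_s * v_ref / v_dc
        t1 = k * math.sin(math.pi / 3 - alpha)        # time on the first active vector
        t2 = k * math.sin(alpha)                      # time on the second active vector
        t0 = t_s - t1 - t2                            # remaining time on zero vectors
        return sector, t1, t2, t0

    print(svpwm_dwell_times(v_ref=180.0, theta=math.radians(75)))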
|
Title: |
DESIGN AND MODELING OF MB-OFDM UWB WITH DIGITAL DOWN CONVERTER AND DIGITAL UP
CONVERTER FOR POWER LINE COMMUNICATION IN THE FREQUENCY BAND OF 50 MHZ TO 578
MHZ |
Author: |
R.KALAIVANI, Dr.N.J.R.MUNIRAJ |
Abstract: |
Over the last few years, Power Line Communication (PLC) has gained importance
for high-speed data communication. The major concerns in PLC are noise and
achieving a high data rate; several schemes have been adopted to minimize noise
over PLCs and improve the data rate. One of the promising approaches is the use
of Multi Band (MB) Orthogonal Frequency Division Multiplexing (OFDM) Ultra Wide
Band (UWB) to achieve data rates over 200 Mbps. One of the approaches is to
integrate MB-OFDM UWB with a Digital Down Converter (DDC) for power line
communication. In this work, the MB-OFDM UWB system is integrated with a DDC at
the transmitter, and a DUC is integrated at the receiver to achieve a higher
data rate. The integrated system, modeled using Matlab and Simulink, achieves a
BER of 10^-3 and a THD of 1.2. A rate-2/3 convolutional encoder, scrambler, bit
interleaver, GMSK modulator and IFFT have been used to generate MB-OFDM, while a
time-frequency kernel with frequency translation achieves UWB. A DDC consisting
of CIC and CFIR filters is used to down-convert the UWB signal so that it is
compatible with PLC. |
Keywords: |
MBOFDM, UWB, PLC, Integrated System, Down Converter, High Data Rate |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
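Since the DDC in the abstract is built from CIC and CFIR stages, the sketch below shows a single-stage CIC decimator in plain Python as background on how such a down-converter reduces the sample rate. The decimation factor and input samples are illustrative assumptions, not the authors' Simulink design.

    # Single-stage CIC decimator sketch (integrator -> decimate by R -> comb).
    # Illustrative background only; the paper's DDC is a Simulink CIC/CFIR design.
    def cic_decimate(x, R):
        # Integrator: running sum over the whole input.
        acc, integ = 0, []
        for sample in x:
            acc += sample
            integ.append(acc)
        # Keep every R-th sample, then apply the comb (first difference).
        decimated = integ[R - 1::R]
        out, prev = [], 0
        for v in decimated:
            out.append(v - prev)   # equals the sum of R consecutive input samples
            prev = v
        return out

    print(cic_decimate([1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3], R=4))  # -> [4, 8, 12]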
|
Title: |
RSBR: A PARADIGM FOR PROFICIENT INFORMATION RETRIEVAL USING QSPA AND RELEVANCE
SCORE BASED RANKING |
Author: |
SRIDHARAN. K, M. CHITRA |
Abstract: |
Due to the rapid growth of content volume over the internet, common search
engines retrieve the content that is relevant to the user's query only with
difficulty. To overcome this limitation, semantic web search approaches are
utilized. Much research on the semantic web centers on meaning-based data
search. The general purpose of this research is to enhance the current data
search and retrieval techniques. An effective Relevancy-based
Semantic Search Engine (RSSE) prototype that allows the users to determine
relevant resources and services by semantics is proposed. The proposed approach
uses Query Similarity Prediction Algorithm (QSPA) for efficient information
retrieval with minimum processing time. The technique serves multiple remote
users. The relevancy based ranking of documents depending on the occurrence of
semantic terms is performed in QSPA. The experimental results show that the
approach is efficient when analyzed with parameters like precision, recall,
F-measure, and the time needed to obtain query results. |
Keywords: |
Information Retrieval (IR), Service Level Agreement (SLA), Semantic web, Query
Similarity Prediction Algorithm (QSPA), Ranking, Cache server. |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
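As a rough illustration of relevance-score-based ranking on semantic term occurrences, the sketch below scores documents by how often query terms and hand-listed related terms appear, then sorts them. The scoring formula, term lists and documents are illustrative assumptions and are far simpler than the QSPA algorithm itself.

    # Toy relevance-score ranking sketch: score each document by occurrences of
    # the query terms plus semantically related terms, then sort by score.
    # Term lists, weights and documents below are illustrative assumptions only.
    def rank_documents(query_terms, related_terms, documents, related_weight=0.5):
        scores = []
        for doc_id, text in documents.items():
            words = text.lower().split()
            score = sum(words.count(t) for t in query_terms)
            score += related_weight * sum(words.count(t) for t in related_terms)
            scores.append((doc_id, score))
        return sorted(scores, key=lambda pair: pair[1], reverse=True)

    docs = {
        "d1": "cloud storage service level agreement for web services",
        "d2": "semantic web search improves retrieval of relevant resources",
        "d3": "cooking recipes and kitchen tips",
    }
    print(rank_documents(["semantic", "search"], ["retrieval", "web"], docs))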
|
Title: |
ENHANCED ROAD SECURITY USING EDGE DETECTION AND INFRARED IMAGERY |
Author: |
MITHILA HARISH, ABHISHEK GUDIPALLI, RAMASHRI TIRUMALA |
Abstract: |
Edges are prominent features in images, and their detection plays a major role
in computer vision and image processing. Identifying and detecting edges is a
low-level task in a variety of applications such as shape recognition, image
compression, enhancement and restoration. The technique laid out aims to develop
enhanced techniques for road safety. Road safety is a major issue, and many
lives are lost due to inadequate safety measures, including poor lighting
conditions, road conditions, etc. The methods laid out aim to provide enhanced
safety measures with the potential to avoid accidents. These are performed using
various criteria involving concepts of edge
detection, infrared imaging and real time imaging. In addition to this, it also
analyses other safety techniques and algorithms that have been previously
developed in this line of research. One of the methods laid out caters to the
problems colour blind people have, specifically in distinguishing between the
green light in traffic signals and street lights. |
Keywords: |
Infrared, Edge Detection, Log, Power Law Transform, Real Time Imaging |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
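One of the keywords above, the power-law transform, is commonly used to brighten images captured under poor lighting before edge detection. The sketch below applies a power-law (gamma) transform with NumPy; the gamma value and the synthetic image are illustrative assumptions, not the paper's pipeline.

    import numpy as np

    # Power-law (gamma) transform sketch: s = c * r**gamma on normalized intensities.
    # gamma < 1 brightens dark regions, which helps edge detection in low light.
    # The gamma value and the synthetic image below are illustrative only.
    def power_law_transform(image, gamma=0.5, c=1.0):
        normalized = image.astype(np.float64) / 255.0
        transformed = c * np.power(normalized, gamma)
        return np.clip(transformed * 255.0, 0, 255).astype(np.uint8)

    dark_image = np.array([[10, 40], [90, 160]], dtype=np.uint8)
    print(power_law_transform(dark_image, gamma=0.5))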
|
Title: |
AN APPROACH FOR ASPECT-ORIENTED SKELETON CODE GENERATION FROM REUSABLE ASPECT
MODELS |
Author: |
ABID MEHMOOD, DAYANG N.A. JAWAWI |
Abstract: |
Model-driven code generation has been a topic of interest for researchers owing
to its several benefits including the anticipated reduction in development
effort and delivery time. It has taken a good deal of time to produce techniques
that generate executable code in object-oriented programming languages.
Aspect-oriented software development techniques, though expected to enhance
software development in many ways, still lack approaches that can deliver
model-driven code into one of the aspect-oriented programming languages such as
AspectJ. In this paper, we present an approach for generation of aspect-oriented
code from Reusable Aspect Models. As a first step towards code generation, we
have developed a formal and semantically equivalent text-based representation of
the aspect models using XML schema notation. Further, we have proposed an
approach that takes the XML representation of the aspect models to generate
aspect-oriented skeleton code. Currently, our approach can be used to obtain
complete aspect structure, interfaces, classes, constructors, fields and stubs
of methods specified in the structural part of an aspect. |
Keywords: |
Aspect-Oriented Modeling; Model-Driven Engineering; Aspect-Oriented Code
Generation; Reusable Aspect Models |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
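To make the idea of going from an XML representation of an aspect model to skeleton code concrete, the sketch below parses a tiny, made-up XML aspect description and emits an AspectJ-like skeleton string. The XML layout, tag names and output format are invented for illustration and do not reproduce the authors' RAM-based schema or generator.

    import xml.etree.ElementTree as ET

    # Toy generator: read a made-up XML aspect description and print an AspectJ-like
    # skeleton. The XML layout and tag names are invented for illustration; they are
    # not the Reusable Aspect Models schema used in the paper.
    ASPECT_XML = """
    <aspect name="Logging">
      <field name="count" type="int"/>
      <method name="logCall" returns="void"/>
    </aspect>
    """

    def generate_skeleton(xml_text):
        aspect = ET.fromstring(xml_text)
        lines = [f"public aspect {aspect.get('name')} {{"]
        for field in aspect.findall("field"):
            lines.append(f"    private {field.get('type')} {field.get('name')};")
        for method in aspect.findall("method"):
            lines.append(f"    {method.get('returns')} {method.get('name')}() {{ /* stub */ }}")
        lines.append("}")
        return "\n".join(lines)

    print(generate_skeleton(ASPECT_XML.strip()))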
|
Title: |
OPINION MINING USING DECISION TREE BASED FEATURE SELECTION THROUGH MANHATTAN
HIERARCHICAL CLUSTER MEASURE |
Author: |
JEEVANANDAM JOTHEESWARAN, DR. Y. S. KUMARASWAMY |
Abstract: |
Opinion mining plays a major role in text mining applications in consumer
attitude detection, brand and product positioning, customer relationship
management, and market research. These applications led to a new generation of
companies and products meant for online market perception, reputation management
and online content monitoring. Subjectivity and sentiment analysis focus on the
automatic identification of private states such as beliefs, opinions,
sentiments, evaluations, emotions and speculations in natural language. Subjectivity
classification labels data as either subjective or objective, whereas sentiment
classification adds additional granularity through further classification of
subjective data as positive/negative or neutral. Features are extracted from the
data for classifying the sentiment. Feature selection has gained importance due
to its contribution to saving classification cost with regard to time and
computation load. In this paper, the main focus is on feature selection for
Opinion mining using decision tree based feature selection. The proposed method
is evaluated using IMDb data set, and is compared with Principal Component
Analysis (PCA). The experimental results show that the proposed feature
selection method is promising. |
Keywords: |
Opinion Mining, IMDb, Inverse Document Frequency (IDF), Principal Component
Analysis (PCA), Learning Vector Quantization (LVQ). |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
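As background on decision-tree-based feature selection, the sketch below keeps the features with the highest impurity-based importances from a fitted decision tree. It uses scikit-learn on synthetic data and is only a generic illustration, not the paper's Manhattan-hierarchical-cluster-measure method or its IMDb experiment.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Generic decision-tree feature selection sketch on synthetic data (not the
    # paper's method or data): fit a tree, then keep the top-k important features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = (X[:, 2] + 0.5 * X[:, 7] > 0).astype(int)   # only features 2 and 7 matter

    tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)
    top_k = 2
    selected = np.argsort(tree.feature_importances_)[::-1][:top_k]
    print("selected feature indices:", selected)

    X_reduced = X[:, selected]                      # reduced matrix for a classifier
    print("reduced shape:", X_reduced.shape)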
|
Title: |
A NOVEL DYNAMIC RELIABILITY OPTIMIZED RESOURCE SCHEDULING ALGORITHM WITH FAULT
TOLERANT APPROACH FOR GRID COMPUTING SYSTEM |
Author: |
U. SYED ABUDHAGIR, Dr.S.SHANMUGAVEL |
Abstract: |
In this paper, we design a global optimization model and a fault tolerance
service for a grid computing system. It provides a promising model for a grid
resource scheduling algorithm. It aims at solving the problem of optimally
allocating services on the grid so as to optimize the grid service reliability,
deadline and cost. In this paper, the problem of optimizing the reliability of
grid systems has been modeled as a multi-objective optimization problem where,
apart from the grid system reliability, the system cost and redundancy are also
considered as objectives. For the reliability analysis of the grid system, we
consider the failure rates of computational resources and network resources.
Based on the service reliability of the grid system, our IGA-RORS algorithm
selects the set of optimal resources among the candidate resources based on
reliability, application execution time and cost, and achieves optimal
performance using an immune genetic algorithm. In our algorithm, we design and
implement a fault tolerance service; it guarantees the completion of the
applications on the optimally selected resources. We have demonstrated the test
results in the Java-based GridSim tool. |
Keywords: |
Grid Computing System, Resource Management System, Reliability, Redundancy, IGA,
Fault Tolerant service. |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
|
Title: |
LTE SYSTEM TEST-BED AND EXPERIMENTAL RESULTS |
Author: |
MOHAMMED JALOUN, KAOUTAR BOUKHRISSI, ZOUHAIR GUENNOUN |
Abstract: |
This paper presents the results of an LTE trial test case performed for the
Morocco Telecom operator, which intends to roll out LTE starting from 2014. The
aim is to verify the functionality and performance of LTE equipment in realistic
deployment scenarios, with a combination of stationary tests and drive tests,
focusing on single-user throughput, sector throughput, latency, and MIMO
performance verification. |
Keywords: |
LTE, Test Bed, Experimental Result, Throughput, Latency, MIMO |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
|
Title: |
ANALYSIS OF LARGE-SCALE CROWD EVACUATION MODELLING |
Author: |
MARTIN LOPUŠNIAK |
Abstract: |
Evacuation models are used for building design, but also for analyses of
uncommon events that have happened. The results of these analyses yield a large
amount of data, which provides a view into the evacuation process. Despite the
knowledge that evacuation is a complex problem that depends on many factors, the
question remains whether there are general evaluation parameters for seemingly
different evacuations that could be applicable to all cases. The paper describes
the analysis of 58 different evacuation scenarios performed with the
buildingEXODUS evacuation model. The focus is on finding relationships between
the chosen evaluation parameters. The results showed that there are common
evaluation parameters by which evacuations in any building can be described. The
results also showed that there are differences between evacuations, so it is
necessary to distinguish between vertical and horizontal evacuation. |
Keywords: |
Evacuation, Modeling, Building, Analysis, Scenario |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
|
Title: |
AN EFFICIENT METHOD FOR RETRIEVING CLOSED FREQUENT ITEM SETS USING HATCI
ALGORITHM |
Author: |
MALA. A, Dr F.RAMESH DHANASEELAN |
Abstract: |
A data stream is a real-time ordered sequence of transactions, which grows
continuously at a non-constant, rapid rate. The size of a data stream is
unbounded. The item sets that appear in a data set with a frequency equal to or
above a specified threshold are called frequent item sets. It is difficult to
mine all frequent item sets, but easy to mine the closed frequent item sets. The
number of Frequent Item sets (FI) grows exponentially when the support threshold
is reduced. In such cases, identifying only the Closed Frequent Item sets (CFI)
is a better choice, as the number of CFIs is much smaller than that of FIs, and
at the same time the complete information about the FIs can be extracted from
the CFIs. In this paper, a new algorithm, named the HATCI algorithm, is used for
generating the table of Closed Item sets (CI). Along with this table, a table of
supersets and subsets of the CIs and a separate transaction table to store the
CIs generated by each transaction are also maintained. For generating and
maintaining these tables, a sliding window model is used that performs only
one scan over the data stream. The size of the transaction table is equal to the
size of the sliding window. Whenever the user requests the CFIs, the CI table is
accessed sequentially and all the CIs are checked. The CIs with a support count
greater than or equal to the support threshold are extracted and returned as
CFIs. The proposed methodology is implemented on the Java platform. The
experimental results show that the proposed methodology can effectively retrieve
the CFIs on the user's request. |
Keywords: |
Data Streams, Closed Frequent Item sets, Sliding window, HATCI Algorithm,
AddNewTransaction Algorithm, RemoveOldestTransaction Algorithm. |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
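As a simplified illustration of the final step described above, extracting closed frequent item sets from a table of closed item sets with support counts, the sketch below filters a small in-memory CI table by a support threshold. The table contents and threshold are illustrative; the HATCI table construction itself is not reproduced.

    # Simplified sketch of the CFI extraction step: scan a table of closed item
    # sets (itemset -> support count within the current sliding window) and keep
    # those meeting the support threshold. The table below is illustrative data.
    def extract_cfis(closed_itemsets, min_support):
        return {itemset: count
                for itemset, count in closed_itemsets.items()
                if count >= min_support}

    ci_table = {
        frozenset({"a"}): 5,
        frozenset({"a", "b"}): 3,
        frozenset({"b", "c"}): 2,
    }
    for itemset, support in extract_cfis(ci_table, min_support=3).items():
        print(sorted(itemset), support)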
|
Title: |
STATISTICAL LINE BASED PALMPRINT RECOGNITION USING ELLIPTICAL GABOR FILTERS |
Author: |
S.ASHA, C.CHELLAPPAN |
Abstract: |
Most of the previous research using palmprint as a biometric trait for personal
authentication has concentrated on enhancing accuracy. In this paper, to speed
up the recognition process, we propose a novel method using the statistical line
based approach to improve the efficiency of the Palm Code. Normally, palm codes
from different palm images are similar. The structural similarities between
palmprints will reduce the performance of the palmprint identification system.
Hence, to avoid the correlation between the palm codes, two elliptical Gabor
filters with different orientations are used to extract the phase information,
and two elliptical Gabor filters are used for the Fusion Code and the
Orientation Code. After the Fusion Code and the Orientation Code have been
obtained, they are fused to obtain a single feature vector, Palmprint Phase
Orientation Code. The similarity between two palm images is measured, using the
normalized Hamming distance. Using the Hong Kong PolyU palmprint database, our
experimental results show that the proposed method gives a promising result. |
Keywords: |
Authentication, Palmprint Recognition, Gabor Filter, Palmcodes. |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
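The matching step mentioned above, the normalized Hamming distance between two binary palm codes, is shown as a small sketch below; the bit patterns are illustrative, and the Gabor-filter feature extraction is not reproduced.

    import numpy as np

    # Normalized Hamming distance sketch between two binary palm codes
    # (the codes below are illustrative; Gabor-based feature extraction is omitted).
    def normalized_hamming(code_a, code_b):
        code_a, code_b = np.asarray(code_a, bool), np.asarray(code_b, bool)
        return np.count_nonzero(code_a ^ code_b) / code_a.size

    enrolled = np.array([1, 0, 1, 1, 0, 0, 1, 0])
    probe    = np.array([1, 0, 1, 0, 0, 0, 1, 0])
    print(normalized_hamming(enrolled, probe))   # 0.125 -> small distance, likely a match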
|
Title: |
DIGITAL IMAGE PROCESSING USING SOBEL EDGE DETECTION ALGORITHM IN FPGA |
Author: |
DHANABAL R,BHARATHI V , S.KARTIKA |
Abstract: |
Image processing is important in modern data storage and data transmission,
especially in progressive transmission of images, video coding
(teleconferencing), digital libraries, image databases, and remote sensing. It
has to do with the manipulation of images by algorithms to produce desired
images. Digital Signal Processing (DSP) improves the quality of images taken
under extremely unfavourable conditions in several ways: brightness and contrast
adjustment, edge detection, noise reduction, focus adjustment, motion blur
reduction, etc. The advantage is that image processing allows a much wider range
of algorithms to be applied to the input data in order to avoid problems such as
the build-up of noise and signal distortion during processing. Digital image
processing has applications reaching into our everyday life, such as medicine,
surveillance, automated industry inspection and many more. Implementing such
applications on application-specific hardware offers much greater speed than on
a general-purpose computer, where the implementation is easier. In this project,
a co-processor for image processing is implemented. The co-processor is modeled
for edge detection of images. The edge detection algorithm is implemented on an
FPGA, where the inherent parallelism offers better performance. In this
architecture, an ARM processor acts as the master and holds the images that have
to be processed. The ARM transfers the image to the FPGA for processing, and
after the image has been processed, the FPGA displays the processed image on a
VGA display. The image sent by the ARM is stored in an instantiated memory in
the FPGA. The edge detection core implemented in the FPGA then reads the image
from memory, processes it and stores the processed image back in the memory. The
VGA controller designed reads the processed image from the memory and displays
it. The Sobel edge detection algorithm is used for edge detection because it is
efficient in producing smooth edges and is also less sensitive to noise. |
Keywords: |
Image processing, Digital Signal Processing (DSP), Co-processor, Field
Programmable Gate Array (FPGA), ARM processor, VGA controller, Sobel Edge
detection algorithm |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
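For reference, the Sobel operator used by the co-processor can be expressed in a few lines of software. The NumPy sketch below computes the gradient magnitude of a small grayscale array; it is a software reference model under assumed kernel orientation, not the FPGA implementation.

    import numpy as np

    # Software reference sketch of the Sobel operator (not the FPGA design):
    # convolve with the horizontal and vertical kernels and combine the gradients.
    KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    KY = KX.T

    def sobel_magnitude(image):
        img = image.astype(float)
        h, w = img.shape
        out = np.zeros((h - 2, w - 2))
        for i in range(h - 2):
            for j in range(w - 2):
                window = img[i:i + 3, j:j + 3]
                gx = np.sum(window * KX)
                gy = np.sum(window * KY)
                out[i, j] = np.hypot(gx, gy)      # gradient magnitude
        return out

    test = np.zeros((6, 6))
    test[:, 3:] = 255                             # vertical step edge
    print(sobel_magnitude(test))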
|
Title: |
ENERGY-EFFICIENT CLUSTERING BASED ON HYBRID EVOLUTIONARY ALGORITHM IN WIRELESS
SENSOR NETWORK |
Author: |
KHALIL BENNANI, DRISS ELGHANAMI, ABDELILAH MAACH |
Abstract: |
Energy-aware algorithms are important factors for extending the lifetime of the
wireless sensor network. In energy concerned fields, network clustering has
proved to be an efficient technique that renders structures of low consumption.
Yet, clustering protocols face a major issue that is of grouping sensor nodes in
an optimal way. This is an NP-hard problem which necessitates evolutionary
algorithms to solve it. In this paper, we explore a new hybrid
optimization algorithm to decrease the energy consumption, in which modified
particle swarm optimization and simulated annealing are combined to find the
optimal clusters based on transmission distance. Simulated annealing is used as
a local search around the best solutions of the modified particle swarm
optimization. The simulation results show that our proposed protocol can improve
the lifetime of systems compared with existing clustering protocols. |
Keywords: |
Wireless Sensor Network, Clustering, Evolutionary Optimization Algorithm,
particle swarm optimization, Energy Efficiency. |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
|
Title: |
A CONVICTIVE FRAMEWORK FOR QUALITY BASE CONSTRUCTION AND EVALUATION OF
E-LEARNING WEBSITE |
Author: |
R. SILAMBANNAN, DR. M. V. SRINATH |
Abstract: |
Websites play a strategic role in different domains such as education,
government, business, etc. They support the communication and interaction
between organizations and users. However, the usability of a website depends
highly on the quality and design of the site. In order to construct a website
that fulfills the quality requirements of users, the authors of this article
have developed an efficient framework for constructing an e-learning website.
The framework learns the quality parameters that are required by the end users
from various quality models and through a survey of different websites. Based on
the knowledge acquired through learning, an e-learning website is constructed
that satisfies the users' quality desires. In addition to building a website,
the framework also collects feedback from end users. The feedback is analyzed to
find the efficiency and quality of the website. To find the effectiveness of the
constructed website, the authors evaluated the website using the quality
parameters. The experimental results presented in this article show the quality
of the constructed e-learning website and its usability. |
Keywords: |
Framework, E-learning website, feedback, quality parameters, quality evaluation |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
|
Title: |
SENTIMENT ANALYSIS OF NATIONAL EXAM PUBLIC POLICY WITH NAIVE BAYES CLASSIFIER
METHOD (NBC) |
Author: |
FRITS GERIT JOHN RUPILELE, DANNY MANONGGA, WIRANTO HERRY UTOMO |
Abstract: |
The national exam (UN) is a government policy to evaluate the education level on
a national scale, measuring the competence of graduating students against those
of other schools at the same educational level. The policy of conducting the UN
is always a topic of discussion and a phenomenon covered in various media,
because it causes various problems that generate pros and cons in society.
Sentiment analysis, or opinion mining, is applied in this research to analyze
public sentiment and to group the polarities of opinions or texts in documents
about conducting the UN, i.e., whether they show positive or negative sentiment.
The analytical process and data processing for document classification use two
classification methods: the quintuple method and one of the machine learning
methods, the Naive Bayes Classifier (NBC). The data used to classify
documents is in the form of news text documents about conducting the UN, which
is taken from online news media (detik.com). The data gathered comes from 420
news documents about conducting the UN from 2012 and 2013. Based on the
analytical results and document classifications, it has been found that public
sentiment towards carrying out the UN in 2012 and 2013 shows negative sentiment.
The results of the data processing and document classification of conducting the
UN overall show a positive opinion of 32% and a negative opinion of 68%. The
results of the document classification based on the polarization of public
opinion reveal that, for carrying out the UN in 2012, there was a positive
opinion of 44% and a negative opinion of 56%. Meanwhile, for conducting the UN
in 2013, there is a
positive opinion of 20% and a negative opinion of 80%. These results reveal an
increase in negative public sentiment in conducting the UN in 2013. |
Keywords: |
Sentiment Analysis, Opinion Mining, National Exam, Quintuple, Naive Bayes
Classifier |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
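As background on the Naive Bayes Classifier step, the sketch below trains a tiny multinomial NB model on word counts and classifies a new sentence. The training sentences, smoothing constant and English vocabulary are illustrative assumptions; the paper works with Indonesian news documents and a quintuple representation.

    import math
    from collections import Counter

    # Tiny multinomial Naive Bayes sketch (illustrative data, Laplace smoothing);
    # the paper applies NBC to Indonesian news documents about the national exam.
    train = [("exam policy improves education quality", "pos"),
             ("students benefit from the national exam", "pos"),
             ("exam causes stress and unfair results", "neg"),
             ("policy leads to cheating problems", "neg")]

    class_docs = Counter(label for _, label in train)
    word_counts = {c: Counter() for c in class_docs}
    for text, label in train:
        word_counts[label].update(text.split())
    vocab = {w for text, _ in train for w in text.split()}

    def classify(text, alpha=1.0):
        scores = {}
        for c in class_docs:
            log_prob = math.log(class_docs[c] / len(train))            # prior
            total = sum(word_counts[c].values())
            for w in text.split():
                log_prob += math.log((word_counts[c][w] + alpha)
                                     / (total + alpha * len(vocab)))   # likelihood
            scores[c] = log_prob
        return max(scores, key=scores.get)

    print(classify("national exam policy causes stress"))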
|
Title: |
USER DATA RATE BASED VERTICAL HANDOFF IN 4G WIRELESS NETWORKS |
Author: |
A.FERDINAND CHRISTOPHER, DR.M.K.JEYAKUMAR |
Abstract: |
Heterogeneous wireless networks have recently been integrated within the fourth
generation. A 4G wireless communication system should assure several QoS-related
facilities such as high data rates, seamless mobility, and strong RSS. When the
requirements of a user are acknowledged and fulfilled, the system succeeds in
handoff and seamless connectivity. Among such user requirements, the data rate
plays an imperative role in wireless networks. This paper proposes a novel
method for vertical handoff in 4G wireless heterogeneous networks based on the
data transfer rate. In this paper, distance-based RSS is used as another
parameter to initialize the handoff, and an efficient data recovery technique is
proposed. The log-based recovery technique saves logs in both the mobile node
and the base stations for fast and lossless data recovery. Moreover, the
feasibility of this method is compared with an existing method in terms of
bandwidth and recovery time. |
Keywords: |
Vertical Handoff, 4G Wireless Network, Data Rate Based, Data Recovery, Data Loss |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
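As a schematic illustration of a data-rate-driven vertical handoff decision with a distance-based RSS check, the sketch below compares candidate networks against the serving one. The path-loss model, thresholds and network parameters are illustrative assumptions, not the paper's method.

    import math

    # Schematic vertical handoff decision sketch (illustrative thresholds and a
    # simple log-distance path-loss model; not the paper's exact scheme).
    def rss_from_distance(tx_power_dbm, distance_m, path_loss_exp=3.0):
        return tx_power_dbm - 10 * path_loss_exp * math.log10(max(distance_m, 1.0))

    def choose_network(current, candidates, rss_min=-85.0):
        """Each network: dict with 'name', 'data_rate' (Mbps), 'tx_power' (dBm), 'distance' (m)."""
        best = current
        for net in candidates:
            rss = rss_from_distance(net["tx_power"], net["distance"])
            if rss >= rss_min and net["data_rate"] > best["data_rate"]:
                best = net
        return best["name"]

    serving = {"name": "WiMAX", "data_rate": 20, "tx_power": 30, "distance": 300}
    others = [{"name": "LTE", "data_rate": 60, "tx_power": 43, "distance": 500},
              {"name": "WLAN", "data_rate": 54, "tx_power": 20, "distance": 400}]
    print("handoff target:", choose_network(serving, others))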
|
Title: |
CORE BASED COMMUNICATION WITH QoS SUPPORT |
Author: |
ASHOK KUMAR BHOI, SATYA PRAKASH SAHOO, MANAS RANJAN KABAT |
Abstract: |
The present day demands more and more quality of service in broadband group
communication to support the huge uptake of internet services and multimedia
applications. The core-based solution is largely able to fulfill this demand. In
this paper, an effort has been made to make it more flexible in comparison to
SPAN/COST through a new approach which can be an alternative to SPAN/ADJUST for
solving the constraint of a non-singular core solution. |
Keywords: |
Multicasting, QoS Routing, Core selection |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
|
Title: |
A FRAMEWORK OF AN EXPERT SYSTEM FOR CROP PEST AND DISEASE MANAGEMENT |
Author: |
KAMARUDIN SHAFINAH, NORAIDAH SAHARI, RIZA SULAIMAN, MOHD SOYAPI MOHD YUSOFF,
MOHAMMAD MOHD IKRAM |
Abstract: |
Crop pest and disease diagnosis is among the important issues arising in the
agriculture sector, since it has significant impacts on a nation's agricultural
production. The application of expert system technology to crop pest and disease
diagnosis has the potential to quicken and improve advisory matters. However,
the development of expert systems for diagnosing pest and disease problems of a
certain crop, as well as other similar research works, remains limited.
Therefore, this study investigated the use of expert systems for managing crop
pests and diseases in selected published works. This article aims to identify
and explain the trends in the methodologies used by those works. As a result, a
conceptual framework for managing crop pests and diseases is proposed on the
basis of the selected previous works. It is hoped that this article will benefit
the growth of research works pertaining to the development of expert systems,
especially for managing crop pests and diseases in the agriculture domain. |
Keywords: |
Agriculture, Document Analysis, Expert Systems, Framework, Pest And Disease
Diagnosis |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
|
Title: |
FROM CITESEER TO CITESEERX: AUTHOR RANKINGS BASED ON COAUTHORSHIP NETWORKS |
Author: |
DALIBOR FIALA |
Abstract: |
CiteSeer was a digital library and a search engine gathering its mainly computer
science research papers from the World Wide Web. After a few years of
stagnation, it was definitely replaced with a new version called CiteSeerX in
April 2010. As both CiteSeers provide(d) freely available metadata on the
articles they index(ed), it is possible to analyze two different data sets to
see the differences between CiteSeer and CiteSeerX. More specifically, we
examined the article metadata from CiteSeer (downloaded in December 2005) and
from CiteSeerX (harvested in March 2011) with a view to creating rankings of
prestigious computer scientists. Since the free article metadata acquired from
the Web site of CiteSeerX differ from those in CiteSeer in that they do not
systematically include cited references, the only possibility of creating such
rankings is to base them on the coauthorship networks in both CiteSeers. In this
study, we produce these rankings using 12 different ranking methods including
PageRank and its variants, compare them with the lists of ACM A. M. Turing Award
and ACM SIGMOD E. F. Codd Innovations Award winners and conclude that the
rankings generated from CiteSeerX data outperform those from CiteSeer. |
Keywords: |
CiteSeer, CiteSeerX, Coauthorships, Citations, Researchers, PageRank |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
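To illustrate the kind of ranking computation used in the study, the sketch below runs plain PageRank by power iteration on a toy undirected coauthorship graph. The graph, damping factor and iteration count are illustrative, and the paper's eleven other ranking variants are not reproduced.

    import numpy as np

    # Plain PageRank power-iteration sketch on a toy coauthorship graph
    # (illustrative graph and parameters; not the paper's full set of 12 methods).
    authors = ["A", "B", "C", "D"]
    coauthorships = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]

    n = len(authors)
    idx = {a: i for i, a in enumerate(authors)}
    adj = np.zeros((n, n))
    for a, b in coauthorships:                 # undirected: link both ways
        adj[idx[a], idx[b]] = adj[idx[b], idx[a]] = 1.0

    transition = adj / adj.sum(axis=1, keepdims=True)   # row-stochastic matrix
    rank, d = np.full(n, 1.0 / n), 0.85
    for _ in range(100):
        rank = (1 - d) / n + d * transition.T @ rank

    for author, score in sorted(zip(authors, rank), key=lambda p: -p[1]):
        print(f"{author}: {score:.3f}")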
|
Title: |
DESIGN CONSIDERATIONS FOR CLUSTERING LOCALIZATION TECHNIQUE IN AN INDOOR
WIRELESS NETWORK |
Author: |
SITI ZAKIAH HASAN, ROSDIADEE NORDIN |
Abstract: |
Indoor positioning, also known as localization, is an application that can
benefit from the Wireless Local Area Network (WLAN) infrastructure and
WLAN-enabled devices. This paper discusses the implementation of an indoor
wireless localization network using the existing WLAN infrastructure, based on a
novel clustering technique. The design criteria and parameters involved in
forming a cluster group have been defined and presented. The preliminary results
from this study indicate that the proposed clustering technique is able to
perform indoor localization, whereby a total of 17 cluster groups have been
generated from the 82 anchor points in an indoor area. Several enhancements are
also proposed in order to improve the accuracy and the selection criteria of the
clustering method. |
Keywords: |
Cluster, Indoor Localization, Positioning Technique, RSS, WiFi, WLAN |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
|
Title: |
DEVELOPING E-ICT COURSES SPECIFICALLY FOR HEARING-IMPAIRED LEARNERS |
Author: |
NORAZAH NORDIN, ROZNIZA ZAHARUDIN, MOHD HANAFI MOHD YASIN, HADI SALEHI, MELOR MD
YUNUS, MOHAMED AMIN EMBI |
Abstract: |
The emergence of the Internet, and of computer science in general, has been
shown to play a great role in human life through the World Wide Web. E-learning
education depends on the web as an important advance in technology; therefore,
it is crucial for people to be able to access web applications. This applies to
learners with hearing problems as well. This study aims to investigate deaf
learners' level of interest and satisfaction towards a developed e-learning
platform named e-HearMe (http://www.e-hearme.net), which offers e-ICT courses
specially designed for deaf learners. To achieve the aim of the study, an
interview protocol was used as the research instrument to interview teachers who
were teaching in schools offering the Hearing-Impaired Education Program. In
total, 30 teachers were randomly selected from three schools that were running
the Hearing-Impaired Education Program across Malaysia. The results of the
interviews showed that the teachers were very satisfied with the existence of
the e-HearMe platform as a medium offering e-ICT courses specifically developed
for deaf learners. The interviewed teachers also stated that they could share
and exchange their knowledge with other teachers and students from all over
Malaysia, whether for educational or social purposes. |
Keywords: |
e-HearMe, ICT, E-Learning, e-ICT Courses, Deaf Learners, Hearing-Impaired
Education Program Malaysia |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
|
Title: |
MULTIPLE MOBILE ANCHORS BASED LOCALIZATION USING PARTICLE SWARM OPTIMIZATION (PSO)
FOR WIRELESS SENSOR NETWORKS |
Author: |
S.KAVITHA, Dr.J.KANAKARAJ |
Abstract: |
In Wireless Sensor Network (WSN), though several works have been done on
localization using mobile anchors, they cause huge delay in localization of the
network since the mobile anchor has to cover the entire network. It will be
difficult to provide maximum coverage to the entire network, without considering
the visiting schedule of the mobile anchor node. In this paper, a multiple
mobile anchor based localization technique using Particle Swarm Optimization
(PSO) is proposed. PSO is used to determine the trajectory of the mobile anchor
nodes based upon the node density and the distances between the nodes in the
network. The mobile anchor nodes broadcast packets to the visited sensor nodes
depending on the PSO visiting schedule. On receiving the packets, the
non-localized nodes estimate their distances to each of the mobile anchors and
are localized using the trilateration method. From the simulation results, it is
shown that the localization delay and energy consumption are reduced and the
packet delivery ratio is increased, when compared to the existing approach. |
Keywords: |
Wireless Sensor Network (WSN), Particle Swarm Optimization (PSO), Mobile
Anchors, Localization |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
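The final localization step mentioned above, trilateration from the estimated distances to the mobile anchor positions, can be written as a small linear least-squares problem. The anchor coordinates and distances below are illustrative assumptions; the PSO trajectory planning is not reproduced.

    import numpy as np

    # Least-squares trilateration sketch from anchor positions and estimated
    # distances (illustrative coordinates; PSO trajectory planning is omitted).
    def trilaterate(anchors, distances):
        anchors, d = np.asarray(anchors, float), np.asarray(distances, float)
        xn, yn = anchors[-1]
        dn = d[-1]
        # Linearize by subtracting the last anchor's circle equation from the others.
        A = 2 * (anchors[-1] - anchors[:-1])
        b = (d[:-1] ** 2 - dn ** 2
             - anchors[:-1, 0] ** 2 + xn ** 2
             - anchors[:-1, 1] ** 2 + yn ** 2)
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos

    anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
    true_node = np.array([4.0, 3.0])
    dists = [np.linalg.norm(true_node - np.array(a)) for a in anchors]
    print(trilaterate(anchors, dists))   # approximately [4. 3.]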
|
Title: |
AN ALGORITHM FOR FAIRNESS BETWEEN SECONDARY USERS IN COGNITIVE RADIO NETWORK |
Author: |
ABDELLAH IDRISSI, SAID LAKHAL |
Abstract: |
Scheduling Secondary Users (SUs) to exploit the free channels in a Cognitive
Radio Network represents one of the major challenges. In this work, we propose
an algorithm to ensure fairness of service between the secondary users. This
fairness is expressed in terms of transfer rates. The algorithm produces a chain
containing the order in which the packets of each SU will be sent. The
experimental results provide transfer rates very close to their average, which
demonstrates the efficiency of the proposed scheduler. |
Keywords: |
Cognitive Radio, Scheduling, Fairness, Standard Deviation |
Source: |
Journal of Theoretical and Applied Information Technology
December 2013 -- Vol. 58. No. 1 -- 2013 |
Full
Text |
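As a rough illustration of the fairness objective described above, the sketch below builds a transmission chain by always giving the next free-channel slot to the secondary user with the least data sent so far, then reports the standard deviation of the resulting per-user transfers. The packet counts, slot budget and greedy rule are illustrative assumptions, not the paper's algorithm.

    import statistics

    # Fairness-oriented scheduling sketch (illustrative, not the paper's algorithm):
    # a limited number of free-channel slots is handed out one at a time, always to
    # the secondary user (SU) with the least data sent so far, producing a chain.
    packets_pending = {"SU1": 6, "SU2": 3, "SU3": 5}   # packets each SU wants to send
    free_slots = 10                                    # free channel slots this round
    sent = {su: 0 for su in packets_pending}
    chain = []

    for _ in range(free_slots):
        candidates = [su for su, left in packets_pending.items() if left > 0]
        if not candidates:
            break
        su = min(candidates, key=lambda s: sent[s])    # least-served SU goes next
        chain.append(su)
        packets_pending[su] -= 1
        sent[su] += 1

    print("chain:", chain)
    print("per-SU packets sent:", sent)
    print("std dev of transfer:", round(statistics.pstdev(list(sent.values())), 3))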
|