|
Submit Paper / Call for Papers
The journal receives papers in a continuous flow and will consider articles
from a wide range of Information Technology disciplines, from the most basic
research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF, or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please remember to include all your personally identifiable information in the
manuscript before submitting it for review, and we will edit the necessary
information on our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
|
Journal of Theoretical and Applied Information Technology
February 2014 | Vol. 60 No. 3 |
Title: |
THE MIXTURE MODEL: COMBINING LEAST SQUARE METHOD AND DENSITY BASED CLASS
BOOST ALGORITHM IN PRODUCING MISSING DATA AND BETTER MODELS |
Author: |
LADAN MALAZIZI |
Abstract: |
The problem of missing values in data tables arises in almost all domains. With
the volume of information on communication channels growing every day, and with
the need to integrate this data for data analysis and data mining, the problem
becomes even more pronounced. In this paper we use a two-step process: first we
recover missing values using the Least Squares method (LS) [1], then we apply
our own Density Based Class Boost Algorithm (DCBA) [2] to improve learner
performance. In this process we model the data with a meta-learner three times:
once when the data tables have been cleaned (removing rows containing missing
values), once when the data has been recovered, and finally after application of
The Mixture Model. Our contributions address three issues: first, the effect of
data cleaning, in the sense of discarding rows with missing cells, on model
performance; second, the effect of the Least Squares method on data generation
in datasets with highly correlated features; and third, the effect of the
combined model on classifier performance. |
Keywords: |
Least Square, Density Based Class Boost, Missing Data Recovery |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
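The two-step pipeline above is only named in the abstract, not spelled out; the following is a minimal sketch of the first step under the assumption that each incomplete column is regressed on the fully observed columns with ordinary least squares and its missing cells are filled with the fitted values. The toy matrix and the use of NumPy's lstsq are illustrative, not taken from the paper.

```python
import numpy as np

def ls_impute(X):
    """Fill missing cells (NaN) in each column via ordinary least squares
    regression on the columns that are fully observed (illustrative only)."""
    X = X.astype(float).copy()
    complete = ~np.isnan(X).any(axis=0)            # fully observed columns
    Z = X[:, complete]                             # predictor block
    for j in np.where(~complete)[0]:
        obs = ~np.isnan(X[:, j])                   # rows where column j is known
        A = np.c_[np.ones(obs.sum()), Z[obs]]      # design matrix with intercept
        coef, *_ = np.linalg.lstsq(A, X[obs, j], rcond=None)
        miss = ~obs
        X[miss, j] = np.c_[np.ones(miss.sum()), Z[miss]] @ coef
    return X

# toy usage: the third column has missing entries correlated with the first two
data = np.array([[1.0, 2.0, 3.1],
                 [2.0, 1.0, np.nan],
                 [3.0, 4.0, 7.2],
                 [4.0, 3.0, np.nan]])
print(ls_impute(data))
```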
|
Title: |
DIRECT TORQUE CONTROL TECHNIQUE IN INDUCTION MOTOR DRIVES - A REVIEW |
Author: |
S ALLIRANI, V JAGANNATHAN |
Abstract: |
The aim of this paper is to review the origin and developments of direct torque
control (DTC), an advanced control technique for induction motor drives that
yields superior performance. Direct torque control is one of the excellent
control strategies available for torque control of induction machines and is
considered an alternative to the field oriented control (FOC) technique. DTC is
characterized by the absence of PI regulators, coordinate transformations,
current regulators and pulse width modulated signal generators, and it allows
good torque control in steady-state and transient operating conditions. In this
paper, the development of DTC is discussed, and DTC based on space vector
modulation (SVM) and on the switching table is reviewed. With the steadily
improving reliability and performance of digital technologies, digital control
techniques have predominated over analog techniques. Digital control techniques,
carried out with microcontrollers and digital signal processors owing to their
software flexibility and low cost, are also reviewed, as are intelligent control
techniques such as neural network (NN) and fuzzy logic based DTC. Field
Programmable Gate Arrays (FPGAs) are a useful platform for the implementation of
high bandwidth control systems, and the role of the FPGA in DTC based induction
motor drives is also presented. |
Keywords: |
Direct Torque Control, Field Programmable Gate Array, Fuzzy Logic, Induction
Motor, Space Vector Modulation, Neural Network |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
|
Title: |
IMPLEMENTING DATA WAREHOUSE AS A FOUNDATION FOR DECISION SUPPORT SYSTEM
(PERSPECTIVE : TECHNICAL AND NONTECHNICAL FACTORS) |
Author: |
Tanty Oktavia |
Abstract: |
A company needs a system for analyzing data in order to support decision makers
in taking the best decisions, and a data warehousing system is one solution that
can accommodate that need. Such systems have been accepted as a key enabler for
companies and organizations to improve their abilities in data analysis,
managerial decision support, and the automatic extraction of knowledge. As the
information included in the decision making process grows, the data under
consideration become more complex in both structure and semantics. The aim of
this research is to identify and classify the technical and non-technical
factors in data warehouse implementation, to help management decide what the
company should prepare if it wants to develop a data warehouse project. It also
explores how to integrate the data warehouse system into strategic business
processes. The results show that the successful implementation of a data
warehouse is influenced by how aware the company is of technical and
non-technical factors when implementing the project. |
Keywords: |
Data Analysis, Decision Support, Technical, Non-Technical |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
|
Title: |
LIVER SEGMENTATION FROM ABDOMEN CT IMAGES WITH BAYESIAN MODEL |
Author: |
NIDAA ALDEEK, RAJA S. ALOMARI, M B AL-ZOUBI, HAZEM HIARY |
Abstract: |
Liver segmentation from CT volumes has been a challenging problem due to the
high inter-organ intensity similarity, the intra-liver intensity variability,
and the partial volume effect. In this paper, we perform an extensive review of
the liver segmentation literature for CT and MRI. Furthermore, we propose a
Bayesian model for a robust and reproducible semi-automatic technique for liver
segmentation from CT volumes. We train our model and validate it using 44
clinical volumes from patients with various types of liver abnormality,
including tumors. Our segmentation results show a robust and clinically
acceptable liver volume for all 44 clinical cases, with an average area overlap
accuracy over 87%. Our method is superior to the state-of-the-art methods, which
have been validated on fewer subjects, as we show in the literature survey. |
Keywords: |
Liver Segmentation, Bayesian Model, Computed Tomography, Abdomen |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
|
Title: |
WIRELESS VIDEO TRANSMISSION OVER UWB CHANNEL USING FUZZY BASED RATE CONTROL
TECHNIQUE |
Author: |
S.GNANAVEL, S.RAMAKRISHNAN, N.MOHANKUMAR |
Abstract: |
Communication has always been on the rise, especially when it comes to the
transmission of video signals. The problems faced by video transmission include
the consumption of large bandwidth and degraded video quality at the receiving
side. These can be overcome by proper video coding and proper control over the
transmission rate based on channel conditions. In this paper we focus on these
areas and present a transmission control technique for wireless video
transmission over a UWB channel. In our technique, the major emphasis lies on
the H.264 encoder and the use of a fuzzy controller. The technique comprises
three modules, namely a transmission module, a control module and a receiver
module. The proposed technique is implemented in Matlab, and the evaluation
metrics used are BER and MSE. A comparative analysis is made by comparing our
proposed technique to the results of Samia Shimu et al. [1]. From the results,
the net average BER was about 0.034 for the baseline compared to 0.032 for the
proposed technique, and the net average MSE was about 117 for the baseline
compared to 48.8 for the proposed technique. The obtained values show the
effectiveness of the proposed technique through its lower MSE and BER values. |
Keywords: |
Transmission Control, Fuzzy Controller, PSNR, MSE, BER, UWB, H.264 Encoder And
Decoder, QPSK. |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
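The abstract attributes the rate adaptation to a fuzzy controller but does not give its rule base; the sketch below is a generic single-input fuzzy controller, assumed here to map an estimated channel error rate to a bitrate scaling factor with triangular membership functions and a weighted-average (Sugeno-style) defuzzification. All membership breakpoints, rules and output levels are illustrative, not from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_rate_scale(channel_ber):
    """Map an observed channel BER to a bitrate scaling factor in [0.4, 1.0]."""
    # fuzzify the input into three illustrative sets
    low  = tri(channel_ber, -0.01, 0.00, 0.02)
    med  = tri(channel_ber,  0.01, 0.03, 0.05)
    high = tri(channel_ber,  0.04, 0.08, 0.20)
    # each rule maps to a crisp output level (singleton consequents)
    rules = [(low, 1.0),    # clean channel -> keep full rate
             (med, 0.7),    # moderate errors -> reduce rate
             (high, 0.4)]   # poor channel -> aggressive reduction
    num = sum(w * y for w, y in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 1.0

for ber in (0.005, 0.03, 0.09):
    print(ber, round(fuzzy_rate_scale(ber), 3))
```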
|
Title: |
FINANCIAL AND PERFORMANCE TRANSPARENCY ON THE LOCAL GOVERNMENT WEBSITES IN
INDONESIA |
Author: |
DWI MARTANI , DEBBY FITRIASARI , ANNISA |
Abstract: |
E-government in Indonesia started with the issuance of Presidential Instruction
Number 3 Year 2003. Local governments are expected to provide relevant
information about their finances and performance to the public through easily
accessible media, in order to meet the principles of accountability and
transparency. This study has two objectives: 1) to analyze the development of
the implementation of Presidential Instruction Number 3 Year 2003, with a focus
on the transparency of financial and performance information, and 2) to analyze
the factors that affect the level of transparency of financial and performance
information on local government websites. To achieve these two objectives, we
calculate the level of financial and performance information transparency based
on a checklist, analyze the results descriptively, and then perform multiple
regressions to test the influence of economic factors and characteristics of
local governments on the level of financial and performance information
transparency on local government websites in Indonesia. Observation was
conducted from mid-May 2013 until the end of July 2013. Of the total 491
government websites, 429 were successfully accessed and analyzed further. Based
on the checklist, the transparency level of financial and performance
information reached only 15%. The most frequently disclosed financial and
performance information is in the form of news and Local Governments in Figures,
whereas the disclosure of the main financial and performance information, in the
form of regional budgets (APBD), financial statements, and performance reports,
is still below 10%. The regression test over 403 local governments confirms that
size, the level of dependency on the central government, and the welfare of the
local community have a positive influence on the level of financial and
performance information transparency of Indonesian local governments. |
Keywords: |
E-Government, Financial Information, Local Government Website, Performance
Report, Transparency. |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
|
Title: |
DECISION SUPPORT SYSTEM FOR PREDICTING THE DEGREE OF A CANCER PATIENT’S
EMPOWERMENT |
Author: |
Abdellah ABOUABDELLAH, Abdelghani CHERKAOUI |
Abstract: |
In this paper, we develop a decision support system (DSS) based on Knowledge
Discovery from Databases (KDD). This system allows the prediction of the degree
of empowerment of a patient with cancer treated by chemotherapy.
The first part of this article presents the process of patient empowerment. The
second part explains the principle of decision support systems and the principle
of Knowledge Discovery from Databases (KDD), inspired by data mining methods.
The third part describes the platform for predicting the degree of a cancer
patient's empowerment. |
Keywords: |
DSS, KDD, Data Mining, Prediction, Empowerment. |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
|
Title: |
S-ARMA MODEL FOR NETWORK TRAFFIC PREDICTION IN WIRELESS SENSOR NETWORKS |
Author: |
S.PERIYANAYAGI , Dr.V.SUMATHY |
Abstract: |
Future network traffic in a WSN can be predicted by time series models, and
knowledge of the traffic can be used for routing, load balancing and QoS
provisioning. An S-ARMA model is proposed to predict future traffic in the WSN.
An abnormality in the traffic is predicted, indicating the possibility of a DoS
attack, and frequency hopping is initiated to avoid it. An increase in the
frequency hopping time is identified by the S-ARMA model, which alerts the
network to avoid the anomalous channel. Simulation results show that this model
is effective in detecting the anomalous channel, since information about the
attackers on the channel can be obtained using swarm intelligence (ants). |
Keywords: |
S-ARMA, Network Traffic, Frequency Hopping, Swarm Intelligence |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
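The abstract does not define the S-ARMA variant, so the sketch below only illustrates the underlying idea of ARMA-based traffic prediction with a simple residual-threshold anomaly flag, using statsmodels; the order (2, 0, 1), the three-sigma threshold and the synthetic traffic series are assumptions for illustration.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# synthetic per-interval packet counts with an injected burst (toy data)
traffic = 50 + 5 * np.sin(np.arange(200) / 8) + rng.normal(0, 2, 200)
traffic[150:160] += 40                        # abnormal burst, e.g. flooding

train, test = traffic[:140], traffic[140:]
model = ARIMA(train, order=(2, 0, 1)).fit()   # ARMA(2,1) == ARIMA(2,0,1)
forecast = model.forecast(steps=len(test))

residuals = test - forecast
threshold = 3 * np.std(model.resid)           # flag deviations beyond 3 sigma
anomalous = np.where(np.abs(residuals) > threshold)[0] + 140
print("intervals flagged as anomalous:", anomalous)
```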
|
Title: |
A CUSTOMIZABLE SoPC ARCHITECTURE FOR GA BASED ON COMPLETE HARDWARE
EVOLUTION |
Author: |
A. SWARNALATHA, A.P. SHANTHI |
Abstract: |
A Genetic Algorithm (GA) is a computer-based search and optimization technique
that uses the Darwinian "Theory of Evolution" as a model for finding exact and
approximate solutions. GAs belong to a large family of heuristic algorithms
called Evolutionary Algorithms (EAs), which are being increasingly utilized for
solving complex optimization and search problems. The large computation time
consumed by a GA implemented in software makes it unsuitable for real-time
applications. This hurdle is overcome by shifting the implementation to
hardware, which drastically speeds up execution and thus opens the scope for
real-time applications. The major issues to be addressed in a hardware
implementation are silicon utilization, scalability, flexibility and reduced
computational delay. This work presents a customizable Complete Hardware
Evolution (CHE) based GA architecture. Along with the generic modules for the
genetic operators of the GA, modules named Flush and Replace Memory (FRM),
Memory Module (MM), and a sorting module form the main components of this
architecture. These modules can facilitate System on Programmable Chip (SoPC)
implementation for different applications of the GA. The coding is done using
the Verilog Hardware Description Language (HDL) and simulated using the Xilinx
ISE 9.1i simulator. Each module is separately simulated and synthesized for a
Commercial Off The Shelf (COTS) Field Programmable Gate Array (FPGA). The
resource utilization and the critical path delay of the modules are evaluated
and presented for a Xilinx Virtex-4 FPGA. |
Keywords: |
Genetic Algorithm, Complete Hardware Evolution, FPGA, SoPC, Computational Delay,
Resource Utilization. |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
|
Title: |
REVERSIBLE IMAGE WATERMARKING USING BIORTHOGONAL WAVELET TRANSFORM AND
IMPORTANCE MEASURE MODEL |
Author: |
T.SUJATHA, K.GEETHA |
Abstract: |
Image watermarking is a technique by which hidden data can be attached to an
image through embedding and extraction processes. In reversible watermarking,
after the extraction of the watermark, the cover image can be restored
completely. A reversible image watermarking algorithm is proposed in this paper.
Initially, the image is resized and a biorthogonal discrete wavelet transform is
applied to decompose it into bands. The band best suited for embedding is found
with the help of entropy, and the best locations in the selected band are found
using the importance measure model. Subsequently, the proposed embedding process
is carried out to embed a binary watermark image into the cover image. The
proposed extraction procedure is carried out after finding the best band and
locations. The proposed watermarking technique is evaluated using PSNR and NC,
and it obtained good results, with an average PSNR value of 31.3 and an NC value
of 1. The robustness of the proposed watermarking technique is evaluated with
the aid of different filtering techniques, and good results were achieved in all
cases. A comparison is also made with an existing technique, and the results
show that our technique outperforms it with better PSNR and NC values. |
Keywords: |
Reversible Watermarking, Biorthogonal DWT, Entropy, Robustness, Importance
Measure, PSNR, NC |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
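As a rough illustration of the band-selection step, the sketch below applies one level of a biorthogonal 2-D DWT with PyWavelets and picks the sub-band with the highest Shannon entropy of its coefficient histogram; the wavelet name 'bior1.3', the histogram binning and the "highest entropy is best" criterion are assumptions, since the abstract does not state them.

```python
import numpy as np
import pywt

def band_entropy(band, bins=64):
    """Shannon entropy of a sub-band's coefficient histogram."""
    hist, _ = np.histogram(band, bins=bins)
    p = hist.astype(float) / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def select_band(image):
    """One-level biorthogonal DWT, then choose the sub-band with max entropy."""
    cA, (cH, cV, cD) = pywt.dwt2(image, 'bior1.3')
    bands = {'LL': cA, 'LH': cH, 'HL': cV, 'HH': cD}
    name = max(bands, key=lambda k: band_entropy(bands[k]))
    return name, bands[name]

img = np.random.rand(128, 128)          # stand-in for a resized cover image
name, band = select_band(img)
print("selected sub-band:", name, "shape:", band.shape)
```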
|
Title: |
HANDWRITTEN DEVNAGARI DIGIT RECOGNITION: BENCHMARKING ON NEW DATASET |
Author: |
RAJIV KUMAR, KIRAN KUMAR RAVULAKOLLU |
Abstract: |
This paper presents handwritten Devnagari digit recognition results for
benchmark studies. To obtain these results, we conducted several experiments on
the CPAR-2012 dataset. In these experiments, we used features ranging from the
simplest features (direct pixel values), through slightly more computationally
expensive profile-based features, to more complex gradient features extracted
using Kirsch operators and wavelet transforms. Using these features, we measured
the recognition accuracies of several classification schemes. Among them, the
combined gradient and direct pixel features with a KNN classifier yielded the
highest recognition accuracy of 95.2%. The recognition result was improved to
97.87% by using a multi-stage classifier ensemble scheme. The paper also reports
on the development of the CPAR-2012 dataset, which is being built for Devnagari
optical document recognition research. Presently, it contains 35,000 handwritten
numerals (15,000 constrained, 5,000 semi-constrained and 15,000 unconstrained),
82,609 handwritten isolated characters, 2,000 unconstrained and 2,000
constrained pangram texts, and 4,000 digitized data collection forms. |
Keywords: |
CPAR-2012 dataset, Devnagari digit recognition, neural network classifier,
majority voting, shape similar digits |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
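The strongest single-feature result quoted above uses direct pixel values with a KNN classifier; the snippet below is a minimal stand-in for that setup using scikit-learn on flattened pixel vectors. The digit images are generated randomly here because CPAR-2012 is not bundled with this listing, and the value k = 5 is an assumption.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# placeholder 32x32 binary "digit" images; replace with CPAR-2012 samples
X = rng.integers(0, 2, size=(1000, 32 * 32)).astype(float)  # direct pixel features
y = rng.integers(0, 10, size=1000)                           # digit labels 0..9

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5)    # k chosen arbitrarily for the sketch
knn.fit(X_tr, y_tr)
print("accuracy on held-out split:", knn.score(X_te, y_te))
```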
|
Title: |
A PERFORMANCE STUDY OF HARDWARE IMPACT ON FULL VIRTUALIZATION FOR SERVER
CONSOLIDATION IN CLOUD ENVIRONMENT |
Author: |
S.SURESH, M.KANNAN |
Abstract: |
Underutilization of hardware resources has always been a problem in the
single-workload-driven traditional OS environment. To improve resource
utilization, virtualizing multiple VMs and workloads onto the same host with the
aid of a hypervisor has been the recent trend. Its use cases, such as server
consolidation, live migration, performance isolation and on-demand server
provisioning, make it a central part of the enterprise application cloud. The
cloud is an on-demand service provisioning technology, where performance plays a
vital role in user acceptance. Numerous virtualization technologies are
available, from full virtualization to paravirtualization, each with its own
strengths and weaknesses. As performance study is an ongoing pursuit, and
hardware and software are maturing day by day, it is desirable to repeat this
sort of performance study at regular intervals, which often sheds new light on
aspects of a work not fully explored in previous publications. Hence, this paper
focuses on the performance behaviour of full virtualization models, namely
hosted (VirtualBox) and bare-metal (KVM) virtualization, using a variety of
micro, macro and application-level benchmarks in the cloud environment. We
compare both virtualization solutions with a base system in terms of application
performance, resource consumption, low-level system metrics such as context
switch, process creation and interprocess communication latency, and
virtualization-specific metrics such as virtualization layer consumption.
Experimental results show that VirtualBox outperforms KVM in CPU and thread
level parallelism, while KVM outperforms VirtualBox in all other cases. Both
fall considerably short of the native system in disk usage. |
Keywords: |
Full Virtualization, VirtualBox, KVM, Server Consolidation, Performance |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
|
Title: |
AN IMPROVED OF SPAM E-MAIL CLASSIFICATION MECHANISM USING K-MEANS
CLUSTERING |
Author: |
NADIR OMER FADL ELSSIED, OTHMAN IBRAHIM, WAHEEB ABU-ULBEH |
Abstract: |
Spam e-mail is considered a serious violation of privacy; in addition, it has
become a costly and unwanted form of communication. Although the Support Vector
Machine (SVM) has been widely used in e-mail spam detection, dealing with huge
data remains time- and memory-consuming and yields low accuracy. This study
speeds up the computational time of SVM classifiers by reducing the number of
support vectors, through the K-means SVM (KSVM) algorithm proposed in this work.
Furthermore, this paper proposes a mechanism for e-mail spam detection based on
a hybrid of SVM and K-means clustering, which requires one additional input
parameter to be determined: the number of clusters. An experiment on the
proposed mechanism was carried out using the standard Spambase dataset to
evaluate its feasibility. The hybrid method improves the SVM classifier by
reducing the number of support vectors, increasing accuracy and decreasing the
time of e-mail spam detection. Experimental results on the Spambase dataset show
that the improved SVM (KSVM) significantly outperforms SVM and many other recent
spam detection methods in terms of classification accuracy (effectiveness) and
time consumption (efficiency). |
Keywords: |
K-means clustering, Machine Learning, Spam detection, SVM. |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
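The abstract names the hybrid but not its exact construction; one common way to reduce the number of support vectors is to cluster each class with K-means and train the SVM on the cluster centroids instead of on all samples. The sketch below follows that reading with scikit-learn; the synthetic data, cluster count per class and RBF kernel are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.datasets import make_classification

# stand-in for the Spambase feature matrix (57 features, binary label)
X, y = make_classification(n_samples=2000, n_features=57, random_state=0)

def kmeans_reduce(X, y, clusters_per_class=50):
    """Replace each class by its K-means centroids (one assumed KSVM variant)."""
    Xr, yr = [], []
    for label in np.unique(y):
        km = KMeans(n_clusters=clusters_per_class, n_init=10,
                    random_state=0).fit(X[y == label])
        Xr.append(km.cluster_centers_)
        yr.append(np.full(clusters_per_class, label))
    return np.vstack(Xr), np.concatenate(yr)

Xr, yr = kmeans_reduce(X, y)
ksvm = SVC(kernel='rbf').fit(Xr, yr)       # far fewer candidate support vectors
full = SVC(kernel='rbf').fit(X, y)
print("support vectors: reduced =", ksvm.n_support_.sum(),
      "full =", full.n_support_.sum())
```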
|
Title: |
A NOVEL LOW COMPLEXITY DIAGONAL REDUCTION ALGORITHM OF LATTICE REDUCTION
FOR SIGNAL DETECTION IN MIMO RECEIVER |
Author: |
Prof. G. RANJITHAM, Dr. K. R. SHANKAR KUMAR |
Abstract: |
This paper analyses the performance of a Multiple Input Multiple Output (MIMO)
wireless communication system combined with an efficient lattice reduction
algorithm, Diagonal Reduction (DR). DR is a powerful tool for achieving high
performance more efficiently, with less complexity, when applied to MIMO
detection. To further reduce complexity, this paper proposes a Greedy Diagonal
Reduction (GDR) algorithm, which offers reduced complexity with efficient
performance at the receiver; the MATLAB communication toolbox version 7.1 is
used as the simulation tool. From the simulation results of various reduction
algorithms, it is observed that DR can reduce the number of iterations by using
size reduction operations. The proposed DR algorithm gives Bit Error Rate (BER)
performance identical to that of LLL algorithms when applied to Successive
Interference Cancellation (SIC) decoding. Greedy DR reduces the computational
complexity in MIMO systems by improving efficiency in terms of size reduction
operations. |
Keywords: |
MIMO, Diagonal Reduction (DR), Low Complexity, Greedy Diagonal Reduction (GDR),
Bit Error Rate (BER) |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
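The abstract credits the iteration savings to size reduction operations but does not reproduce them; the sketch below shows the standard size reduction step used in LLL-type lattice reduction (subtracting rounded Gram-Schmidt coefficients), which is the kind of building block DR and GDR rely on according to the abstract. The toy basis matrix is illustrative.

```python
import numpy as np

def size_reduce(B):
    """Standard size reduction of a lattice basis B (columns are basis vectors):
    each column k is reduced against columns j < k using rounded
    Gram-Schmidt coefficients, so that |mu_kj| <= 1/2 afterwards."""
    B = B.astype(float).copy()
    n = B.shape[1]
    # unnormalized Gram-Schmidt vectors (invariant under size reduction)
    Q = np.zeros_like(B)
    for k in range(n):
        Q[:, k] = B[:, k]
        for j in range(k):
            Q[:, k] -= (B[:, k] @ Q[:, j]) / (Q[:, j] @ Q[:, j]) * Q[:, j]
    for k in range(1, n):
        for j in range(k - 1, -1, -1):
            mu = (B[:, k] @ Q[:, j]) / (Q[:, j] @ Q[:, j])
            B[:, k] -= round(mu) * B[:, j]
    return B

H = np.array([[4.0, 7.0], [3.0, 5.0]])      # toy 2x2 channel-like basis
print(size_reduce(H))
```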
|
Title: |
CHALLENGES IN MINIMISING ENERGY CONSUMPTION FOR WIRELESS SENSOR NETWORKS |
Author: |
SUDHEER PEMBUSANI, ABHISHEK GUDIPALLI, SARAVANAN MAHADEVAN |
Abstract: |
Wireless sensor networks are one of the most active research areas nowadays, due
to their vast applications in fields such as defence, civilian and medical
applications. One of the basic challenges in the design of a Wireless Sensor
Network (WSN) is maximizing its lifetime, because the sensors are placed in
remote locations and rely on batteries as power sources. To extend the WSN
lifetime, energy management is the most important and critical aspect, and
different techniques are needed for different aspects of the WSN. This paper
presents some methods that are useful for minimizing energy consumption in
sensor networks and increasing the lifetime of the WSN. Since sensor nodes use
batteries as their power sources, effective and efficient utilization of these
power sources is essential for operating sensor networks over a long period;
hence it is necessary to reduce the data transfer rate inside the sensor network
and the amount of data that has to be sent to the base station. For this, data
aggregation methods are useful for aggregating data in an energy-efficient
manner so that network lifetime is enhanced. In most applications, once a WSN is
deployed it should continue to work for a long period of time without
maintenance of the nodes or replacement of their energy sources. Each sensor
node in the network consumes power in different stages, such as sensing,
processing and transmitting/receiving data, and minimizing energy consumption is
required in all of these stages. Therefore, routing protocols should be designed
to minimize power consumption in every stage of the WSN for it to function
effectively. |
Keywords: |
Wireless Sensor Networks, Data Aggregation, Routing Protocols, ESPDA Protocol,
Energy Consumption |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
|
Title: |
AUTOMATIC MOTIF DETECTION FOR ISLAMIC GEOMETRICAL PATTERNS |
Author: |
ABDELBAR NASRI, RACHID BENSLIMANE |
Abstract: |
In this paper, we present a new method to detect the basic unit cell of a
periodic Islamic Geometrical Pattern (IGP). This method is based on the
autocorrelation function (ACF), a function known to be appropriate for analyzing
and extracting a repetitive motif from a regular texture. The motif can be
successfully extracted when the peaks detected in the autocorrelation function
of an image are pertinent. To optimize peak detection, we propose a new method
based on the stability of the motif surface, which is defined by two short
displacement vectors. Compared to classical periodicity extraction methods, the
proposed method is tolerant to geometric distortion, noise and changes in
intensity. Tests on 166 images of different visual quality demonstrate the
capability of the proposed method to extract the periodic motif automatically,
without the need for human intervention. |
Keywords: |
Autocorrelation, Wallpaper Groups, Islamic Art, Pattern Extraction, Displacement
Vector. |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
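For readers unfamiliar with the ACF step, the snippet below computes a 2-D autocorrelation via the FFT (Wiener-Khinchin) and lists the strongest off-centre peaks, from which two short displacement vectors could be chosen; the paper's actual peak-selection rule (motif-surface stability) is not reproduced here, and the synthetic periodic image and distance cut-offs are assumptions.

```python
import numpy as np

def autocorrelation(img):
    """2-D autocorrelation via FFT; zero-mean the image first."""
    f = np.fft.fft2(img - img.mean())
    acf = np.real(np.fft.ifft2(f * np.conj(f)))
    return np.fft.fftshift(acf) / acf.max()

# synthetic periodic texture with periods (16, 24) as a stand-in for an IGP image
y, x = np.mgrid[0:128, 0:128]
img = np.cos(2 * np.pi * y / 16) + np.cos(2 * np.pi * x / 24)

acf = autocorrelation(img)
centre = np.array(acf.shape) // 2
# crude peak listing: strongest ACF values, excluding the trivial centre peak
flat = np.argsort(acf, axis=None)[::-1]
peaks = [np.unravel_index(i, acf.shape) for i in flat[:200]]
offsets = [(r - centre[0], c - centre[1]) for r, c in peaks
           if 4 < np.hypot(r - centre[0], c - centre[1]) < 40]
print("candidate short displacement vectors:", offsets[:4])
```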
|
Title: |
A FULLY ASSOCIATIVE CACHE ALGORITHM WITH IMPROVED PERFORMANCE |
Author: |
S.SUBHA |
Abstract: |
Fully associative caches allow a memory address to map to any block. This paper
proposes an algorithm that enables a single fully associative block per access.
The address a is right shifted by a certain prefixed number of bits, and the
result is XOR'ed with n-1, n being the number of fully associative cache blocks.
Bit selection is then performed on the result to map the address to one of the n
blocks in the fully associative cache, and the chosen block is enabled to access
or place the cache line. The proposed algorithm is simulated with SPEC2K
benchmarks. The average memory access time is improved by 16%, with energy
savings of 9%. The proposed model shows 30% degradation in average memory access
time, with no change in energy consumption, over a direct mapped cache of the
same size. |
Keywords: |
Average Memory Access Time, Cache Algorithm, Cache Performance, Energy Saving,
Fully Associative Cache |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
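The block-selection rule is stated concretely enough in the abstract to sketch: shift the address right by a fixed amount, XOR with n-1, and keep the low bits that index one of the n blocks. The shift amount and block count below are illustrative, and treating "bit selection" as taking the low log2(n) bits is an assumption.

```python
def select_block(addr, shift_bits, n_blocks):
    """Map an address to one of n_blocks fully associative blocks:
    right shift, XOR with (n_blocks - 1), then keep the low index bits."""
    assert n_blocks & (n_blocks - 1) == 0, "n_blocks must be a power of two"
    hashed = (addr >> shift_bits) ^ (n_blocks - 1)
    return hashed & (n_blocks - 1)          # bit selection of the low bits

# illustrative parameters: 64-byte lines (shift 6), 8 enabled blocks
for addr in (0x1000, 0x1040, 0x2F80, 0xDEADBEEF):
    print(hex(addr), "->", select_block(addr, shift_bits=6, n_blocks=8))
```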
|
Title: |
ANALYSIS OF TRAFFIC FLOW IN CONGESTED CITIES USING CELLULAR AUTOMATA |
Author: |
S.RAJESWARAN AND S.RAJASEKARAN |
Abstract: |
The present study models heterogeneous traffic at a congested location in
Chennai using Cellular Automata (CA) and the traffic simulator VISSIM. Vehicle
type, volume, density, average velocity, the dimensions of the study area and
the signal cycle length are used as input parameters for the VISSIM simulator to
find the dynamics of vehicular traffic. The output shows that delay time
decreases and the maximum achievable velocity increases when the 2W (two
wheeler) population is reduced. In fact, if no 2W are allowed, the delay time is
reduced by 70.70% and the maximum achievable velocity is increased by 34.33%.
Using CA rules, simulation was carried out for traffic flows on roads of length
100 (small system) and 1000 (large system). The results show that small systems
behave differently from large ones: the traffic reaches a maximum of 420 at a
density of 0.1 for the small system and 323 at a density of 0.09 for the large
system. |
Keywords: |
Cellular Automata, Heterogeneous Traffic, Vehicular Traffic, Single-Lane Traffic
Flow, Simulation. |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
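The abstract does not list the CA update rules, so the sketch below uses the classic Nagel-Schreckenberg single-lane rules (acceleration, gap-limited braking, random slowdown, movement) as a representative stand-in; the road length, density, maximum speed and slowdown probability are assumptions.

```python
import numpy as np

def nasch_step(pos, vel, road_len, v_max=5, p_slow=0.3, rng=None):
    """One Nagel-Schreckenberg update on a circular single-lane road."""
    rng = rng or np.random.default_rng()
    order = np.argsort(pos)
    pos, vel = pos[order], vel[order]
    gaps = (np.roll(pos, -1) - pos - 1) % road_len      # empty cells ahead
    vel = np.minimum(vel + 1, v_max)                     # 1. accelerate
    vel = np.minimum(vel, gaps)                          # 2. avoid collisions
    vel = np.where(rng.random(len(vel)) < p_slow,
                   np.maximum(vel - 1, 0), vel)          # 3. random slowdown
    pos = (pos + vel) % road_len                         # 4. move
    return pos, vel

rng = np.random.default_rng(0)
road_len, density = 100, 0.1                             # "small system" scale
n = int(road_len * density)
pos = rng.choice(road_len, size=n, replace=False)
vel = np.zeros(n, dtype=int)
for _ in range(500):
    pos, vel = nasch_step(pos, vel, road_len, rng=rng)
print("mean velocity after relaxation:", vel.mean())
```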
|
Title: |
IDENTIFICATION OF INSTABILITY AND ITS ENHANCEMENT THROUGH THE OPTIMAL
PLACEMENT OF FACTS USING L-INDEX METHOD |
Author: |
K.VENKATA RAMANA REDDY, Dr.M.PADMA LALITHA |
Abstract: |
In this paper, an IEEE standard test system is considered and analysed using the
Newton-Raphson method with the help of MATLAB. The voltage magnitudes at each
bus are examined, and the corresponding weak bus is equipped with FACTS devices
such as the SVC and TCSC. The optimal placement of the FACTS devices is
identified using the L-index method: an L-index value approaching unity implies
that the system is approaching instability. Starting from this instability
point, system stability is improved under steady-state and fault conditions. The
disturbance is created in the system by changing the load reactive power at a
particular bus. |
Keywords: |
Steady State Condition, Fault condition, SVC, TCSC, MATLAB. |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
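The abstract relies on the L-index but does not restate it; in the standard Kessel-Glavitsch formulation, which is assumed here, the index for a load bus j is L_j = |1 - sum_i F_ji V_i / V_j| with F = -Y_LL^{-1} Y_LG built from the partitioned bus admittance matrix. The sketch below only shows that computation on placeholder matrices; the numbers are not from the paper.

```python
import numpy as np

def l_index(Y_LL, Y_LG, V_load, V_gen):
    """Kessel-Glavitsch L-index for each load bus (values near 1 ~ instability).
    Y_LL, Y_LG: partitions of the bus admittance matrix (load-load, load-gen);
    V_load, V_gen: complex bus voltages from a converged power flow."""
    F = -np.linalg.solve(Y_LL, Y_LG)               # F = -inv(Y_LL) @ Y_LG
    return np.abs(1.0 - (F @ V_gen) / V_load)

# placeholder 2-load / 2-generator partition purely to show the call shape
Y_LL = np.array([[4 - 12j, -2 + 6j], [-2 + 6j, 5 - 15j]])
Y_LG = np.array([[-2 + 6j, 0 + 0j], [-3 + 9j, 0 + 0j]])
V_gen = np.array([1.05 + 0j, 1.02 + 0j])
V_load = np.array([0.98 - 0.03j, 0.96 - 0.05j])
print("L-index per load bus:", l_index(Y_LL, Y_LG, V_load, V_gen))
```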
|
Title: |
BSFD: BACKGROUND SUBTRACTION FRAME DIFFERENCE ALGORITHM FOR MOVING OBJECT
DETECTION AND EXTRACTION |
Author: |
D STALIN ALEX, Dr. AMITABH WAHI |
Abstract: |
The advantages and drawbacks of two common algorithms often employed in moving
target detection, the background subtraction technique and the frame difference
method, are analyzed and compared in this paper. Then, based on the background
subtraction method, a BSFD target detection algorithm is proposed. The
background image used to process the next frame is generated by superimposing
the current frame image onto the current background image with a certain
probability. The algorithm ensures that objects which remain in the scene for a
long time are nevertheless not absorbed into the background. The experimental
results show that this algorithm can detect moving objects more effectively and
precisely. |
Keywords: |
Background Subtraction, Frame Difference, Moving Object Detection, Dynamic
Background |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
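A minimal sketch of the combination the abstract describes: a foreground mask from both frame differencing and background subtraction, and a background refreshed by blending the current frame into it with a fixed weight (standing in for the "certain probability" of superposition). The blending factor, thresholds and synthetic frames are assumptions.

```python
import numpy as np

def bsfd_step(frame, prev_frame, background, alpha=0.05, thresh=25):
    """One background-subtraction + frame-difference (BSFD-style) update.
    Returns the foreground mask and the refreshed background."""
    diff_bg = np.abs(frame.astype(int) - background.astype(int)) > thresh
    diff_fr = np.abs(frame.astype(int) - prev_frame.astype(int)) > thresh
    foreground = diff_bg & diff_fr              # moving AND differs from background
    # blend the current frame into the background only where no motion was found,
    # so long-standing objects are not absorbed into the background
    background = np.where(foreground, background,
                          (1 - alpha) * background + alpha * frame).astype(np.uint8)
    return foreground, background

rng = np.random.default_rng(0)
bg = rng.integers(0, 256, (120, 160), dtype=np.uint8)    # stand-in frames
f0 = bg.copy()
f1 = bg.copy()
f1[40:60, 50:80] = 255                                    # synthetic moving blob
mask, bg = bsfd_step(f1, f0, bg)
print("foreground pixels detected:", int(mask.sum()))
```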
|
Title: |
SIMPLIFIED SCHEME FOR PERFORMANCE AUGMENTATION OF WEB DATA EXTRACTION |
Author: |
G.NAVEENSUNDAR, D.NARMADHA, DR.A.P.HARAN |
Abstract: |
Web mining is the application of data mining techniques to automatically
discover and extract information from Web data. It uses data mining techniques
to make the web more profitable and to enhance the effectiveness of our
interaction with the web. Users always expect maximally accurate results from
search engines, but unfortunately most web pages contain more unnecessary
information than actual content. The unnecessary information present in web
pages is termed templates. Templates lead to poor performance of search engines
due to the retrieval of non-content for users; therefore, the performance of
search engines can be improved by making web pages free of templates. Our method
focuses on detecting and extracting templates from web pages that are
heterogeneous in nature by means of an algorithm. A locality sensitive hashing
algorithm finds the similarity between the input web documents and provides
better performance, in terms of execution time, than the Minimum Description
Length (MDL) principle and the hash clustering process. |
Keywords: |
Cluster, Non-Content Path, Template Detection, Locality Sensitive Hash, Minimum
Description Length |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
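As a pointer to what the locality sensitive hashing step involves, the sketch below estimates the similarity between two documents with MinHash signatures over word shingles; MinHash is only one LSH family for Jaccard similarity, and the shingle size, number of hash functions and sample texts are assumptions rather than details from the paper.

```python
import hashlib

def shingles(text, k=3):
    """Set of k-word shingles from a document."""
    words = text.lower().split()
    return {' '.join(words[i:i + k]) for i in range(len(words) - k + 1)}

def minhash_signature(shingle_set, num_hashes=64):
    """MinHash signature: per seed, the minimum hash over all shingles."""
    sig = []
    for seed in range(num_hashes):
        sig.append(min(int(hashlib.md5(f"{seed}|{s}".encode()).hexdigest(), 16)
                       for s in shingle_set))
    return sig

def estimated_jaccard(sig_a, sig_b):
    """Fraction of matching signature slots approximates Jaccard similarity."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

doc1 = "main article content about template detection in heterogeneous web pages"
doc2 = "navigation menu login footer main article content about template detection"
sig1 = minhash_signature(shingles(doc1))
sig2 = minhash_signature(shingles(doc2))
print("estimated similarity:", estimated_jaccard(sig1, sig2))
```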
|
Title: |
SKYLINE IN CLOUD COMPUTING |
Author: |
ABDELLAH IDRISSI, MANAR ABOUREZQ |
Abstract: |
Cloud computing technology is booming, and its utility no longer needs to be
demonstrated. In this paper, we investigate the problem of search and selection
systems that allow users to search through Cloud services and find the ones that
best meet their needs. In this context, we propose a new algorithm to address
this problem, based on the principle of the Skyline. One of the main
contributions of our work is the construction of a Web Agent using the Skyline
method to determine which Cloud services best meet users' requirements. In this
work, we present our algorithm and report experimental results showing that our
approach is very promising. |
Keywords: |
Cloud Computing, Cloud Services, Skyline, Block-Nested Loops Algorithm. |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
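The keywords point to the Block-Nested Loops (BNL) algorithm for computing the Skyline; the sketch below is a simplified in-memory BNL over Cloud-service tuples where lower values are better on every dimension. The service attributes (price, latency) are illustrative, and keeping the whole window in memory drops the disk-block handling of full BNL.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every dimension and better in one
    (lower values are assumed better here, e.g. price and latency)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline_bnl(points):
    """Simplified Block-Nested-Loops skyline: keep a window of candidates,
    drop window entries dominated by an incoming point, discard dominated input."""
    window = []
    for p in points:
        if any(dominates(w, p) for w in window):
            continue                              # p is dominated, skip it
        window = [w for w in window if not dominates(p, w)] + [p]
    return window

# illustrative Cloud services described by (price per hour, latency in ms)
services = [(0.10, 120), (0.08, 200), (0.15, 80), (0.12, 130), (0.25, 90)]
print("skyline services:", skyline_bnl(services))
```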
|
Title: |
TOWARDS POPULARITY AWARE HYBRID CACHING TO IMPROVE SEARCH IN SOCIAL
RELATIONSHIP BASED P2P NETWORKS |
Author: |
G. RAMACHANDRAN, DR. K. SELVAKUMAR |
Abstract: |
The widespread use of recent Peer-to-Peer (P2P) file sharing has been driven
mainly by the scalability of its architecture and its highly versatile search
mechanisms. However, most P2P networks construct a loosely coupled overlay on
top of the Internet based on physical network constraints, without taking user
preferences or relationships into account. This leads to high inefficiency in
their search algorithms, which rely mainly on simple flooding or random walk
strategies. In this paper, we present the architecture of an adaptable, fully
decentralized, social based P2P overlay, as well as an efficient cognitive
community P2P search technique to improve search efficiency. This paper proposes
a Dynamic Overlay Adaptation (DOA) algorithm, which creates social communities
by connecting groups of peers having similar interests within the shortest path,
based on the social relationship between them. The basic premise of creating
social communities is that generated queries are most probably satisfied within
a peer's own community; this significantly improves search efficiency, with a
higher success rate and less response time. In addition, this paper proposes
Popularity aware Hybrid Caching (PaHC), which caches files based on their size
to improve cache performance. To accomplish high data availability, popularity
aware strong data consistency is proposed, which eliminates duplicate content
replication among the peers. Experimental evaluation reveals the effectiveness
and efficiency of the proposed approach in terms of success rate, network
traffic, response time and cache hit ratio. |
Keywords: |
Unstructured P2P Networks, Peer Search, Social based P2P Overlay, Popularity
Aware Hybrid Caching |
Source: |
Journal of Theoretical and Applied Information Technology
28 February 2014 -- Vol. 60. No. 3 -- 2014 |
Full
Text |
|