Submit Paper / Call for Papers
The journal receives papers in a continuous flow and considers articles
from a wide range of Information Technology disciplines, from the most
basic research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please remember to include all your personally identifiable information in the
manuscript before submitting it for review; we will edit out the necessary
information on our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
Journal of Theoretical and Applied Information Technology
October 2014 | Vol. 68 No.1 |
Title: |
CLUSTER HEAD BASED GROUP KEY MANAGEMENT FOR MALICIOUS WIRELESS NETWORKS USING
TRUST METRICS |
Author: |
V.BHUVANESWARI, Dr. M.CHANDRASEKARAN |
Abstract: |
The process of transferring messages securely from one member to another
within a network is known as secure group communication. Key management is an
important primitive to ensure this, as it provides a secure method for the
creation, distribution and management of cryptographic keys. Group key
establishment and group key management are the two sides of key management.
Group members use a group key (GK) for encryption and decryption of messages in
group communication. Communication needs quality and security for better
performance and for acceptance by users and client companies. Diffie-Hellman
(DH) was the first published public key algorithm used as a secure key exchange
mechanism; its purpose is to enable users to securely exchange a key that can
be used for subsequent encryption. Earlier schemes used only one group
controller and were thus affected by a single point of failure (the 1-affects-n
problem). To prevent this, a new technique is introduced in which a control
group generates the group key based on the capability of nodes within two hops.
In this scheme, direct trust and indirect trust are computed to identify
Cluster Heads (CH), and the concept of an auxiliary cluster head is introduced
for effective key management. |
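The Diffie-Hellman exchange the abstract refers to can be sketched as follows;
the prime, generator and private values below are toy numbers chosen for
illustration, not parameters from the paper:

```python
# Toy Diffie-Hellman key exchange between two group members.
# Illustrative parameters only; real deployments use large safe primes.
p = 23          # public prime modulus
g = 5           # public generator
a, b = 6, 15    # the two members' private keys

A = pow(g, a, p)         # first member's public value
B = pow(g, b, p)         # second member's public value

shared_a = pow(B, a, p)  # first member derives the shared key
shared_b = pow(A, b, p)  # second member derives the same key

assert shared_a == shared_b  # both sides compute g^(ab) mod p
print(shared_a)
```

The cluster-head and trust machinery of the paper sits on top of such an
exchange; only the key agreement itself is shown here.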
Keywords: |
Mobile ad hoc networks (MANETs), Dynamic Source Routing (DSR), Malicious Nodes,
Clustering, Key Management |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
A CROSS LAYER SCHEDULING SCHEME FOR WIMAX |
Author: |
CHANDRASEKARAN V , NAGARAJAN N |
Abstract: |
In this paper, we propose a novel scheduling scheme that provides
quality-of-service (QoS) support in the downlink (DL) of a WiMAX system. The
proposed subchannel-mapping scheduling algorithm fully exploits the physical
layer properties of Orthogonal Frequency Division Multiple Access (OFDMA)
to improve the throughput of the WiMAX network while at the same time
improving the QoS of the various services guaranteed by the network. The
proposed scheduling scheme has three parts. The first stage of the scheduling
scheme is an optimized queuing model that carries out effective mapping
between the admitted service flows and the packet arrival statistics. The
performance of the algorithm is improved further by an enhancement scheme
based on a fuzzy logic design. The last stage is the slot allocation process,
which completes the scheduling. The cross-layer approach followed in the
scheduling scheme provides effective slot allocation under various channel
conditions. It also offers more options for the network to cater to
delay-sensitive multimedia traffic by reducing the delay in the network.
Simulation results demonstrate that the proposed scheduling scheme improves
the throughput and also reduces the delay of the WiMAX network. |
Keywords: |
Mobile Wimax, Orthogonal Frequency Division Multiple Access (OFDMA), Scheduling,
Quality-Of-Service. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
EFFECTIVE FEATURE EXTRACTION FOR DOCUMENT CLUSTERING TO ENHANCE SEARCH ENGINE
USING XML |
Author: |
P.AJITHA, DR. G. GUNASEKARAN |
Abstract: |
Clustering is the task of grouping a set of objects in such a way that objects
in the same group are more similar to each other than to those in other groups.
Here, clustering is done using the Lingo algorithm by extracting the data
contents of the documents. The data is stored in XML, which manages large
volumes of data. Lingo combines several existing methods to put special
emphasis on meaningful cluster descriptions, apart from identifying document
similarities. The steps involved in this process are designing the
term-document matrix and then extracting the frequent phrases using suffix
arrays. Readable and unambiguous descriptions of the thematic groups are an
important factor in the overall quality of clustering. The Lingo algorithm
consists of five phases: pre-processing, frequent-phrase extraction,
cluster-label induction, cluster-content discovery and final cluster
formation. |
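The term-document-matrix step can be illustrated with a small sketch; the
three document snippets are invented for the example, and the SVD
factorization stands in for the LSI machinery Lingo builds on:

```python
import numpy as np

# Toy term-document matrix for three invented snippets; the SVD stands in
# for the LSI step Lingo uses to find abstract "topics" for cluster labels.
docs = [
    "cluster label induction",
    "cluster content discovery",
    "frequent phrase extraction",
]
terms = sorted({w for d in docs for w in d.split()})
A = np.array([[d.split().count(t) for d in docs] for t in terms], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
# Leading left-singular vectors approximate the dominant topics; candidate
# phrase vectors are matched against them to induce readable cluster labels.
print(s.round(2))
```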
Keywords: |
Lingo, XML, SVD, LSI, Phrase matrix. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
VLSI DESIGN OF 12-BIT ADC WITH 1GSPS IN 180NM CMOS INTEGRATING WITH SAR AND
TWO-STEP FLASH ADC |
Author: |
K.LOKESH KRISHNA, T.RAMASHRI |
Abstract: |
In this paper, a novel hybrid ADC consisting of a two-step quantizer that
combines a flash ADC and a SAR ADC together with a resistor-string DAC is
designed and implemented. The hybrid ADC improves speed by employing the flash
ADC, while resolution and power reduction are achieved by utilizing the SAR
ADC. The hybrid architecture has a resolution of 12 bits, an input frequency
of 100 MHz and a sampling frequency of 1 GHz. CMOS-level schematic diagrams of
the sub-blocks have been designed and implemented using Cadence Virtuoso in
180 nm technology at an operating voltage of 1.8 V. The layout is captured
using Virtuoso and then optimized for area. Based on the obtained results, the
INL and DNL have been identified as +0.034 V to -0.001 V and +0.06 V to
-0.05 V, respectively. The performance results show that the architecture
achieves low power, high speed and small area. |
Keywords: |
Sample and Hold Circuit, Comparator, Transmission Gate, Hybrid ADC, Two-step
Quantizer, SAR and Flash ADC. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
MEDICAL DIAGNOSIS SYSTEM FOR THE DIABETES MELLITUS BY USING BACK PROPAGATION-APRIORI
ALGORITHMS |
Author: |
K. SRIDAR M.E., Dr. D. SHANTHI |
Abstract: |
Diabetes is a chronic disease and a major public health challenge worldwide.
According to the International Diabetes Federation, there are currently 246
million diabetic people worldwide, and this number is expected to rise to 430
million by 2030. Furthermore, 3.8 million deaths are attributable to diabetes
complications each year. It has been shown that 80% of type 2 diabetes
complications can be prevented or delayed by early identification of people at
risk. Early detection of diabetes would be of great value given that at least
50%, and in some countries 80%, of all people with diabetes are unaware of
their condition and will remain unaware until complications appear. Several
data mining and machine learning methods have been used for the diagnosis,
prognosis and management of diabetes. In this work, clinical data are
collected based on the attributes of the Pima Indians Diabetes Database.
Real-time inputs are given to the system from a glucometer (i.e., the
patient's glucose level before breakfast and two hours after breakfast), while
some of the input attributes are entered manually. All the inputs are given to
the Back Propagation algorithm (ANN) and the Apriori algorithm (ARM) for
diagnosing diabetes. The diagnosis system can be implemented in Java or .NET.
The output of the system shows the patient's degree of risk of diabetes (low,
medium or high). In the proposed work, the system is deployed online. The main
objective is that patients can learn about their diabetes risk without the
help of doctors: a patient can simply log in to the website, enter their
attributes (i.e., data collected from labs) as input and obtain their own
diagnosis without a doctor. |
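The Apriori side of the system can be sketched as plain frequent-itemset
mining; the records and attribute names below are hypothetical, not values
from the Pima Indians database:

```python
from itertools import combinations

# Minimal Apriori frequent-itemset mining over invented patient records.
# Attribute names ("high_glucose", "high_bmi", "risk") are illustrative.
records = [
    {"high_glucose", "high_bmi", "risk"},
    {"high_glucose", "risk"},
    {"high_glucose", "high_bmi"},
    {"high_bmi"},
]
min_support = 2  # absolute count

def frequent_itemsets(records, min_support):
    items = {i for r in records for i in r}
    freq, k = {}, 1
    current = [frozenset([i]) for i in sorted(items)]
    while current:
        # count support of each candidate k-itemset
        counted = {c: sum(c <= r for r in records) for c in current}
        level = {c: n for c, n in counted.items() if n >= min_support}
        freq.update(level)
        # build candidate (k+1)-itemsets from frequent k-itemsets
        keys = list(level)
        current = list({a | b for a, b in combinations(keys, 2)
                        if len(a | b) == k + 1})
        k += 1
    return freq

for itemset, count in frequent_itemsets(records, min_support).items():
    print(sorted(itemset), count)
```

Association rules (e.g., high_glucose implies risk) are then read off the
frequent itemsets; the ANN half of the system is not shown.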
Keywords: |
Artificial Neural Network (ANN), Back Propagation Algorithm, Association Rule
Mining (ARM), Apriori Algorithm, Disease Diagnosis. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
SOFTWARE RELEASE PLANNING- A MODEL INCORPORATING ENVIRONMENTAL PARAMETERS |
Author: |
SANDHIA VALSALA, DR. ANIL R NAIR |
Abstract: |
A software release planning can be seen from two dimensions “what to release”
and “when to release”. The most crucial decision is whether or not to select a
feature for implementation in the next software release. A number of software
release planning models are available which considers a wide variety of factors
in deciding the implementation of a feature in a release.This paper analyzes 31
release planning models and the selection factors used by these models. Most of
these models use only in-project parameters in deciding on the features to be
included in a release. A new release planning model incorporating a group of
“environmental factors” ,which plays a crucial role in deciding the priority of
features to be included in each release is then proposed .The paper emphasize
the need to include environmental parameters which are parameters not directly
linked to project, but influences the project from outside in planning a
release. |
Keywords: |
Software Release, Release Planning, Environmental Parameters, In-Project
Parameters, Feature Priority |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
AN EFFICIENT MULTI-PATH ROUTING ALGORITHM BASED ON HYBRID ARTIFICIAL BEE COLONY
ALGORITHM FOR WIRELESS MESH NETWORKS |
Author: |
K.KUMARAVEL and Dr.A.MARIMUTHU |
Abstract: |
The number of services obtainable by wireless network has been improved for
recent years. It motivates to the improvement of new wireless technologies. New
technologies are required to satisfy the requirements or necessitated of the
users regarding wireless services. The newly developed wireless technology with
mesh topology is known as wireless mesh networks (WMN). Routing is one of the
most significant problems handled by every WMN technologies during data
transmission .In Wireless Mesh Networks (WMN), the Multi-path routing is one of
the mainly significant problems occurred throughout data transmission process
based on their ability of the link for multiple paths. To overcome the problem
of the multipath routing earlier work presents a Dijkstra’s Algorithm (DA) for
route setup and ant colony based optimization (ACO) algorithm for route
examination. But the algorithm used for route examination have lower meeting
time in ACO and worst-case running time in DA for route setup .In order to solve
the above mentioned issues in this work presents an Filter-Kruskal algorithm for
first route setup from source to destination path . The multiple routes path are
examined and maintained based on the hybrid artificial bee colony (HABC)
algorithm. The routes explored for data transmission are chosen based on their
expected honey bee values. If any failure occurs in this HBAC route examination
step it is forwarded to route maintenance system. Proposed FKAWMNet efficiently
solves multipath routing problem in wireless mesh networks (WMN). The simulation
results of the proposed FKAWMNet show that proposed work achieves a higher
packet delivery ratio, lesser end to end delay and lesser routing overhead than
the existing routing protocols such as DAWMNet and AntHocNet. So the proposed
FKAWMNet achieves highly reliable communication, assurance of load balancing and
easily applicable to topological changes without node failure. |
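As a rough illustration of the route-setup idea, here is a plain Kruskal
minimum-spanning-tree sketch on a toy graph; it is a stand-in for the
Filter-Kruskal variant, not the FKAWMNet implementation:

```python
# Plain Kruskal MST on a toy 4-node graph (edges are (weight, u, v)).
# Filter-Kruskal adds a quicksort-style partitioning step on top of this.
def kruskal(n, edges):
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    tree = []
    for w, u, v in sorted(edges):          # cheapest links first
        ru, rv = find(u), find(v)
        if ru != rv:                       # keep edge only if it joins components
            parent[ru] = rv
            tree.append((u, v, w))
    return tree

edges = [(4, 0, 1), (1, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
mst = kruskal(4, edges)
print(mst, sum(w for _, _, w in mst))
```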
Keywords: |
Multi-path routing, Industrial wireless mesh networks, Filter-Kruskal algorithm
(FKA), Hybrid Artificial Bee Colony (HABC) |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
AUTOMATIC SEGMENTATION OF LUNG CT IMAGES BY CC BASED REGION GROWING |
Author: |
A.PRABIN , DR. J.VEERAPPAN |
Abstract: |
Computer Aided Diagnosis (CAD) of CT lung image has been a revolutionary step in
the early diagnosis of diseases present in the lung. Developing an efficient and
robust algorithm for Lung computer tomography (CT) image segmentation has been a
demanding area of growing research of interest during the last two decades. The
initial step in computer aided diagnosis of lung CT image is generally to
segment the Region of Interest (ROI) present in it and then to analyze each area
separately inorder to find the presence of pathologies present in it. This
research reports on segmentation of the ROI by segmenting the CT lung images
using supervised contextual clustering along with the combination of region
growing algorithm. Region growing has been combined with CC in this work since
it reduces the number of steps in segmentation for the process of identifying a
tissue in the CT lung image. The performance of this proposed segmentation is
proved to be better when it is compared with other existing conventional
segmentation algorithms like ‘Sobel’, ‘Prewitt’, ‘Robertz’, ‘Log’, ‘Zerocross’.
From the experimental results, it has been observed that the proposed
segmentation approach provides better segmentation accuracy. |
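The region-growing half of the method can be sketched on a toy intensity grid;
the seed, tolerance and grid values are illustrative, and the
contextual-clustering stage is omitted:

```python
from collections import deque

# Minimal region growing: flood-fill from a seed, accepting 4-neighbours
# whose intensity is within `tol` of the seed value. Toy grid, not CT data.
image = [
    [10, 11, 12, 90],
    [10, 12, 91, 92],
    [11, 90, 91, 93],
]

def region_grow(image, seed, tol=5):
    h, w = len(image), len(image[0])
    seed_val = image[seed[0]][seed[1]]
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in region
                    and abs(image[nr][nc] - seed_val) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

print(sorted(region_grow(image, (0, 0))))
```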
Keywords: |
Contextual Clustering, Segmentation Algorithm, Computer Tomography,
Pulmonary Lung Image. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
CCTF-RQ FRAMEWORK: COMMAND AND CONTROL TRAFFIC FLOW WITH RESOURCE QUEUING IN
CLOUD INFRASTRUCTURE |
Author: |
DURAIRAJ, CHANDRASEKAR |
Abstract: |
Cloud infrastructure interact with different components ensuring traffic free
environment for both the simple and complex traffic and provide services to
different type of vendors. Most previous studies conducted using prediction
based resource management aim at providing resources to the user using
statistical model that measure the traffic flow required for the future. However
in practice it is not effective to provide appropriate resources using
predictive analysis as it does not take into account the complex traffic
pattern. In addition, prediction framework in cloud zone is much less compact
for the business persons as it does not follow the dynamic flow generation part.
Methods for controlling the traffic in cloud using virtualization based cloud
data centers called as VMPlanner used the stepwise optimization approach. The
approach, VMPlanner failed to incorporate certain level of links redundancy,
where the communication latency was achieved during the data flow, but resulted
in network failures. In this paper, a framework called Command and Control
Traffic Flow with Resource Queuing (CCTF-RQ) is presented to facilitate business
persons eventually with dynamic data flow in cloud zone. The framework, CCTF-RQ
works well by obtaining the prior knowledge about the server and the client
system on the cloud for easy analyze of flow correlations. The proposed
framework, CCTF-RQ clusters the similar configuration of server and client
systems on the cloud and significantly solves the complex traffic occurring on
user communication patterns. Further, in cloud environment certain level of link
redundancy may deteriorate the network. We address the problem by clustering
exactly the similar command control traffic patterns in cloud zone followed by
which the session gets established to access the network without any
communication latency. An efficient algorithm called Banker’s Safety algorithm
is developed to overcome the network failures during data flow. CCTF-RQ
Framework on cloud zone is implemented with CloudSim in JAVA and experiment is
conducted on the parametric factors such as Server response time, CPU
utilization for correlation analysis, Mean Absolute Percentage Error, Traffic
rate. |
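The Banker's safety check named in the abstract is the classic
deadlock-avoidance test; a minimal sketch with illustrative resource vectors
(not the framework's actual data) looks like this:

```python
# Classic Banker's safety test: a state is safe if every process can finish
# in some order, releasing its allocation back to the available pool.
def is_safe(available, allocation, need):
    work = list(available)
    finished = [False] * len(allocation)
    progressed = True
    while progressed:
        progressed = False
        for i, (alloc, nd) in enumerate(zip(allocation, need)):
            if not finished[i] and all(n <= w for n, w in zip(nd, work)):
                work = [w + a for w, a in zip(work, alloc)]  # release resources
                finished[i] = True
                progressed = True
    return all(finished)

# Illustrative 3-process, 3-resource state (values invented for the example).
available  = [3, 3, 2]
allocation = [[0, 1, 0], [2, 0, 0], [3, 0, 2]]
need       = [[7, 3, 3], [1, 2, 2], [5, 0, 2]]
print(is_safe(available, allocation, need))
```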
Keywords: |
Cloud Infrastructure, Command and Control, Resource queuing, Banker’s Safety
algorithm, Server Configuration, Network Resources. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
SIMPLIFIED FIVE LEVEL INVERTER AS SERIES ACTIVE POWER FILTER FOR MINIMIZING THD
IN A NON-LINEAR SYSTEM |
Author: |
C.RAMAKRISHNAN , S.VIJAYAN , K.ARTHI |
Abstract: |
The series hybrid Active Power Filter (APF) has received considerable
attention in the area of power quality. The hybrid filter is implemented with
a three-phase simplified asymmetrical five-level inverter as the active filter
and an LC filter connected in parallel with the power lines acting as the
passive filter. The harmonics produced by the non-linear loads are filtered
out by the passive filter, and its filtering property is improved by the
active filter. The multilevel voltages are obtained from a simplified inverter
that operates at a lower switching frequency, which in turn reduces the
switching losses compared to a conventional cascaded five-level inverter. The
filter is developed to inject voltage in series and also for harmonic
isolation. Current harmonics are eliminated by controlling the APF with a P-Q
theory based control strategy developed for this simplified inverter. The
system produces a voltage of the same amplitude as, but opposite to, the load
harmonic voltage to nullify the harmonics injected into the system. For
current harmonics, the system produces a voltage proportional to the current
and offers a high-impedance path to the current harmonics. The validity of the
control scheme is verified by a simulation study in MATLAB/Simulink, and a
comparative analysis is carried out using the simulated values to keep the
harmonic levels within standard limits. |
Keywords: |
Active Filters, Harmonic Distortion, PQ theory, Multilevel Converters, Power
Quality |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
FUZZY C-MEANS AND ENTROPY BASED GENE SELECTION BY PRINCIPAL COMPONENT ANALYSIS
IN CANCER CLASSIFICATION |
Author: |
SOMAYEH ABBASI, HAMID MAHMOODIAN |
Abstract: |
Microarray analysis is used in human cancer diagnosis and tumor
classification. However, microarray data often have high dimensionality and
small sample size. Gene selection is a significant preprocessing step in the
discriminant analysis of microarray data, selecting the most informative genes
from thousands of genes. In this paper, a gene selection method is proposed
for cancer classification in two stages. First, an initial reduction of the
data by Fuzzy C-Means clustering (FCM) and a filter method (T-test) is
performed for dimensionality reduction. The next stage performs gene selection
based on an entropy measure of the eigenvalues from Principal Component
Analysis on a leave-one-out basis, called PCA-entropy. Colon cancer, leukemia
and lung datasets have been classified, based on the proposed gene selection
algorithm, by SVM and KNN classifiers. In most cases, the results show good
performance compared to other recent studies. |
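The entropy-of-eigenvalues measure can be sketched as follows; the random data
matrix is illustrative, and the paper's per-gene leave-one-out loop is
omitted:

```python
import numpy as np

# Sketch of a PCA-entropy score: the entropy of the normalized eigenvalue
# spectrum of the data covariance. Low entropy means a few components
# dominate; high entropy means variance is spread evenly.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))               # 20 samples x 5 "genes" (toy data)
Xc = X - X.mean(axis=0)                        # center the columns
eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))
p = eigvals / eigvals.sum()                    # normalized spectrum
entropy = -np.sum(p * np.log(p + 1e-12))       # small epsilon guards log(0)
print(round(float(entropy), 3))
```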
Keywords: |
Classification, Clustering, Entropy, Gene selection, Principal Component
Analysis. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
IMPROVED GENERAL SELF-ORGANIZED TREE-BASED ROUTING PROTOCOL FOR WIRELESS SENSOR
NETWORK |
Author: |
M.SENGALIAPPAN , Dr.A.MARIMUTHU |
Abstract: |
Wireless Sensor Networks (WSNs) are a promising structure used to assist the
provision of many military and industrial services. They have many
constraints, such as limited computational power, storage capacity and energy
supply; the most important issue is the energy constraint. Many issues hold
back the effectiveness of WSNs in supporting different applications, such as
the resource confines of sensor devices and finite battery power. To overcome
this problem and improve performance, we need not only to minimize total
energy consumption but also to balance the WSN load. In this research, a novel
tree-based routing protocol is proposed which builds a routing tree using a
process where, for each round, the BS assigns a root node and broadcasts this
selection to all sensor nodes. Subsequently, each node selects its parent by
considering only itself and its neighbors' information, thus making the
protocol dynamic. The simulation results show that the proposed approach
performs better than other existing approaches. |
Keywords: |
Wireless Sensor Network, Routing, Network Lifetime, Clustering, Tree
Based Routing |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
ADAPTIVE MODULATION IN RECONFIGURABLE PLATFORM |
Author: |
S. SELVAKUMAR, Dr.S.RAVI, M.VINOTH, R.KAMALAKNNAN, V.JAYAPRADHA |
Abstract: |
Autonomous modulation and detection in modern communication systems is done
using proper signal detection schemes and a prominent receiver structure. The
modulation schemes used in this paper are ASK, FSK, BPSK and QAM. The
modulation techniques are created in Simulink, converted into a Xilinx core,
and further transformed using the System Generator module. This results in the
generation of Verilog files, which are deployed on an FPGA. Using a
microcontroller, the FPGA is programmed with the respective bit files, and the
modulation with the best channel support is selected. The methodology used to
identify the best modulation for a particular link under reconfiguration is
called the adaptive modulation process. The interface between the
reconfiguring controller (STM32) and the reconfigured FPGA (XCS250-pq144) uses
the JTAG port. The programming files are created in XSVF format and placed on
the SD card of the microcontroller. The condition for changing from one
modulation to another is based on the link support and signal position. The
HyperTerminal displays the output corresponding to the different modulation
selections autonomously. Signal to Noise Ratio (SNR), available bandwidth and
Bit Error Rate (BER) are the factors responsible for the selection of the
modulation scheme. Hence, better quality of service, system complexity, power
efficiency, bandwidth efficiency and cost effectiveness are the advantages of
the adaptive modulation technique. |
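The selection logic can be sketched as a simple SNR-threshold rule; the
thresholds below are illustrative guesses, not the link-support conditions
measured in the paper:

```python
# Toy adaptive-modulation selector: pick the densest scheme the link can
# support at the current SNR. Threshold values are illustrative only.
def select_modulation(snr_db):
    if snr_db >= 20:
        return "QAM"
    if snr_db >= 12:
        return "BPSK"
    if snr_db >= 8:
        return "FSK"
    return "ASK"

for snr in (25, 15, 9, 3):
    print(snr, select_modulation(snr))
```

In the paper this decision drives which XSVF bit file the STM32 loads onto the
FPGA; only the decision rule itself is sketched here.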
Keywords: |
BER, FPGA, Microcontroller, Modulations, SNR, and Xilinx. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
PROPOSE AN INTEGRATION BETWEEN UML STATIC AND DYNAMIC MODELS USING
ENTITY-ATTRIBUTE-VALUE UNDER THE MDA CONTEXT |
Author: |
AHMED MOHAMMED ELSAWI, SHAMSUL SAHIBUDDIN, ABDELHAMID ABDELHADI |
Abstract: |
The Model Driven Architecture (MDA) adopts models to improve software
productivity, reusability, maintainability and quality by focusing on models
and metamodels in place of conventional code. The MDA separates the technical
details from the business logic in two different models: the Platform
Independent Model (PIM) is concerned with the business logic, while the
Platform Specific Model (PSM) focuses on the targeted platform. Normally, PIM
and PSM models stand at different levels of abstraction, and moving from one
level of abstraction to another is achieved by model transformation. Both PIM
and PSM are modeled using UML diagrams. The UML supports a variety of diagrams
that can be categorized into static and dynamic diagrams. The static diagrams
normally target the system's structure and are commonly used to define the PIM
and PSM models; the dynamic diagrams target the system's behavior and its
dynamic elements. To successfully develop complete software using the MDA
methodology, all structural and behavioral elements should be captured. Hence,
different versions of the PIM and PSM models must be employed to cover the
structural and behavioral elements of the system, and consequently, beside
time, cost and complexity issues, a considerable number of model
transformation iterations are required for each version separately. In the
face of these issues, this work addresses the integration between UML
behavioral and structural diagrams using the Entity-Attribute-Value (EAV)
model. We also present an example showing that the proposed concept not only
allows integration between UML static and behavioral models but also offers
flexibility in integrating models at different levels of abstraction. |
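A minimal Entity-Attribute-Value store shows how one generic triple table can
hold both static and behavioral facts about a model element; the entity,
attributes and values below are hypothetical examples, not the authors'
metamodel:

```python
# Generic EAV triples: (entity, attribute, value). One flat store captures
# both class-diagram facts and state-machine facts for the same element.
eav = [
    ("Order", "attribute",  "totalPrice"),       # static fact (class diagram)
    ("Order", "operation",  "checkout()"),       # static fact
    ("Order", "state",      "AwaitingPayment"),  # behavioral fact (state machine)
    ("Order", "transition", "pay -> Paid"),      # behavioral fact
]

def values_of(entity, attribute):
    return [v for e, a, v in eav if e == entity and a == attribute]

print(values_of("Order", "state"))
```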
Keywords: |
UML Models Integration, behavioral models, Static Models, MDA, EAV |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
CLASSIFICATION OF ARM MOVEMENT BASED ON UPPER LIMB MUSCLE SIGNAL FOR
REHABILITATION DEVICE |
Author: |
M.H.JALI, I.M.IBRAHIM, Z.H.BOHARI, M.F.SULAIMA, M.N.M.NASIR |
Abstract: |
A rehabilitation device is used as an exoskeleton for people who experience
limb failure. An arm rehabilitation device may ease the rehabilitation
programme for those who suffer from arm dysfunction. The device used to
facilitate the tasks of the programme should improve the electrical activity
in the motor unit while minimising the mental effort of the user.
Electromyography (EMG) is the technique for analysing the presence of
electrical activity in musculoskeletal systems. In disabled persons, the
electrical activity in the muscles fails to contract the muscles for movement.
To prevent paralysed muscles from becoming spastic or flaccid, the force of
movements has to be achieved with minimal mental effort. To minimise the use
of cerebral strength, EMG signals from able-bodied subjects are analysed
before implementation in the device. The signals are collected according to
the procedure of surface electromyography for the non-invasive assessment of
muscles (SENIAM). The EMG signals are used to set the movement patterns of the
arm rehabilitation device. The filtered signal is further processed by
extracting the following features: Standard Deviation (STD), Mean Absolute
Value (MAV), Root Mean Square (RMS), Zero Crossing (ZCS) and Variance (VAR).
Feature extraction reduces the EMG data to a feature vector, minimising the
signal error when implemented in a classifier. The features are classified
with the SOM Toolbox in MATLAB, and the extracted features of the EMG signals
are classified into several degrees of arm movement, visualised in U-matrix
form. |
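The time-domain features listed above have standard definitions, which can be
sketched over a toy window (the sample values are invented; real signals
follow the SENIAM electrode protocol):

```python
import math

# Standard time-domain EMG features over one analysis window.
def emg_features(x):
    n = len(x)
    mean = sum(x) / n
    mav = sum(abs(v) for v in x) / n                     # Mean Absolute Value
    rms = math.sqrt(sum(v * v for v in x) / n)           # Root Mean Square
    var = sum((v - mean) ** 2 for v in x) / (n - 1)      # Variance (sample)
    std = math.sqrt(var)                                 # Standard Deviation
    zcs = sum(1 for a, b in zip(x, x[1:]) if a * b < 0)  # Zero Crossings
    return {"MAV": mav, "RMS": rms, "VAR": var, "STD": std, "ZCS": zcs}

window = [0.1, -0.2, 0.3, -0.1, 0.2, -0.3]  # invented EMG samples
print(emg_features(window))
```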
Keywords: |
Arm Rehabilitation Device, Electromyography, Self-Organizing Map, Time Domain
Features Extraction, Exoskeleton |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
UPGRADING THE SEMANTICS OF THE RELATIONAL MODEL FOR RICH OWL 2 ONTOLOGY LEARNING |
Author: |
BOUCHRA EL IDRISSI, SALAH BAINA, KARIM BAINA |
Abstract: |
This paper is concerned with ontology learning from relational databases
(RDBs), which exploits already-approved semantics about a domain and
translates them into an application ontology. However, the relational model is
recognized to be less expressive and incapable of supporting some
conceptualizations. Without an explicit model of the domain semantics in the
relational model, automatic ontology learning risks inferring incorrect
semantics. In this paper, we give some proof case studies and propose a model
to upgrade the semantics of the relational model before ontology learning. The
paper presents the constructs of the proposed model and shows how they are
translated to constructs of an OWL 2 ontology. |
Keywords: |
Ontology Learning, Relational Database, Semantic Enrichment, OWL 2. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
PERFORMANCE & ANALYSIS OF BLOCKING ARTIFACTS REDUCTION USING GAUSSIAN ADAPTIVE
SPATIAL LOW PASS FILTER |
Author: |
Dr.M.ANTO BENNET, R.SRINATH, P.MARAGATHAVALLI, B.SUBHASHINI |
Abstract: |
Ringing artifacts appear as spurious signals near sharp transitions in a
signal and show up as ghosts near edges. Two techniques, an LPF and an ASLPF,
are used. The LPF is used only to identify the artifacts present in the video
image. The ASLPF (Adaptive Spatial Low Pass Filter) is used to identify
whether the artifacts are present in horizontal or vertical positions or near
edges. The DCT is applied to each pixel block, converting the spatial domain
into the frequency domain. The ASLPF is then used to identify the artifacts
and remove them completely. The video is converted into frames, and each frame
is compressed to remove the artifacts. Finally, the reduction of artifacts is
reported in terms of PSNR. |
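A one-dimensional sketch of a Gaussian low-pass kernel of the kind the title
refers to; the kernel size, sigma and test signal are illustrative choices,
not the paper's filter design:

```python
import math

# Normalized 1-D Gaussian low-pass kernel.
def gaussian_kernel(size=5, sigma=1.0):
    half = size // 2
    raw = [math.exp(-(i * i) / (2 * sigma * sigma))
           for i in range(-half, half + 1)]
    s = sum(raw)
    return [v / s for v in raw]

# Convolve with edge clamping at the borders.
def smooth(signal, kernel):
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - half, 0), len(signal) - 1)
            acc += w * signal[j]
        out.append(acc)
    return out

edge = [0, 0, 0, 10, 10, 10]  # a sharp transition of the kind that rings
print([round(v, 2) for v in smooth(edge, gaussian_kernel())])
```

Smoothing spreads the step over neighbouring samples, which is why low-pass
filtering suppresses ringing at the cost of softening the edge.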
Keywords: |
Low Pass Filter, Adaptive Spatial Low Pass Filter, Discrete Cosine
Transform, Peak Signal to Noise Ratio, Blocking Artifacts and Ringing Artifacts. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
DEVELOPING A SAAS-CLOUD INTEGRATED DEVELOPMENT ENVIRONMENT (IDE) FOR C, C++, AND
JAVA |
Author: |
A.B. MUTIARA, R. REFIANTI, B.A WITONO |
Abstract: |
The cloud era brought a revolution to the world of computing: people can
access their data from anywhere, at any time, with different devices. One of
the cloud's service models is Software as a Service (SaaS), which is capable
of providing applications that run on a cloud infrastructure. An IDE
(Integrated Development Environment) is the most popular tool for developing
applications on a network or a single computer. Installing an IDE on each
computer of a network can consume a lot of time and budget. The objective of
this research is to develop an efficient cloud-based IDE. The IDE compiles
code sent from the client browser through the SaaS IDE to the server and sends
the result back to the client. The methods used in the research are the
waterfall System Development Life-Cycle and the Unified Modeling Language as
the system design tool. The research successfully produced a cloud-based SaaS
IDE with excellent results from several tests on a local network and over the
Internet. |
Keywords: |
Cloud, IDE, SaaS, UML |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Title: |
RECOGNITION OF DIABETIC RETINOPATHY USING BLOOD VESSELS FROM VIDEO SEQUENCES |
Author: |
D.STALIN ALEX, Dr.AMITABH WAHI, Dr.J GEORGE CHELLIN CHANDRAN, A.USHA RUBY,
Dr.R.REVATHI |
Abstract: |
Diabetic Retinopathy (DR) is a condition occurring in persons with elevated
blood sugar that causes progressive damage to the retina. In its early stages,
Diabetic Retinopathy may cause no symptoms or only mild vision problems;
eventually, however, it can result in blindness. It is mainly due to the
development of abnormal blood vessels in the retina. In this approach, we
propose an efficient method to detect these blood vessels. The main focus of
this paper is that, instead of using still images, the blood vessels are
extracted from video sequences. Contrast enhancement is performed on the color
image, followed by thresholding, which helps preserve the local contrast
characteristics dynamically, and median filtering is applied to smooth the
background noise. The results of the proposed algorithm show a considerable
improvement in the detection of blood vessels. |
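The per-frame pipeline the abstract describes (contrast enhancement, thresholding, median filtering) can be sketched as follows. This is an illustrative stand-in, not the paper's exact CCE algorithm; the threshold value and the use of the green channel are assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

def extract_vessels(frame, thresh=0.5):
    """Rough per-frame vessel mask: contrast stretch, smoothing, threshold."""
    # Use the green channel, where retinal vessels show the highest contrast.
    g = frame[..., 1].astype(float)
    # Simple global contrast stretch (stand-in for the paper's CCE step).
    g = (g - g.min()) / (g.max() - g.min() + 1e-9)
    # Median filtering to smooth the background noise, as in the abstract.
    g = median_filter(g, size=3)
    # Vessels are darker than the background, so keep low-intensity pixels.
    return g < thresh

frames = np.random.rand(4, 32, 32, 3)          # stand-in video sequence
masks = np.stack([extract_vessels(f) for f in frames])
print(masks.shape)  # (4, 32, 32)
```

Applying the same mask extraction over consecutive frames is what lets video-based detection accumulate evidence that a single still image cannot provide.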
Keywords: |
Diabetic Retinopathy (DR), Color Contrast Enhancement (CCE), Smoothening, Video
Processing (VP), Object Extraction (OE). |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Full
Text |
|
Title: |
A MMKP BASED HEURISTIC FOR QUALITY ENHANCED SCALABLE VIDEO STREAMING OVER
WIRELESS NETWORKS |
Author: |
SWARNA PARVATHI, RAJU DAS, EASWARAKUMAR, KALAIARASI |
Abstract: |
Scalable Video Coding (SVC) is a very promising encoding technique that adapts
streaming video to wireless networks with bandwidth fluctuations. This paper
proposes a Bandwidth Aware Layered Streaming Algorithm (BALSA), an MMKP
(Multi-dimensional Multiple-Choice Knapsack Problem) based GHPMH (Gradational
Hull Pareto Minimization Heuristic) for streaming scalable video to
heterogeneous users with different client displays under bandwidth limitations,
so as to maximize the average video quality over all the streams. Using
extensive simulation, we show that our algorithm finds solutions that are close
to the optimal (within 1 dB) under realistic conditions with reduced
computational complexity. |
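The MMKP view of layered streaming is: each stream is a group, each layer configuration is one option in that group, and exactly one option per group must fit a shared bandwidth budget. The sketch below is a generic greedy MMKP heuristic under those assumptions, not the paper's GHPMH; the (quality, bitrate) numbers are hypothetical.

```python
def mmkp_greedy(options, budget):
    """Greedy MMKP sketch: pick one (quality, cost) option per stream,
    maximizing total quality under a shared bandwidth budget."""
    # Start every stream at its cheapest option (the base layer).
    pick = [min(opts, key=lambda o: o[1]) for opts in options]
    spent = sum(c for _, c in pick)
    improved = True
    while improved:
        improved = False
        best = None  # (quality gain per extra bit, stream index, option)
        for i, opts in enumerate(options):
            q0, c0 = pick[i]
            for q, c in opts:
                if q > q0 and spent - c0 + c <= budget:
                    ratio = (q - q0) / (c - c0) if c > c0 else float("inf")
                    if best is None or ratio > best[0]:
                        best = (ratio, i, (q, c))
        if best:
            _, i, opt = best
            spent += opt[1] - pick[i][1]
            pick[i] = opt
            improved = True
    return pick, spent

# Two streams, each with hypothetical (quality, bitrate) layer configurations.
streams = [[(30, 200), (35, 400), (38, 700)],
           [(28, 150), (33, 350)]]
sel, used = mmkp_greedy(streams, budget=800)
print(sel, used)
```

Each pass upgrades the stream offering the best quality gain per extra bit, which is the usual greedy relaxation of the (NP-hard) exact MMKP.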
Keywords: |
MMKP (Multi-dimensional Multiple Choice Knapsack Problem), Scalable Video
Streaming, Pareto Minimization |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Full
Text |
|
Title: |
PRIORITIZED DIRECTIONAL BROADCAST TECHNIQUE FOR MESSAGE DISSEMINATION IN VANETS |
Author: |
S. LAKSHMI , Dr. R.S.D.WAHIDA BANU |
Abstract: |
In a Vehicular Ad Hoc Network (VANET), accidents can occur within a cluster
when a driver does not react rapidly. To resolve this problem, in this paper we
propose a Prioritized Directional Broadcast Technique for message dissemination
in VANETs. Initially, a message priority assignment technique is used in which
three levels of message priority are considered: very urgent, urgent, and
general messages. A binary partition phase is then performed to find the
candidate relay node inside the coverage area of the source. Simulation results
show that the proposed approach provides high reliability during emergency
message dissemination. |
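The three-level priority scheme from the abstract can be modelled with a simple priority queue; this is only a data-structure illustration under assumed level names, not the paper's broadcast protocol.

```python
import heapq
from itertools import count

# Three message classes from the abstract: very urgent < urgent < general.
PRIORITY = {"very_urgent": 0, "urgent": 1, "general": 2}
_seq = count()  # tie-breaker keeps FIFO order within a priority level

def push(queue, level, msg):
    heapq.heappush(queue, (PRIORITY[level], next(_seq), msg))

q = []
push(q, "general", "traffic update")
push(q, "very_urgent", "collision ahead")
push(q, "urgent", "lane closure")
order = [heapq.heappop(q)[2] for _ in range(len(q))]
print(order)  # ['collision ahead', 'lane closure', 'traffic update']
```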
Keywords: |
Vehicular Networks, Data Dissemination, Clustering. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Full
Text |
|
Title: |
LTE STANDARD: CHANNEL ESTIMATION ALGORITHMS FROM THE BASE STATION TO THE
TERMINAL |
Author: |
ABDELHAMID LARAKI & DRISS EL OUADGHIRI, ABDELLAH JAMAL |
Abstract: |
This report deals with the LTE downlink transmission scheme, from the base
station to the terminal (the mobile phone), which is based on multicarrier
modulation: OFDM (Orthogonal Frequency Division Multiplexing). LTE also supports
the use of multiple antennas at both the base station and the terminal to
improve communication performance: Multiple Input – Multiple Output (MIMO)
antenna processing.
The project focuses on the different methods for estimating the time-varying
channel between the transmitter and the receiver: to carry out coherent
demodulation, the mobile terminal requires estimates of the downlink channel,
and to allow this, known symbols are inserted in the transmitted signal. |
Keywords: |
Analog Baseband - Orthogonal Frequency-Division Multiplexing (OFDM) - End Module
- LTE |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Full
Text |
|
Title: |
AN APPROACH FOR IMAGE DENOISING USING NLM IN TERMS OF ANISOTROPIC DIFFUSION |
Author: |
GANESH NAGA SAI PRASAD V , HABIBULLA KHAN , E.GOPINATHAN |
Abstract: |
In a traditional single-view photograph, dynamic objects or camera movement
cause motion noise. Digital image denoising is a prominent field in signal
processing, focusing on improving the quality of images suffering from various
degradation effects such as noise. Denoising usually requires modelling the
image content in order to separate the true image content from the degradation
effects and restore the degradation-free content. Restoring image sequences can
obtain better results than restoring each image individually, provided the
temporal redundancy is adequately used. However, in image sequence denoising,
estimating the motion patterns between frames so that data from multiple frames
can be merged is very complex; as a result, motion estimation, a severely
under-determined problem, tends to be error-prone and inaccurate. In this
paper, we propose an image denoising algorithm that gives better results than
the basic NLM algorithm. |
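For reference, the baseline the paper improves on is pixel-wise Non-Local Means: each pixel becomes a weighted average of pixels whose surrounding patches look similar, with weights decaying exponentially in the patch distance. This is a minimal still-image sketch with assumed parameter values, not the paper's proposed algorithm.

```python
import numpy as np

def nlm_denoise(img, patch=1, search=3, h=0.1):
    """Basic pixel-wise Non-Local Means. patch/search are half-widths;
    h controls how fast the similarity weights decay."""
    pad = patch + search
    p = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + pad, j + pad
            ref = p[ci - patch:ci + patch + 1, cj - patch:cj + patch + 1]
            w_sum, acc = 0.0, 0.0
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    cand = p[ci + di - patch:ci + di + patch + 1,
                             cj + dj - patch:cj + dj + patch + 1]
                    # Weight decays with the mean squared patch difference.
                    w = np.exp(-np.mean((ref - cand) ** 2) / h ** 2)
                    w_sum += w
                    acc += w * p[ci + di, cj + dj]
            out[i, j] = acc / w_sum
    return out

rng = np.random.default_rng(1)
clean = np.zeros((16, 16)); clean[:, 8:] = 1.0   # step edge
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
den = nlm_denoise(noisy)
# The denoised image should be closer to the clean one than the noisy input.
print(np.mean((den - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

Because cross-edge patches differ strongly, their weights collapse to near zero, which is why NLM averages away noise while keeping edges sharp.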
Keywords: |
Denoising, NLM, Anisotropic Diffusion, Degradation, PSNR. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Full
Text |
|
Title: |
OCCURRENCE BASED CATEGORICAL DATA CLUSTERING USING COSINE AND BINARY MATCHING
SIMILARITY MEASURE |
Author: |
S. ANITHA ELAVARASI, J. AKILANDESWARI |
Abstract: |
Clustering is the process of grouping a set of physical objects into classes of
similar objects. Objects in the real world contain both numerical and
categorical data. Categorical data cannot be analyzed as numerical data because
of the absence of an inherent ordering. This paper describes an occurrence
based categorical data clustering (OBCDC) technique based on the cosine
similarity measure and the simple binary matching similarity measure. The OBCDC
system consists of four modules: data pre-processing, similarity matrix
generation, cluster formation, and validation. Similarity matrix generation
uses three functions, namely FrequencyComputation, OccurranceBasedCosine and
OccurranceBasedSBMS. The time complexity of the various algorithms is
discussed, and their performance on real-world data is measured using accuracy
and error rate. |
Keywords: |
Clustering, Unsupervised Learning, Categorical Data, Cosine Similarity, Simple
Binary Matching Similarity |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Full
Text |
|
Title: |
EMBEDDED ZERO TREE WAVELET AND ORTHOGONAL POLYNOMIAL BASED TRANSFORM CODING |
Author: |
T.KARTHIKEYAN , B.PRABURAJ , K.KESAVAPANDIAN |
Abstract: |
In this paper, a new and effective mode of transform coding for images based on
orthogonal polynomials is proposed. The proposed orthogonal-polynomial-based
transform coding system has an encoder comprising a polynomial transform
operation followed by quantization of the transform coefficients and entropy
coding of the quantized coefficients. The Embedded Zerotree Wavelet transform
is a lossy image compression algorithm: at low bit rates most of the
coefficients produced by a subband transform will be zero, or very close to
zero, and it provides extensive improvement in picture quality at higher
compression ratios. After the encoded bit stream of an input image is
transferred over the channel, the decoder reverses all the operations applied
in the encoder and tries to reconstruct a decoded image that looks as close as
possible to the original input image. The results of the proposed coding are
compared with Embedded Zero Tree Wavelet and Discrete Wavelet Transform coding.
The new image coding algorithm results in a considerable reduction in
computation time and provides better reconstructed picture quality. The
experimental results also show that an efficient bit-rate reduction is achieved
when compared with existing techniques. |
Keywords: |
Orthogonal Polynomials, Quantized Coefficients, EZW, Lossy Image Compression,
Compression Ratio. |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Full
Text |
|
Title: |
APPLYING THE FVSI-GENERATION TRACING AND HYBRID ANT COLONY ALGORITHM FOR
EFFECTIVE STATIC GENERATION POWER DISPATCH |
Author: |
Z. HAMID, I. MUSIRIN |
Abstract: |
This paper proposes a novel application of the power tracing technique for
voltage stability improvement. Instead of solving the loss and charge
allocation problem in a deregulated power market, as performed by many
researchers, the proposed technique can be applied to effective static
generation power dispatch; that is, determining the generating units suitable
for re-dispatching with the aim of providing more economical and effective
power generation. Using a power tracing approach, suitable generating units
are determined by means of stability index tracing, specifically Fast Voltage
Stability Index-Generation Tracing (FVSI-GT). After deriving a ranking list of
generator buses via FVSI-GT, the real and reactive power of the selected
generators are sized using a new hybrid algorithm, the Blended Crossover
Continuous Ant Colony Optimization (BX-CACO). Simulation and experiments on
the IEEE 57-bus reliability test system (RTS) justified the reliability of
FVSI-GT for precise selection of suitable generators, compared with other
conventional ranking methods. In addition, the proposed BX-CACO showed
tremendous performance in terms of convergence speed and solution quality. |
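The "blended crossover" ingredient of BX-CACO is, in the genetic-algorithm literature, the BLX-α operator: a child gene is drawn uniformly from the parents' interval extended by a fraction α on each side. The sketch below shows only that operator under assumed values; the continuous ant colony search it is embedded in, and the actual generator data, are omitted.

```python
import random

def blend_crossover(p1, p2, alpha=0.5, seed=42):
    """BLX-alpha (blend) crossover between two real-valued parents."""
    rng = random.Random(seed)
    child = []
    for x, y in zip(p1, p2):
        lo, hi = min(x, y), max(x, y)
        d = hi - lo
        # Sample from the parents' interval widened by alpha on each side.
        child.append(rng.uniform(lo - alpha * d, hi + alpha * d))
    return child

# Two hypothetical generator dispatch settings (one value per generator).
parent1 = [120.0, 35.0, 80.0]
parent2 = [150.0, 20.0, 95.0]
child = blend_crossover(parent1, parent2)
print(all(min(a, b) - 0.5 * abs(a - b) <= c <= max(a, b) + 0.5 * abs(a - b)
          for a, b, c in zip(parent1, parent2, child)))  # True
```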
Keywords: |
BX-CACO, FVSI-GT, Generation Power Dispatch, Stability Index Tracing |
Source: |
Journal of Theoretical and Applied Information Technology
10th October 2014 -- Vol. 68. No. 1 -- 2014 |
Full
Text |
|
|
|