|
Submit Paper / Call for Papers
The journal receives papers in a continuous flow and considers articles
from a wide range of Information Technology disciplines, from basic research to
the most innovative technologies. Please submit your papers electronically
through our submission system at http://jatit.org/submit_paper.php in MS Word,
PDF, or a compatible format so that they may be evaluated for publication in
the upcoming issue. This journal uses a blinded review process; please include
all personally identifiable information in the manuscript when submitting it
for review, and we will remove the necessary information on our side.
Submissions to JATIT should be full research / review papers (properly
indicated below the main title).
|
|
|
Journal of Theoretical and Applied Information Technology
January 2014 | Vol. 59 No. 3 |
Title: |
ADAPTIVE COLOR FILTER ARRAY INTERPOLATION ALGORITHM BASED ON HUE TRANSITION AND
EDGE DIRECTION |
Author: |
E. SREE DEVI, Dr. B. ANAND |
Abstract: |
Most digital cameras in use today have a single sensor with an array of
color filters to capture digital images. A Color Filter Array (CFA) uses
alternating color filters to sample only one color at each pixel location,
which reduces cost. Across the entire image, only one third of the sampled
pixels carry genuine values from the sensor; the remaining pixels are
interpolated from these genuine values. In this work, a new adaptive CFA
interpolation model is proposed. Pixels are divided into edge pixels and
non-edge pixels. If a pixel is not on an edge, smooth hue transition
interpolation is used; for pixels on an edge, an edge-sensing interpolation
algorithm is used. A mathematical model is proposed for interpolation based on
the direction of the edges. Two sample images are taken and reconstructed by
hue transition interpolation, edge-sensing interpolation, and the combined
approach, and PSNR is used to compare the results of the proposed technique. |
Keywords: |
CFA Interpolation, Bayer Filter, Hue Transition, Edge sensing, Adaptive
Interpolation Algorithm |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
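The abstract above compares reconstructions by PSNR. As a minimal illustrative
sketch (not the authors' code), PSNR for 8-bit pixel data follows directly from
the mean squared error between original and interpolated values; the sample
pixel values below are invented:

```python
import math

def psnr(original, reconstructed, peak=255):
    """Peak signal-to-noise ratio in dB between two equal-length pixel sequences."""
    mse = sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / len(original)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

# Tiny example: four pixels of an original row vs. an interpolated row
quality = psnr([52, 55, 61, 59], [52, 57, 60, 59])  # higher is better
```

A perfect reconstruction gives infinite PSNR, which is why comparisons are
reported on lossy interpolations only.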
|
Title: |
STATISTICAL COMPLEXION-BASED FILTERING FOR REMOVAL OF IMPULSE NOISE IN COLOR
IMAGES |
Author: |
P.VENKATESAN, Dr.S.K.SRIVATSA |
Abstract: |
A statistical complexion-based filtering technique, named the Adaptive
Statistical Complexion-based Filtering (ASCF) technique, is presented for the
removal of impulse noise in degraded color images. In contrast with
traditional noise detection techniques, where only 1-D numerical information
is used for noise detection and estimation, an innovative noise detection
scheme is proposed based on the statistical personality and features (i.e.,
the 2-D information) of the degraded pixel or pixel region, leading to
effective and efficient noise detection and estimation. A progressive
restoration mechanism is devised using multipass nonlinear operations that
adapt to the intensity and type of the noise. Extensive experiments conducted
on a wide range of test color images have shown that the ASCF is superior to a
number of existing well-known standard techniques in terms of average image
restoration performance criteria, including objective measurements, visual
image quality, and computational complexity. |
Keywords: |
color image restoration, impulse noise detection, progressive filtering. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
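The ASCF method itself is not reproduced here, but a much-simplified
detect-then-filter step for salt-and-pepper impulse noise (flag extreme pixel
values, replace them with the 3x3 neighborhood median) conveys the general
shape of such schemes; the detector threshold and sample image below are
illustrative assumptions:

```python
import statistics

def remove_impulses(img):
    """Replace extreme-valued interior pixels (0 or 255) with their 3x3 median."""
    out = [row[:] for row in img]
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            if img[r][c] in (0, 255):  # crude impulse detector (assumed rule)
                window = [img[r + dr][c + dc] for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
                out[r][c] = statistics.median(window)
    return out

noisy = [[12, 13, 11],
         [14, 255, 12],
         [11, 13, 12]]
clean = remove_impulses(noisy)  # the 255 impulse becomes the local median
```

ASCF's adaptive, multipass restoration goes well beyond this single median
pass, which is the point of the paper's comparison.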
|
Title: |
ILLUMINATION NORMALIZATION USING LOCAL GRAPH STRUCTURE |
Author: |
HOUSAM KHALIFA BASHIER, LIEW TZE HUI, MOHD FIKRI AZLI ABDULLAH, IBRAHIM YUSOF,
MD SHOHEL SAYEED, AFIZAN AZMAN, SITI ZAINAB IBRAHIM |
Abstract: |
The problem of illumination variation is one of the major problems in
image processing, pattern recognition, medical imaging, etc.; hence there is a
need to handle such variations. This paper presents a novel and efficient
algorithm for image illumination correction called local graph structure
(LGS). LGS features are derived from a general definition of texture in a
local graph neighborhood. The idea of LGS comes from a dominating set for a
graph of the image. Experimental results on the ORL face database demonstrate
the effectiveness of the proposed method. The new LGS method stabilizes more
quickly and obtains a higher correct rate compared to the local binary pattern
(LBP). Finally, LGS is simple and can easily be applied as a preprocessing
step in many fields, such as image processing, pattern recognition, and
medical imaging. |
Keywords: |
Local Graph Structure, Feature Extraction, Pattern Recognition, Illumination
Variation, Local Binary Pattern, Texture Classification. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
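LGS itself is the paper's contribution and is not reproduced here; for
reference, the LBP baseline it is compared against encodes each 3x3
neighborhood as an 8-bit code by thresholding the neighbors at the center
value. The bit ordering below is one common convention, and the patch is
invented:

```python
def lbp_code(patch):
    """8-bit LBP code of a 3x3 patch: each neighbor >= center sets one bit."""
    center = patch[1][1]
    # clockwise neighbor order starting at the top-left corner
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    return sum((patch[r][c] >= center) << i for i, (r, c) in enumerate(order))

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
code = lbp_code(patch)  # histogram of such codes forms the texture feature
```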
|
Title: |
FEATURE SELECTION BASED ON THE CLASSIFIER MODELS: PERFORMANCE ISSUES IN THE
PRE-DIAGNOSIS OF LUNG CANCER |
Author: |
K.BALACHANDRAN, DR. R.ANITHA |
Abstract: |
Dimensionality reduction is generally carried out to reduce the complexity
of computations in large data set environments by removing redundant or
dependent attributes. For lung cancer disease prediction in the pre-diagnosis
stage, symptoms and risk factors are the main information carriers, and the
large number of symptom and risk attributes poses a major computational
challenge. In this study an attempt is made to compare the performance of the
attribute selection models before and after applying the classifier models. A
total of 16 classifier models are selected based on their relevance to the
chosen data types, drawing on statistical, rule-based, logic-based, and
artificial neural network approaches. Feature set selection and ranking of
attributes are done for the individual models. The models' prediction outcomes
in supervised training mode are derived from the confusion matrix parameters,
and the confusion matrix of each model is computed before and after
dimensionality reduction. Models are compared based on weighted Receiver
Operating Characteristics. Normalized weights are assigned based on the
results of the individual models, and a predictive model is developed. The
predictive model's performance is studied with the target under a supervised
classifier model, and it is observed to tally with the expected outcome. |
Keywords: |
Lung Cancer, Pre-Diagnosis, Data Mining, Artificial Neural Network,
Classifier, Feature Selections |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
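The classifier comparison above rests on measures derived from the binary
confusion matrix. As a quick sketch (the counts below are illustrative, not
the paper's data):

```python
def cm_metrics(tp, fp, fn, tn):
    """Standard measures derived from a binary confusion matrix."""
    return {
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
        "sensitivity": tp / (tp + fn),   # a.k.a. recall / true positive rate
        "specificity": tn / (tn + fp),
        "precision":   tp / (tp + fp),
    }

m = cm_metrics(tp=40, fp=5, fn=10, tn=45)  # invented counts
```

Weighted ROC comparison, as used in the paper, combines such per-model rates
across operating points.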
|
Title: |
EVALUATION OF VARIOUS MULTICARRIER SPWM STRATEGIES FOR SINGLE PHASE SEVEN LEVEL
CASCADED QUASI-Z-SOURCE INVERTER |
Author: |
T.SENGOLRAJAN, B.SHANTHI, S.P.NATARAJAN |
Abstract: |
This paper focuses on the performance analysis of various Multicarrier
Pulse Width Modulation (MCPWM) strategies with a sinusoidal reference for a
single-phase seven-level cascaded quasi-Z-source inverter (qZSI). The qZSI is
a new topology derived from the traditional Z-source inverter (ZSI), employing
an impedance network that couples the source and the inverter to achieve
voltage boost and inversion. The qZSI inherits all the advantages of the ZSI:
it can realize buck/boost, inversion, and power conditioning in a single stage
with improved reliability. In addition, the proposed qZSI has the unique
advantages of lower component ratings and constant dc current from the source.
The cascaded quasi-Z-source based MLI strategy enhances the fundamental output
voltages, particularly at lower modulation index ranges, with a reduction in
Total Harmonic Distortion (THD). Performance factors such as %THD and VRMS
were measured, and CF and DF of the output voltage are calculated for
modulation indices from 0.8 to 1; the results are compared. The simulation
results indicate that the use of the quasi-Z-source in the CMLI boosts the
total output voltage by 45%. The PODPWM strategy provides low THD, and the
COPWM strategy is found to perform better since it provides a relatively
higher fundamental VRMS output voltage. |
Keywords: |
Cascaded Multilevel Inverter (CMLI), Z-source inverter (ZSI),
quasi-Z-source inverters (qZSI), Multicarrier Pulse width modulation (MCPWM),
Total Harmonic Distortion (THD), Shoot-through, Buck-Boost |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
|
Title: |
DESIGN AND SIMULATION OF RECONFIGURABLE MULTI WAVELENGTH VCSEL USING MEMS |
Author: |
M.SHANMUGAPRIYA, M.MEENAKSHI |
Abstract: |
This paper describes a novel structure for a multi-wavelength
reconfigurable Vertical Cavity Surface Emitting Laser (VCSEL) with tunable
Fabry-Perot cavities. The device is designed to operate in the C band.
Tunability is achieved by varying the cavity length of the filter using a
MEMS-based electrostatic actuator, so that the filter resonates at discrete
wavelengths. The active layer of the VCSEL is modeled with three quantum wells
and is sandwiched between DBR mirrors to give high gain. The tunable portion
consists of an array of four filters to give different wavelengths. A detailed
analysis of the filter is carried out and its performance is analysed using
Intellisuite software. |
Keywords: |
VCSEL, Fabry Perot Filter, MEMS, Electrostatic Actuator, Tunable Filter |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
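The tuning principle can be sketched numerically: a Fabry-Perot cavity of
optical length nL resonates at wavelengths lambda = 2nL/m, so varying L with
the electrostatic actuator shifts which modes fall inside the C band. The
refractive index and cavity length below are illustrative assumptions, not the
paper's design values:

```python
def resonances_in_band(n, cavity_nm, lo=1530.0, hi=1565.0):
    """Fabry-Perot resonances lambda = 2*n*L/m (in nm) falling inside [lo, hi]."""
    out, m = [], 1
    while True:
        lam = 2 * n * cavity_nm / m
        if lam < lo:        # wavelengths only shrink as the mode order m grows
            return out
        if lam <= hi:
            out.append(lam)
        m += 1

# An assumed 775 nm air gap (n = 1) puts its m = 1 mode at 1550 nm, mid C-band
modes = resonances_in_band(1.0, 775.0)
```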
|
Title: |
CONCEPTION OF THROUGHPUT BY PREVENTING FLOOD ATTACK IN NETWORK LAYER |
Author: |
C.DHIVYA DEVI, G.NANTHA KUMAR, DR. A.AROKIASAMY |
Abstract: |
To achieve their target potential, wireless sensor networks require
security during the communication of nodes in a specified area, with secure
connections between the nodes during the transmission of packets. As far as
security is concerned, wireless sensor networks have been deployed in several
environments where providing a genuine barrier against malicious nodes in the
specified area is a major challenge. Among these many disputes, in this
article we primarily focus on the security of wireless sensor networks by
creating and establishing the impact of attacker nodes in the simulated area
using the AODV protocol, modified to create malicious nodes through Black hole
AODV. Further, we propose to raise the level of security in the simulated
network area by introducing a key generation technique and an intrusion
detection system, monitoring the throughput, which is evaluated by varying the
number of connections while keeping the number of attackers constant
throughout the simulated network area. |
Keywords: |
Hello Flood Attack, Attackers, Ad-Hoc Routing, Security, Authentication,
Intrusion Detection System, Throughput. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
|
Title: |
FOREMOST SECURITY APPREHENSIONS IN CLOUD COMPUTING |
Author: |
H. KAMAL IDRISSI, A. KARTIT, M. EL MARRAKI |
Abstract: |
In recent years, one term keeps coming up in the sphere of IT. This new
concept, called cloud computing, is probably the most noteworthy evolution
since the arrival of the web. Many business leaders have decided to invest in
this architecture in order to save on material and human resources. Cloud
computing can outsource the IT hub of a company so that it can focus on its
own business. But cloud computing is not only full of benefits. Indeed, it is
still subject to several security threats; security must now be implemented at
a large scale, so the transition to the cloud should be accompanied by
adaptations that carry over the known security arsenal. Besides, billing for
cloud services does not yet obey precise and deterministic rules, and many
companies cry foul when they are charged an extortionate price. Cloud
computing has certainly turned the practices of business upside down, and IT
professionals need to review their careers. However, some still do not dare to
step into this architecture, and they claim that more research effort is
needed to fill the gaps in the cloud. This paper aims to give a better
description of cloud computing features and to explore guidelines for research
and high-tech tendencies in using and implementing cloud infrastructure
networks. The chief concepts of cloud computing billing and security threats
will also be exposed. |
Keywords: |
Cloud Computing, Large Scale, Focus, Billing, Threat, Security |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
|
Title: |
A FIVE-FACTOR SOFTWARE ARCHITECTURE ANALYSIS BASED ON FAR FOR ATM BANKING SYSTEM |
Author: |
T.K.S. RATHISH BABU, DR.N.SANKARRAM |
Abstract: |
Software architecture represents the high-level structures of a software
system. It can be defined as the set of structures required to describe the
software system, comprising the software elements, the relations between them,
and the properties of both the elements and the relations. A major challenge
in the software architecture design process is the accurate prediction and
improvement of software performance characteristics such as outage frequency
and duration. This paper proposes a hybrid software architecture for an ATM
banking system to overcome the difficulties of an existing architecture. The
proposed system is based on Fuzzy Association Rules (FAR). During the
extraction of the FAR, a confidence index and the AprioriGen algorithm are
utilized to compute the inverse fuzzy transform. This paper also presents a
review of some Software Architecture Analysis Methods (SAAM). The performance
of the proposed methodology is analyzed based on metrics such as reliability,
flexibility, adaptability, and security, and the proposed software
architecture is compared with various existing software architectures. The
implementation results show that the proposed methodology performs better than
the other existing software architectures. |
Keywords: |
Adaptability, Apriori Algorithm, Flexibility, Fuzzy Association Rule
(FAR), Software Architecture Analysis Methods (SAAM), and Security. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
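The crisp (non-fuzzy) notions of support and confidence that underlie
Apriori-style rule mining can be sketched as follows; the fuzzy extension and
the AprioriGen details are in the paper itself, and the transactions below are
invented:

```python
def support(itemset, transactions):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """Confidence of the rule antecedent -> consequent."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

tx = [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b", "c"}]  # toy transactions
conf_ab = confidence({"a"}, {"b"}, tx)  # 2 of the 3 transactions with "a" also have "b"
```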
|
Title: |
VIDEO REPLICA PLACEMENT STRATEGY FOR STORAGE CLOUD-BASED CDN |
Author: |
SHIJIA YAO, WENLE ZHOU, HAOMIN CUI AND MING ZHU |
Abstract: |
Online video services need the support of CDNs (Content Delivery
Networks). Compared with traditional CDNs, a CDN built on cloud-based storage
nodes can deliver video content at much lower cost. To guarantee end users'
QoS, the CDN should pre-deploy the content files of the online video service
to edge nodes close to the users. Existing research has shown that the cost of
building a CDN from cloud storage nodes is much less than that of using
traditional CDNs. The existing offline replica placement algorithm named GS
(Greedy Site) can meet the QoS requirement at relatively small cost when
information about users' requests is provided. However, GS results in poor
load balance and requires the users' request information. In this paper, two
classes of offline algorithms are proposed. One, named GUCP (Greedy User Core
Preallocation), effectively solves the load imbalance problem caused by GS;
the other, named PBP (Popularity Based Placement), places replicas effectively
based on content popularity when no users' request information is available.
Numerical experiments demonstrate the effectiveness of these algorithms. |
Keywords: |
Cloud Storage, CDN, Replica Placement, Load Balance, QoS |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
|
Title: |
AN ALL-INCLUSIVE REVIEW ON VARIOUS TECHNIQUES OF WEB USAGE MINING |
Author: |
P. SENTHIL PANDIAN, Dr.S. SRINIVASAN |
Abstract: |
Numerous users rely on the World Wide Web (WWW) as their default resource
for obtaining knowledge, and many organizations need to understand their
customers' preferences, behavior, and future needs to improve their business.
Web usage mining is a part of web mining and an active research topic. Its
main goal is to find, model, and analyze the behavioral patterns of users. The
captured patterns are represented as a collection of objects, or pages, that
are frequently used or accessed by a set of users with a common interest. The
primary advantages are the extraction of segmented data from server logs and
the discovery of desired usage patterns from the web. The importance of
application-level data can be clearly distinguished from web server data via
web usage mining. Numerous outstanding techniques have been developed to
improve the extraction process. This paper presents a survey of recent
methodologies in the field of web usage mining. |
Keywords: |
Data Mining, Web Usage Mining, Behavior Pattern, Preprocessing And
Extraction Technique |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
|
Title: |
TOPIC BASED QUERY SUGGESTION USING HIDDEN TOPIC MODEL FOR EFFECTIVE WEB SEARCH |
Author: |
M.BARATHI, S.VALLI |
Abstract: |
Keyword-based web search is widely used for locating information on the
web. But web users often lack sufficient domain knowledge and find it
difficult to organize and formulate input queries, which affects search
performance. Existing methods suggest terms using statistics from the
documents, query logs, and external dictionaries. The novel query suggestion
method presented here suggests terms related to the topics present in the
input query and re-ranks the retrieved documents. A generative model, Latent
Dirichlet Allocation (LDA), is used to learn the topics from the underlying
documents. The high-probability words in a topic are selected using the
Kullback-Leibler (KL) divergence measure and presented to the user as
suggestions, to enrich the user query and narrow the search. The re-ranking
technique of this approach uses the initial retrieval position of each
document to re-rank the documents. The queries suggested by the hidden topic
approach and by keyword search are analysed. |
Keywords: |
Latent Dirichlet Allocation, Kullback-Leibler Divergence, Query
Suggestion, Web Search |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
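The KL divergence used above for word selection compares two discrete
distributions P and Q. A minimal sketch in natural-log units (nats), with
invented toy distributions:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) in nats; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A topic's word distribution vs. a background distribution (toy numbers)
d = kl_divergence([0.5, 0.5], [0.9, 0.1])
```

Words whose topic probability diverges most from the background are natural
candidates for suggestion.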
|
Title: |
FUZZY LOGIC MPPT CONTROLLER WITH ENERGY MANAGEMENT SYSTEM FOR
SOLAR-WIND-BATTERY-DIESEL HYBRID POWER SYSTEM |
Author: |
K.SEKAR, V.DURAISAMY |
Abstract: |
Effective utilization of power is even more important than generation of
power, because power scarcity is a major problem in India at present. It leads
many industries to rely on diesel generators, resulting in pollution and
demand for fossil fuel, so many industries and governments are now turning to
renewable energy. A wind-solar hybrid power system plays a crucial role in
renewable power today because it combines solar energy with wind energy to
create a stand-alone energy source that is both dependable and consistent.
This paper proposes an effective energy management controller for a solar-wind
hybrid renewable power system for the telecommunication industry. In power
systems, apart from power generation, managing power without wastage is
imperative. This paper proposes a Fuzzy Logic Controller based Effective
Energy Management Controller that consistently monitors the power from all
sources and the load demand and controls the whole hybrid power system; the
fuzzy logic controller makes an accurate selection of sources at the right
time. Fuzzy Logic Maximum Power Point Tracking is proposed for the solar and
wind power systems to provide a constant voltage with the help of a DC-DC
Single-Ended Primary-Inductance Converter. In the current era, a single day
without telecommunication devices is unimaginable. The main objective of this
paper is to supply uninterruptible power to telecommunication loads from a
standalone solar-wind-diesel hybrid power system with an efficient energy
storage system. It provides uninterrupted power, effective utilization of
sources, improved battery lifetime, and minimized diesel usage. The whole
system is analyzed using MATLAB / Simulink. |
Keywords: |
Hybrid Power System (HPS), Fuzzy Logic Controller (FLC), Maximum Power
Point Tracking (MPPT), Single-Ended Primary-Inductance Converter (SEPIC),
Efficient Energy Management Controller (EEMC) and Effective Energy Storage
System (EESS) |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
|
Title: |
A VIDEO TRANSCODING USING SPATIAL RESOLUTION FILTER INTRA FRAME METHOD IN
MULTIMEDIA NETWORKS |
Author: |
S.VETRIVEL, DR.G.ATHISHA |
Abstract: |
Video transcoding is the process of converting one form of video into
another. It provides fine, dynamic adjustment of the bit rate of a video bit
stream in the compressed domain without imposing additional functions on the
decoder. H.264 is a successful video coding technique addressing a large range
of applications, bit rates, resolutions, qualities, and services.
Downsampling is a technique used in H.264 to reduce the sampling rate of a
signal, and a spatial resolution filter is an optical device that uses the
principles of Fourier optics to alter the structure of coherent light or other
electromagnetic radiation; it is also one of the techniques of the spatial
domain. Intra-frames are used in video coding (compression) of the video
sequence, and H.264/AVC intra-frame decoding and encoding contain a set of
computation-intensive coding tools forming a loop in which the data are
strongly dependent. We found that the quality of motion pictures degrades in
H.264, so we propose a Spatial Resolution Filter to improve the quality. Our
experimental results show that high computation-time savings can be achieved
with only negligible quality degradation. |
Keywords: |
Transcoder, H.264, Down sampling, Spatial Resolution Filter, Intra frame,
Motion refinement |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
|
Title: |
SYSTEMATIC LITERATURE REVIEW (SLR) AUTOMATION: A SYSTEMATIC LITERATURE REVIEW |
Author: |
ZUHAL HAMAD, NAOMIE SALIM |
Abstract: |
Context: A systematic literature review (SLR) is a methodology used to find
and aggregate all relevant studies about a specific research question or topic
of interest. Most SLR processes are conducted manually; automating these
processes can reduce the workload and the time consumed.
Method: We use an SLR as the methodology to survey the literature on the
technologies used to automate SLR processes.
Result: From the collected data we found much work on automating the study
selection process, but no evidence of automation of the planning and reporting
processes. Most authors use machine learning classifiers to automate the study
selection process. Our survey also found processes similar to the SLR process
for which automatic techniques already exist.
Conclusion: Based on these results, we conclude that more research should be
done on the planning, reporting, data extraction, and synthesizing processes
of SLR. |
Keywords: |
SLR, Automation, Planning, Reporting, Data Extraction, Synthesizing |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
|
Title: |
QOS-BASED RANKING MODEL FOR WEB SERVICE SELECTION CONSIDERING USER REQUIREMENTS |
Author: |
G. VADIVELOU, E. ILAVARASAN |
Abstract: |
Web services are widely used in business application development to
achieve interoperability among standalone systems. Efficient and effective
techniques are required to find and select the required services among similar
ones, which is an important task in service-oriented computing. The ranking
process, part of a Web service discovery system, helps users find the desired
services effectively. Existing research contributions to the ranking process
do not consider the user's requirements, which are an important factor in
ranking web services. In this work, a vector-based ranking method is enhanced
to consider the user's requirements; the vector-based model is selected for
its simplicity and high efficiency. The web services are evaluated on the
basis of their degree of similarity to the optimal or best values of various
quality attributes. Experiments conducted with a real dataset compare the
improved algorithm with other approaches, and the enhanced vector-based
ranking method is found to be efficient in terms of the execution time needed
to return the result set. |
Keywords: |
SOA, Web Services Selection, Web Services Ranking, Vector-Based Ranking,
QoS |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
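A vector-based QoS ranking of the kind described can be sketched by
normalizing each attribute against the best value across candidates and
taking a weighted sum; the service names, attributes, and weights below are
invented for illustration, not taken from the paper:

```python
def rank_services(services, weights, cost_attrs):
    """Score each service by weighted similarity to the best value per attribute."""
    best = {}
    for a in weights:
        vals = [s[a] for s in services.values()]
        best[a] = min(vals) if a in cost_attrs else max(vals)
    scores = {
        name: sum(
            w * (best[a] / s[a] if a in cost_attrs else s[a] / best[a])
            for a, w in weights.items()
        )
        for name, s in services.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

# Response time is a cost attribute (lower is better), availability a benefit
services = {"S1": {"rt_ms": 100, "avail": 0.99},
            "S2": {"rt_ms": 200, "avail": 0.95}}
ranking = rank_services(services, weights={"rt_ms": 0.5, "avail": 0.5},
                        cost_attrs={"rt_ms"})
```

User requirements enter such a scheme through the weight vector, which is the
aspect the paper's enhancement addresses.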
|
Title: |
AN EFFICIENT CLUSTER BASED ROUTING IN WIRELESS SENSOR NETWORKS |
Author: |
K.E.KANNAMMAL, T.PURUSOTHAMAN, M.S.MANJUSHA |
Abstract: |
A Wireless Sensor Network (WSN) consists of hundreds or thousands of
sensor nodes with limited energy, computation, and memory resources. These
sensors are randomly deployed in a specific area to collect useful information
periodically for a few months or even a few years. The use of WSNs in some
extreme environments makes sensor nodes difficult to replace once their
battery lifetime expires. Since wireless transmission is the most
energy-consuming operation, designing an energy-efficient routing protocol
becomes the main goal for a wireless sensor network. LEACH is considered the
most popular routing protocol, with good performance in reducing energy
consumption. However, its cluster head selection formula neglects changes in
node energy, so some nodes act as cluster heads too many times and die early
from consuming too much energy. This paper presents a K-means approach for
clustering and a two-level fuzzy logic approach to Cluster Head (CH) election
based on four parameters: number of neighbor nodes, remaining energy, energy
dispersion, and distance from the base station. Comparison simulations in
Matlab show that the proposed method prolongs the network lifetime relative to
the LEACH protocol. |
Keywords: |
Wireless Sensor Networks, Clustering, LEACH, K means, Fuzzy Logic |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
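The clustering stage can be illustrated with a bare-bones Lloyd's K-means over
node coordinates; the fuzzy CH election stage is omitted, and the node
positions and initial centroids below are invented:

```python
import math

def kmeans(points, centroids, iters=10):
    """Plain Lloyd's K-means: assign points to the nearest centroid, then recenter."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        centroids = [tuple(sum(xs) / len(cl) for xs in zip(*cl)) if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

# Two spatially separated groups of sensor nodes
nodes = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (10, 11)]
cents, groups = kmeans(nodes, centroids=[(0, 0), (10, 10)])
```

In a LEACH-style protocol each resulting cluster would then elect a CH, here
via the paper's two-level fuzzy criteria.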
|
Title: |
ELECTRONIC POWER OF ATTORNEY PROTOCOL BY USING DIGITAL SIGNATURE ALGORITHM |
Author: |
WALIDATUSH SHOLIHAH, SUGI GURITMAN, HERU SUKOCO |
Abstract: |
In this paper, we focus on creating an electronic power of attorney
protocol. A power of attorney is a letter that authorizes the holder to carry
out specified powers given by the grantor. Generally, a power of attorney
consists of the identities of the giver and the holder, the contents of the
letter, and the signatures of the parties. Traditionally the signature is made
with a tool such as a pen; such a power of attorney is less efficient in time,
resources, and security. A conventional signature (made with a pen or other
stationery) is easily faked and is verified with the naked eye. These
weaknesses can be overcome with an electronic power of attorney, which uses a
digital signature for both the giver and the holder. The signature scheme used
is the Digital Signature Algorithm (DSA). A power of attorney on paper can
thus be replaced with an electronic power of attorney, with the handwritten
signature replaced by a digital signature. The electronic power of attorney is
created using the DSA algorithm according to the power of attorney rules
prevailing in Indonesia, and the protocols are designed to meet the security
criteria of a power of attorney. |
Keywords: |
Digital Signature Algorithm, Electronic, Power of Attorney, Protocol,
Signature |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
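DSA signing and verification can be sketched with toy parameters. The tiny
primes below (p = 607, q = 101, so q divides p - 1) are for illustration only,
real deployments use FIPS 186 parameter sizes, and the message text is
invented:

```python
import hashlib
import random

# Toy domain parameters: 607 = 6 * 101 + 1, and g = 2**6 mod p has order q.
P, Q, G = 607, 101, 64

def _digest(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % Q

def keygen():
    x = random.randrange(1, Q)      # private key
    return x, pow(G, x, P)          # (private, public)

def sign(msg: bytes, x: int):
    while True:
        k = random.randrange(1, Q)  # per-signature secret nonce
        r = pow(G, k, P) % Q
        if r == 0:
            continue
        s = pow(k, -1, Q) * (_digest(msg) + x * r) % Q
        if s != 0:
            return r, s

def verify(msg: bytes, sig, y: int) -> bool:
    r, s = sig
    if not (0 < r < Q and 0 < s < Q):
        return False
    w = pow(s, -1, Q)
    u1, u2 = _digest(msg) * w % Q, r * w % Q
    return pow(G, u1, P) * pow(y, u2, P) % P % Q == r

x, y = keygen()
msg = b"grantor A authorizes holder B"   # invented sample content
signature = sign(msg, x)
```

In the protocol described above, both the grantor and the holder would attach
such signatures to the letter's contents.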
|
Title: |
FPGA BASED ADAPTIVE RESOURCE EFFICIENT ERROR CONTROL METHODOLOGY FOR NETWORK ON
CHIP |
Author: |
M.DEIVAKANI, D.SHANTHI |
Abstract: |
This research work proposes a resource-efficient and secured
network-on-chip router using error control schemes. The proposed method
combines cipher-block-encryption-based parallel crossbar methodologies of the
NoC data link and network layers to efficiently provide error control strength
under variable network topology conditions. The proposed method significantly
reduces hardware utilization compared to previous works. This is achieved by
implementing a parallel crossbar architecture with a cipher-block-based ECC
coding method in the NoC. The proposed system uses Modelsim software for
simulation and Xilinx Project Navigator for synthesis. |
Keywords: |
Error Control, Cipher, Data Link, Residual Packet And Interleaving. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
|
Title: |
SEMANTIC ENRICHMENT OF QUERIES WITH GENERIC AND SPECIFIC TERMS IN THE DEFINITION
SENTENCES |
Author: |
MOHAMED RACHDI, EL HABIB BEN LAHMAR, EL HOUSSINE LABRIJI |
Abstract: |
Increasing the relevance of the results of search tools remains a real
challenge for users and researchers in the field of information retrieval. To
address this problem, several studies have been conducted, mainly on methods
and techniques that focus on the treatment and reformulation of queries to
meet the needs of users. The purpose of this paper is to contribute to the
improvement of query reformulation approaches through semantic enrichment
based on definition sentences.
Definitions and terminology have several features and components that can be
used in information retrieval, especially in the enrichment of queries. The
essential components of definition sentences are generic and specific terms.
Our approach consists of exploiting definition sentences by making use of
their generic and specific terms. |
Keywords: |
Semantic Enrichment, Relevance, Queries, Generic, Specific |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59. No. 3 -- 2014 |
Full
Text |
|
Title: |
PROBABILISTIC KNOWLEDGE BASE SYSTEM FOR FORENSIC EVIDENCE ANALYSIS |
Author: |
NOOR MAIZURA MOHAMAD NOOR, SALWANA MOHAMAD @ ASMARA, MD YAZID MOHD SAMAN,
MUHAMMAD SUZURI HITAM |
Abstract: |
Crime investigation is a complex task involving huge amounts of information and
requiring many different types of expert knowledge. To improve policies and
develop effective crime investigation strategies, it is important to understand
the processes behind crime. Moreover, uncertainty is a common problem in crime
investigation, and uncertain knowledge in forensic investigation greatly
affects the entire decision-making process. This paper discusses representing
and calculating probabilistic knowledge for forensic evidence analysis to
assist crime scene investigation. To facilitate forensic evidence analysis, a
knowledge base system and Bayesian networks have been developed, and the
process of calculating the probabilities for forensic evidence analysis is
described. The obtained results show that the developed knowledge base can
support decision making under uncertain knowledge in forensic evidence
analysis. |
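The core probabilistic step behind such a Bayesian-network node can be shown with a single application of Bayes' rule (the numbers below are illustrative, not taken from the paper):

```python
# Toy Bayesian update: probability a suspect was present at the scene
# given a fingerprint match (illustrative priors and likelihoods).
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)."""
    p_e = p_evidence_given_h * prior + p_evidence_given_not_h * (1 - prior)
    return p_evidence_given_h * prior / p_e

p = posterior(prior=0.01, p_evidence_given_h=0.95, p_evidence_given_not_h=0.001)
# the match raises P(present) from 1% to roughly 90%
```

A full Bayesian network chains many such conditional tables; this isolates the single-evidence update that the paper's probability calculations are built from.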
Keywords: |
Knowledge Base System, Probabilistic, Crime Investigation, Forensic Evidence
Analysis |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59, No. 3 |
Full Text |
|
Title: |
COMPARING FUZZY LOGIC AND FUZZY C-MEANS (FCM) ON SUMMARIZING INDONESIAN LANGUAGE
DOCUMENT |
Author: |
DWI ADE RIANDAYANI, I KETUT GEDE DARMA PUTRA, PUTU WIRA BUANA |
Abstract: |
Text summarization is an effective way to obtain the information in a text.
Summarization techniques are classified into two approaches, abstractive and
extractive. This paper focuses on extractive techniques, so the output of the
application consists of important sentences taken verbatim from the original
text without modification. There are four major processes in this research,
namely the preprocessing stage, feature scoring, optimization of the summary by
two methods (Fuzzy Logic and Fuzzy C-Means), and extraction of the summary
results. This research uses 7 features to calculate the score of each sentence.
The core of this research is to compare the reliability of the Fuzzy Logic and
Fuzzy C-Means methods in optimizing the summary results. The results show that
the Fuzzy Logic method outperforms the Fuzzy C-Means method, achieving the
closest similarity to manual human summaries: the accuracy of Fuzzy Logic is
58.25% while that of Fuzzy C-Means is 54.33%. Accuracy increases when the
compression rate is raised, by 1.67% for the Fuzzy Logic method and 4.5% for
Fuzzy C-Means. |
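The Fuzzy C-Means step can be sketched on one-dimensional sentence scores (a minimal sketch under the assumption of two clusters, summary-worthy vs. not; `fcm_1d` and the sample scores are hypothetical, not the paper's seven-feature pipeline):

```python
def fcm_1d(xs, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means over per-sentence scores."""
    centers = [min(xs), max(xs)]  # crude initialisation: low and high extremes
    for _ in range(iters):
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = [[0.0] * len(xs) for _ in range(c)]
        for k, x in enumerate(xs):
            d = [abs(x - v) or 1e-9 for v in centers]  # avoid division by zero
            for i in range(c):
                u[i][k] = 1.0 / sum((d[i] / dj) ** (2 / (m - 1)) for dj in d)
        # centre update: v_i = sum_k u_ik^m x_k / sum_k u_ik^m
        centers = [sum(u[i][k] ** m * x for k, x in enumerate(xs)) /
                   sum(u[i][k] ** m for k in range(len(xs)))
                   for i in range(c)]
    return centers, u

scores = [0.1, 0.15, 0.2, 0.7, 0.8, 0.9]   # hypothetical per-sentence scores
centers, u = fcm_1d(scores)
# sentences with high membership in the high-score cluster form the summary
summary_idx = [k for k in range(len(scores)) if u[1][k] > 0.5]
```

In the paper the clustering runs over seven features per sentence rather than a single score; the update equations are the same, applied per dimension.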
Keywords: |
Summarization, Text Features, Scores Sentences, Fuzzy Logic, Fuzzy C–Means |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59, No. 3 |
Full Text |
|
Title: |
NOVEL OPTIMIZATION TECHNIQUE FOR CLASSIFICATION OF REMOTE SENSING DATA USING SVM |
Author: |
SAKTHI. G, R. NEDUNCHEZHIAN |
Abstract: |
Remote sensing is the collection of images and the interpretation of
information about an object, area, or event without any physical contact with
it. Aircraft and satellites are common remote sensing platforms for observing
the earth and its natural resources. Remote sensing's ability to identify and
monitor land surfaces and environmental conditions has expanded over the years,
with remotely sensed data becoming essential in natural resource management.
Machine learning is used for classification of remotely sensed images. This
study uses a Support Vector Machine (SVM) with a Radial Basis Function (RBF)
kernel for classifying Remote Sensing (RS) images; the RBF kernel improves
classification accuracy. This study proposes optimizing the SVM-RBF with Cuckoo
Search (CS) for remote sensing data classification. |
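The two ingredients can be sketched together: the RBF kernel itself, and a cuckoo-search-flavoured tuning loop for its gamma parameter (a simplified sketch; `tune_gamma`, the stand-in quadratic fitness, and the crude Lévy-like step are assumptions in place of true Lévy flights and cross-validated SVM accuracy):

```python
import math, random

def rbf(x, z, gamma):
    """RBF kernel K(x, z) = exp(-gamma * ||x - z||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def tune_gamma(fitness, nests=10, iters=30, seed=1):
    """Cuckoo-search-style sketch: random 'nests' propose gamma values,
    each nest keeps a candidate only if its fitness improves."""
    rng = random.Random(seed)
    best_g, best_f = None, float("-inf")
    gammas = [rng.uniform(0.01, 10.0) for _ in range(nests)]
    for _ in range(iters):
        for i, g in enumerate(gammas):
            step = rng.gauss(0, 1) / abs(rng.gauss(0, 1)) ** 0.5  # crude heavy-tailed step
            cand = min(10.0, max(0.01, g + 0.1 * step))
            if fitness(cand) > fitness(g):
                gammas[i] = cand
        best = max(gammas, key=fitness)
        if fitness(best) > best_f:
            best_g, best_f = best, fitness(best)
    return best_g

# stand-in fitness peaking at gamma = 2; a real run would score
# cross-validated classification accuracy on the RS images instead
g = tune_gamma(lambda g: -(g - 2.0) ** 2)
```

The paper's contribution is precisely replacing the fitness stand-in with RS classification accuracy, so the search selects kernel parameters for the data at hand.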
Keywords: |
Remote Sensing (RS) images, Salinas’s dataset, Support Vector Machine (SVM),
Cuckoo Search (CS), Kernel Optimization |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59, No. 3 |
Full Text |
|
Title: |
CRITICAL REVIEW OF THE EFFORT RATE VALUE IN USE CASE POINT METHOD FOR ESTIMATING
SOFTWARE DEVELOPMENT EFFORT |
Author: |
APOL PRIBADI SUBRIADI, SHOLIQ, PUJI AGUSTIN NINGRUM |
Abstract: |
Effort estimation predicts the amount of work by staff as well as how long it
takes to accomplish a software development project. This estimate is very
important for knowing the relevant value of the software produced. One common
method used to calculate the estimated effort is the Use Case Point (UCP)
method.
This research aims to review the UCP method, proposed by Karner in 1993, which
is based on only three software development project data points. Until now,
most researchers have still referred to the Effort Rate (ER) value proposed by
Karner without questioning its relevance. In the UCP method, the estimated
effort is obtained by multiplying the UCP value by the ER value, where the ER
value is the ratio of staff-hours required to accomplish each UCP. Karner's
proposed ER value was 20 staff-hours.
The final result of this research is an ER value of 8.2, much smaller than the
value proposed by Karner. This is possibly due to several reasons: 1) the
existence of software engineering methods, 2) more advanced software
engineering technologies, 3) component-based software, and 4) the availability
of source code on the internet. |
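The effect of the revised ER value is a direct multiplication (the project size of 100 UCP below is hypothetical; the two ER values are the ones discussed in the abstract):

```python
# Effort (staff-hours) = UCP x ER.
def estimated_effort(ucp, effort_rate):
    return ucp * effort_rate

ucp = 100  # hypothetical project size in Use Case Points
karner = estimated_effort(ucp, 20.0)   # Karner's 1993 ER
revised = estimated_effort(ucp, 8.2)   # this paper's revised ER
# the same project is estimated at 2000 vs. 820 staff-hours
```

The gap illustrates why the paper questions reusing a two-decade-old ER value unchanged.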
Keywords: |
Effort Rate, Use Case Point, Software Development Effort |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59, No. 3 |
Full Text |
|
Title: |
MULTI-OBJECTIVE ARTIFICIAL BEE COLONY (MOABC) ALGORITHM TO IMPROVE CONTENT-BASED
IMAGE RETRIEVAL PERFORMANCE |
Author: |
ANIL KUMAR MISHRA, Dr. MADHABANANDA DAS, Dr. T.C. PANDA |
Abstract: |
Multi-objective optimization has been a difficult area and a focus of research
in image processing. This paper presents an optimization algorithm based on the
artificial bee colony (ABC) to deal with multi-objective optimization problems
in CBIR. We introduce a multi-objective ABC algorithm based on intelligent
foraging behaviour for content-based images. It uses fewer control parameters
and can be used efficiently for solving multi-objective optimization problems.
In the current work, MOABC for discrete variables has been developed and
implemented successfully for the multi-objective design optimization of
composites. The proposed algorithm is corroborated using standard test
problems, and simulation results show that the proposed approach is highly
competitive and can be considered a viable alternative for solving
multi-objective optimization problems. Finally, the performance is evaluated in
comparison with other nature-inspired techniques, including Multi-objective
Particle Swarm Optimization (MOPSO) and the Multi-objective Genetic Algorithm
(MOGA). The performance of MOABC is on par with or better than that of MOPSO,
MOGA and ABC for all the loading configurations. |
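All of these multi-objective methods rank candidate solutions by Pareto dominance, which can be stated in a few lines (a generic sketch for minimisation, not MOABC's full update rules; the sample objective vectors are made up):

```python
# a dominates b if it is no worse in every objective
# and strictly better in at least one (minimisation).
def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

solutions = [(0.2, 0.9), (0.4, 0.4), (0.9, 0.1), (0.5, 0.5)]
front = []
for s in solutions:
    if not any(dominates(o, s) for o in solutions):
        front.append(s)   # non-dominated trade-offs survive
```

Here `(0.5, 0.5)` is dominated by `(0.4, 0.4)` and drops out; the remaining three form the Pareto front that MOABC, MOPSO and MOGA all try to approximate.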
Keywords: |
Multi-objective optimization, Structural optimization, Artificial Bee Colony
(ABC), Feature Extraction. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59, No. 3 |
Full Text |
|
Title: |
AUTOMATIC FACE RECOGNITION APPROACHES |
Author: |
LADISLAV LENC, PAVEL KRÁL |
Abstract: |
This paper deals with automatic face recognition, the use of a computer to
automatically identify a person from a digital image or a video frame. This
field has been intensively studied in the last two decades. Compared with other
biometric methods, automatic face recognition seems to be one of the most
important. It is therefore used in many applications, for example access
control to restricted areas, surveillance of persons, various programs for
sharing and labelling photographs, social networks and many others. The main
goal of this paper is to review the most important face recognition approaches
together with their theoretical and practical advantages and drawbacks. We
further evaluate and compare these approaches with each other. We conclude that
it is generally not possible to identify a single best-performing face
recognition approach and that the choice of the optimal method is strictly
related to the target application. We expect that future research directions
will address the main issue of current approaches: insufficient recognition
accuracy in totally uncontrolled environments. |
Keywords: |
Approaches Comparison, Face Database, Face Recognition, Personal Identification,
Review |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59, No. 3 |
Full Text |
|
Title: |
STATISTICAL BASED OUTLIER DETECTION IN DATA AGGREGATION FOR WIRELESS SENSOR
NETWORKS |
Author: |
U.BARAKKATH NISHA, N.UMAMAHESWARI, R.VENKATESH, R.YASIR ABDULLAH |
Abstract: |
Inconsistent data caused by compromised nodes in Wireless Sensor Networks can
be detected to improve data reliability and accuracy and to enable effective
and correct decisions. Multivariate outliers normally describe abnormal data
behavior. Data aggregation is frequently used to reduce the communication
overhead and energy expenditure of sensor nodes during data collection in
Wireless Sensor Networks and to improve the lifetime of the WSN. To deliver
accurate data to the base station, an outlier detection protocol must be
incorporated with secure data aggregation; aggregation also helps to increase
the circle of knowledge and the level of accuracy. In this paper we use a
multivariate data analysis technique to handle outliers in correlated
variables. To achieve reliability and accuracy, a two-phase algorithm is
proposed: first, a well-conditioned PCA model is built for fault detection;
second, various statistical techniques are used to determine the similarity
between the sensed data and the real data set. We have evaluated our algorithm
on synthetic data and on real data collected from a WSN, both injected with
synthetic faults. Our results show that the proposed algorithm achieves a high
true alarm rate and a low false alarm rate and outperforms all existing
methods in terms of data accuracy and reliability. |
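The Mahalanobis distance named in the keywords is the workhorse of such multivariate checks; a two-variable sketch shows why it catches outliers that per-variable thresholds miss (the readings below are hypothetical temperature/humidity pairs, and the hand-inverted 2x2 covariance is a simplification of the paper's well-conditioned PCA model):

```python
# Mahalanobis distance of a reading from a set of correlated
# 2-D sensor readings; a large MD flags a multivariate outlier
# even when each variable looks plausible on its own.
def mahalanobis_2d(x, data):
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    sxx = sum((p[0] - mx) ** 2 for p in data) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in data) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in data) / (n - 1)
    det = sxx * syy - sxy ** 2
    dx, dy = x[0] - mx, x[1] - my
    # MD^2 = d^T S^{-1} d, with the 2x2 covariance inverted by hand
    md2 = (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
    return md2 ** 0.5

readings = [(20.0, 50.0), (21.0, 53.0), (22.0, 54.0), (23.0, 55.0), (24.0, 58.0)]
outlier = (22.0, 40.0)   # plausible marginals, but the correlation is broken
```

`(22.0, 40.0)` sits at the mean of the temperatures yet far from the humidity trend, so its Mahalanobis distance is an order of magnitude larger than an in-trend reading's.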
Keywords: |
Wireless Sensor Network (WSN), Aggregation, Multivariate Outlier, Well
Conditioned PCA, Mahalanobis Distance (MD), Minimum Volume Ellipsoid (MVE),
Minimum Covariance Determinant (MCD), Minimum Generalized Variance (MGV) |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59, No. 3 |
Full Text |
|
Title: |
PERFORMANCE ANALYSIS OF WAVELET & BLUR INVARIANTS FOR CLASSIFICATION OF AFFINE
AND BLURRY IMAGES |
Author: |
AJAY KUMAR SINGH, V P SHUKLA, S R BIRADAR, SHAMIK TIWARI |
Abstract: |
Image degradation occurs during acquisition for many reasons, for example low
illumination, noise and occlusion. Geometric distortions and radiometric
degradations are also among the widespread difficulties in computer vision.
This paper presents a system to classify multi-class images deformed by a
geometric transform, blur contamination or a combination of both. Different
blur- and affine-invariant moment descriptors in the spatial domain, which are
invariant to centrally symmetric blurs, are used to tackle this problem. The
performance of the proposed system is analyzed in contrast to a wavelet-feature
based system and demonstrated through various experiments. Experimental results
show that the method is effective and computationally inexpensive and can be
applied to images containing several geometric and blur degradations in the
same image. |
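Moment invariants of this kind are all built from central image moments, which can be computed in a few lines (a generic sketch with a tiny synthetic 3x3 "image", not the paper's specific blur/affine invariant formulas):

```python
# Central moment mu_pq of a 2-D intensity image (list of rows):
# mu_pq = sum over pixels of (x - xbar)^p (y - ybar)^q I(x, y),
# where (xbar, ybar) is the intensity centroid.
def central_moment(img, p, q):
    m00 = sum(sum(row) for row in img)
    xbar = sum(x * v for y, row in enumerate(img) for x, v in enumerate(row)) / m00
    ybar = sum(y * v for y, row in enumerate(img) for x, v in enumerate(row)) / m00
    return sum((x - xbar) ** p * (y - ybar) ** q * v
               for y, row in enumerate(img) for x, v in enumerate(row))

img = [[0, 1, 0],
       [1, 4, 1],
       [0, 1, 0]]
mu00 = central_moment(img, 0, 0)   # total intensity mass
mu11 = central_moment(img, 1, 1)   # vanishes for this symmetric blob
```

Centering on the centroid makes the moments translation-invariant; the blur and affine invariants in the paper are then specific polynomial combinations of such mu_pq values.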
Keywords: |
Blur invariant moment; Neural Network; Gaussian Blur; Affine Transform;
Multiclass Classification |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59, No. 3 |
Full Text |
|
Title: |
A NEW HYBRID APPROACH FOR PREDICTION OF MOVING VEHICLE LOCATION USING PARTICLE
SWARM OPTIMIZATION AND NEURAL NETWORK |
Author: |
BABY ANITHA. E, Dr. K. DURAISWAMY |
Abstract: |
The modeling of moving objects attracts a great deal of research interest, and
moving objects have developed into a specific research area of Geographic
Information Systems (GIS). Vehicle location prediction based on spatial and
temporal mining is an important task in many applications. Several types of
techniques have been used to perform vehicle movement prediction, but such
works lack analysis of predicting the vehicle's location both now and in the
future. We previously presented an algorithm for finding the optimal path of a
moving vehicle using a Genetic Algorithm (GA). In that technique there is no
complete assurance that the genetic algorithm will find an optimum path, and
the method still needs improvement in optimal path selection because its
fitness function is restricted in predicting complex paths. To avoid this
problem, a new moving vehicle location prediction algorithm is proposed in this
paper. The proposed algorithm mainly comprises two techniques, namely the
Particle Swarm Optimization (PSO) algorithm and a Feed Forward Back Propagation
Neural Network (FFBNN). In the proposed technique, the vehicles' frequent paths
are collected by observing all vehicle movements over a specific time period.
Among the frequent paths, each vehicle's optimal paths are calculated by the
PSO algorithm, and the selected optimal paths are used to train the FFBNN. The
well-trained FFBNN is then utilized to predict the vehicle's movement from its
current location. By combining PSO and the FFBNN, the vehicle's location is
predicted more efficiently. The implementation results show the strength of the
proposed algorithm in predicting a vehicle's future location from its current
location. The performance of the new algorithm is evaluated by comparing its
results with those of the GA with FFBNN; the comparison demonstrates that the
proposed technique achieves a more accurate vehicle location prediction ratio
than the GA with FFBNN. |
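The PSO stage can be sketched in minimal form (a one-dimensional toy with a stand-in quadratic "path cost"; the real optimizer would score candidate paths and feed the winners to the FFBNN, and all names and coefficients below are illustrative assumptions):

```python
import random

# Minimal particle swarm optimization: particles search for the
# parameter value that minimises a cost function.
def pso(cost, n=15, iters=60, lo=-10.0, hi=10.0, seed=0):
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n)]
    vel = [0.0] * n
    pbest = pos[:]                      # each particle's personal best
    gbest = min(pos, key=cost)          # swarm's global best
    for _ in range(iters):
        for i in range(n):
            # velocity = inertia + cognitive pull + social pull
            vel[i] = (0.7 * vel[i]
                      + 1.5 * rng.random() * (pbest[i] - pos[i])
                      + 1.5 * rng.random() * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i]
        gbest = min(pbest, key=cost)
    return gbest

# toy quadratic "path cost" with its minimum at 3
best = pso(lambda x: (x - 3.0) ** 2)
```

Unlike a GA, PSO needs no crossover or mutation operators, only the two attraction terms, which is part of its appeal for the path-selection step.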
Keywords: |
Moving Vehicle Location Prediction, Particle Swarm Optimization Algorithm, Feed
Forward Back Propagation Neural Network, Frequent Paths, Genetic Algorithm,
Optimal Path. |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59, No. 3 |
Full Text |
|
Title: |
EFFICIENT TECHNIQUE TOWARDS THE AVOIDANCE OF REPLAY ATTACK USING LOW DISTORTION
TRANSFORMS |
Author: |
M.MANJU, Dr.V.KAVITHA |
Abstract: |
The huge number of fingerprints collected by the Federal Bureau of
Investigation (FBI) has created an enormous problem in storage and
transmission. The increase in online communication and transactions has led to
the requirement that security and privacy measures be built into fingerprint
recognition systems. Several solutions are already in use to protect
confidential information and to authenticate people electronically, but when
biometrics is used it often raises a discussion concerning privacy and
integrity. This paper proposes a new methodology for effectively avoiding
replay attacks during the transmission of minutiae data. The minutiae
information is split into two parts; the first part is encrypted using a
pseudo-random permutation, and the remaining part is embedded into the
resulting permuted sequence using a low-distortion digital watermarking
transform technique. To prove the integrity of the transmitted data, the
receiver performs the same operations as the sender and compares the obtained
data with the received data. Thus this algorithm gives a solution to the replay
attack, which is most common in ATM-based applications. |
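The split-and-permute idea can be sketched as follows (a minimal sketch: the minutiae string, the shared key and the seeded-shuffle permutation are illustrative assumptions, and the watermark-embedding step is omitted):

```python
import hashlib, random

# Key-seeded pseudo-random permutation: sender shuffles half the
# minutiae bytes; the receiver, knowing the key, re-derives the same
# permutation, inverts it, and checks the recovered data for integrity.
def permute(data: bytes, key: str) -> bytes:
    idx = list(range(len(data)))
    random.Random(hashlib.sha256(key.encode()).digest()).shuffle(idx)
    return bytes(data[i] for i in idx)

def unpermute(data: bytes, key: str) -> bytes:
    idx = list(range(len(data)))
    random.Random(hashlib.sha256(key.encode()).digest()).shuffle(idx)
    out = bytearray(len(data))
    for pos, i in enumerate(idx):
        out[i] = data[pos]          # undo: permuted position -> original position
    return bytes(out)

minutiae = b"x:103,y:87,angle:45;x:210,y:40,angle:90"
half = len(minutiae) // 2
part1, part2 = minutiae[:half], minutiae[half:]
sent = permute(part1, "shared-secret")   # part2 would be watermarked in
recovered = unpermute(sent, "shared-secret")
```

A fresh key or nonce per session is what defeats replay: a captured `sent` from one session fails the receiver's comparison in the next.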
Keywords: |
Pseudo-Random Permutation, Digital Watermarking, Image Encryption, Low
Distortion Transform |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59, No. 3 |
Full Text |
|
Title: |
SERVICE BASED FAIR RESOURCE ALLOCATION MODEL (SbFRAM) IN WiMAX |
Author: |
KHALID MAHMOOD AWAN, ABDUL HANAN ABDULLAH, KHALID HUSSAIN |
Abstract: |
Two broad categories of services are introduced in the WiMAX paradigm. The
IEEE 802.16 standard added these services, called Constant Bit Rate (CBR) and
Variable Bit Rate (VBR), designed for real-time and non-real-time traffic
respectively. According to the available literature, there is a gap in
providing the additional resources needed to fulfil the service class required
by the user. In this paper we propose a Service Based Fair Resource Allocation
Model (SbFRAM) that evaluates the service required by a Subscriber Station (SS)
along with the channel condition. Our proposed model evaluates how much
additional resource will be required to provide the required service to the
SS. In this model we introduce a priority-queue scheduling methodology for
providing additional resources according to the channel condition. We compare
both CBR and VBR traffic with and without the proposed model. Our experiments
show that the proposed model manages user requests by providing the extra
resources required for satisfaction; requests are handled by the priority-queue
scheduling mechanism. The results show that we achieve improvement by providing
additional resources on a fair-scheduling basis, although to achieve the
required fairness we have to compromise on delay. |
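The priority-queue idea can be sketched with a binary heap (a minimal sketch: the two-level CBR-over-VBR priority map, the subscriber names and the FIFO tie-break are illustrative assumptions, not the SbFRAM scheduler itself):

```python
import heapq

# Real-time class (CBR) is served before the non-real-time class (VBR);
# within a class, arrival order is preserved via a sequence counter.
PRIORITY = {"CBR": 0, "VBR": 1}

def schedule(requests):
    heap = [(PRIORITY[cls], seq, cls, ss)
            for seq, (cls, ss) in enumerate(requests)]
    heapq.heapify(heap)
    order = []
    while heap:
        _, _, cls, ss = heapq.heappop(heap)
        order.append((cls, ss))
    return order

served = schedule([("VBR", "SS1"), ("CBR", "SS2"), ("VBR", "SS3"), ("CBR", "SS4")])
# CBR subscribers come out first, each class in arrival order
```

This also makes the paper's fairness/delay trade-off visible: VBR requests always wait behind pending CBR requests, so their delay grows as CBR load rises.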
Keywords: |
Resource Allocation, SNR, CBR, VBR, Priority Queue |
Source: |
Journal of Theoretical and Applied Information Technology
January 2014 -- Vol. 59, No. 3 |
Full Text |
|