Journal of Theoretical and Applied Information Technology
November 2015 | Vol. 81 No. 1 |
Title: |
IMPLEMENTATION IN DECISION SUPPORT SYSTEM OF ONTOLOGICAL APPROACH TO EXTRACTION
OF DATA ABOUT INFORMATION OBJECTS FROM NEWS FLOW |
Author: |
ALEKSANDR VALEREVICH VOLKOV, DMITRII ALEKSANDROVICH SYTNIK |
Abstract: |
The objective of this article is to study existing approaches to extracting
and structuring data about information objects from news flows, and to develop
an original approach to this task. The article covers an approach to extracting
data about information objects based on a domain ontology. The developed model
of information objects extracted from text is described, as well as the set of
linguistic resources used to extract data about various types of information
objects from text. The article also gives a brief description of the processing
pipeline. The stage of extracting data about information objects includes three
phases: named entity recognition, inter-object relationship extraction, and
building complex information objects (events). A rule-based approach is used to
extract information from text. |
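The rule-based phase can be illustrated with a toy sketch (these patterns and type names are hypothetical illustrations, not the authors' actual rules or linguistic resources): each rule pairs an information-object type with a surface pattern, and every match becomes an extracted object.

```python
import re

# Toy rule set: each rule pairs an object type with a surface pattern.
# All patterns and type names here are hypothetical illustrations.
RULES = [
    ("Person", re.compile(r"\b(?:Mr|Ms|Dr)\.\s+[A-Z][a-z]+")),
    ("Organization", re.compile(r"\b[A-Z][A-Za-z]+\s+(?:Inc|Ltd|Corp)\.")),
    # Abbreviated month list, for illustration only.
    ("Date", re.compile(r"\b\d{1,2}\s+(?:January|May|November|December)\s+\d{4}\b")),
]

def extract_objects(text):
    """Return (type, matched span) pairs for every rule that fires."""
    return [(obj_type, m.group(0))
            for obj_type, pattern in RULES
            for m in pattern.finditer(text)]
```

In a full pipeline, relationship and event extraction would then combine such matches into complex objects; this sketch covers only the first, named-entity phase.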
Keywords: |
Ontology, Text Extraction, Information Extraction (IE), Rule-Based Extraction,
Event Extraction, Natural Language Processing, Unstructured Data, Unformalized
Data, Decision Support System, News Processing, Analysis Of News Flows, Mass
Media Monitoring. |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
ALGORITHMIZATION OF SEARCH OPERATIONS IN MULTIAGENT INFORMATION-ANALYTICAL
SYSTEMS |
Author: |
ANASTASIA GENNADIEVNA ANANIEVA, ALEXEY ANATOLIEVICH ARTAMONOV, ILYA URIEVICH
GALIN, EVHENIY SERGEYEVICH TRETYAKOV, DMITRY OLEGOVICH KSHNYAKOV |
Abstract: |
The exponential growth in the number of data sources and the volume of data on
the Internet makes it necessary to select optimal search engines for collecting
data that correspond to a user's request. The task of creating search
algorithms lies primarily in the decomposition of data types, the selection of
appropriate information sources, and the adjustment of search engines (using
source-specific syntax) for collecting large amounts of data from reliable
sources. Since 2008, the department "Analysis of Competitive Systems" at NRNU
MEPhI has been setting up specific agent-based search systems and creating
unique search algorithms and techniques for analyzing technological
developments related to advances in science and technology. In particular, a
resource cluster was assembled and thesauruses were compiled for such advanced
scientific fields as "Plasma physics", "Nanotechnology", and "Laser
technology". This article describes the algorithms, approaches, and methods of
data search and retrieval from the Internet that are being developed and tested
at department No. 65 "Analysis of Competitive Systems" of NRNU MEPhI within the
project "Multi-agent information and analytical systems in the area of science
and technology" (MIAS) [1]. |
Keywords: |
Multi-agent system, Big Data, Data mining, retrieval system, information and
analytical system |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
DEVELOPMENT OF THE MATHEMATICAL MODEL FOR ANALYSIS OF LEVELING POINTS STABILITY
OF REFERENCE LEVELING GRID |
Author: |
IBRAGIM GILANIEVICH GAIRABEKOV, HASAN ELIMSULTANOVICH TAYMASKHANOV, MAGOMED
SALAMUVICH SAYDUMOV, SHARPUTDI SHAMSUTDINOVICH ZAURBEKOV, BERS MOVSAROVICH
KHASAEV, MAGOMED SHAVALOVICH MINTCAEV |
Abstract: |
The main drawbacks of the current methods used for analyzing the stability of
leveling points in a reference leveling grid are identified. The problem of
developing a mathematical model of leveling points in the reference leveling
grid is solved. Equal accuracy of measurement in the different cycles is
assumed. Possible subsidence between the reference points is represented by a
polynomial of the second degree. In solving this problem, the polynomial
coefficients are calculated using the least squares method. The accuracy of the
resulting mathematical model is estimated, leading to the conclusion that the
elevations between the reference points need to be determined with high
accuracy (up to tenths of a millimeter). |
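The modelling step can be sketched as a least-squares fit of a second-degree polynomial to subsidence measurements; the distances and subsidence values below are invented for illustration, not the authors' data.

```python
import numpy as np

# Distances along the leveling line (hypothetical, in metres) and the
# measured subsidence between cycles at each reference point (mm).
x = np.array([0.0, 100.0, 200.0, 300.0, 400.0])
subsidence = np.array([0.0, 1.2, 2.1, 2.7, 3.0])

# Second-degree polynomial fitted by least squares, as in the abstract.
coeffs = np.polyfit(x, subsidence, deg=2)
model = np.poly1d(coeffs)

# Model error: residuals of the fit at the measured points.
residuals = subsidence - model(x)
```

The residuals give a direct estimate of the model error, which is what motivates measuring elevations to tenths of a millimeter.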
Keywords: |
Hyperbolic Equation, Nonlocal Integral Condition, Bessel Operator, Mathematical
Model, Reference Point, Leveling Grid, Elevation, Subsidence, Polynomial,
Differentiation, Model Error |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
APPROACH TO THE ASSESSMENT OF UNIVERSITY GLOBALIZATION INDEX |
Author: |
MAXIM V. SERGIEVSKIY, KONSTANTIN S. ZAYTSEV |
Abstract: |
It is impossible to imagine the present-day world without the changes in
politics, economics, and social life caused by globalization. Globalization
within society naturally influences and updates the education system, which is
a factor in advancing competitiveness and in modern national development. That
is why, in order to determine the overall level of a nation's globalization, it
is necessary to determine the globalization level of its education, which is a
boundary zone between social life and culture. The purpose of this article is
to analyze the informative aspect of the factors influencing the integration
processes of university education and to formulate an approach for the
quantitative estimation of the globalization index of education, both for a
single university and for a set of universities. The quality of education, as
well as its involvement in global processes, is mainly determined by curriculum
content and, first of all, by the set of subjects composing the curriculum. The
absence of subjects that are important for a particular scientific field is
precisely the factor that reduces the quality of education. To a lesser degree,
the quality of education also depends on how academic activity is organized,
including teachers' qualifications and the way classes are held. For this
reason, we describe an approach to evaluating the globalization level of a
single university based on a comparative analysis of its curriculum content
against the curricula of the world's leading universities, the so-called "ideal
curriculum". In order to calculate the globalization index of education in its
informative aspect, we introduce a relative additive criterion that uses
weighted values of the subjects included in the overall ranking. |
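One plausible reading of such a relative additive criterion (the subjects and weights below are purely illustrative, not the authors' data) is a weighted coverage score of a university's curriculum against the "ideal curriculum":

```python
# Hypothetical "ideal curriculum": subject -> weight in the overall ranking.
IDEAL_CURRICULUM = {
    "algorithms": 0.3,
    "databases": 0.2,
    "machine_learning": 0.3,
    "networks": 0.2,
}

def globalization_index(offered_subjects):
    """Relative additive criterion: the weight of ideal-curriculum subjects
    the university actually offers, relative to the total ideal weight."""
    total = sum(IDEAL_CURRICULUM.values())
    covered = sum(w for s, w in IDEAL_CURRICULUM.items() if s in offered_subjects)
    return covered / total
```

A university offering every ideal subject scores 1.0; missing a heavily weighted subject lowers the index more than missing a lightly weighted one.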
Keywords: |
Globalization Index Of Education, Globalization Of Education Systems,
Modernization Of Education, Quality Of Education, Content Of Education,
Curriculum, Information Technology |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
THE EXPERIMENTAL STUDIES OF THE JPWL TOOLS ABILITY TO CORRECT BURST ERRORS IN A
NOISY CHANNEL WHEN TRANSMITTING VIDEO IN A JPEG 2000 FORMAT |
Author: |
SERGEY VASILIEVICH SKOROKHOD |
Abstract: |
This paper discusses the transmission of video in the JPEG 2000 format using
the JPWL (Wireless JPEG 2000) tools for noise-immune coding over a noisy
channel in which burst errors occur. The major problem is the restoration of
lost packets using the JPWL tools. An experimental study of the ability of the
JPWL tools to correct burst errors is carried out. The methodology of the study
consists in modelling the transmission of JPWL-protected video composed of
1,000 frames of one and the same image with a size of 1,024 x 768 pixels. The
variable parameters of the study are: the Reed-Solomon codes applied for
protection, the number of image tiles, and the RTP (Real-time Transport
Protocol) packet loss ratio. A standard variant of protection and a combination
of the standard variant with an interleaving algorithm are considered. The
final results of the experiment are the average percentages of fully and
partially restored tiles out of the number of tiles in a code stream. A
software system developed for the study is described, which includes the JPWL
encoder and decoder, tools for partitioning code streams into RTP packets,
tools for packet loss modelling, and tools for frame assembly from the RTP
packets. A macro flowchart of the JPWL encoder and decoder operation is
described. A method for code stream interleaving is suggested, intended to
increase the stream's resistance to burst errors. During the study, two
hypotheses were put forward and confirmed by the experimental results. The
first is that the standard JPWL tools alone are not able to correct burst
errors. The second is that the standard JPWL tools used in conjunction with the
interleaving algorithm can recover from burst errors. A side result of the
study is the observation that the recoverability of tiles improves slightly as
the number of tiles in a code stream increases. |
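The interleaving idea can be sketched generically (this is a plain block interleaver, not necessarily the method proposed in the paper): symbols are written row by row into a matrix and read out column by column, so a burst of consecutive channel errors is spread across many codewords.

```python
def interleave(data, depth):
    """Write `data` row by row into `depth` rows, then read it back column
    by column. len(data) must be a multiple of `depth`."""
    width = len(data) // depth
    return bytes(data[r * width + c] for c in range(width) for r in range(depth))

def deinterleave(data, depth):
    """Inverse of interleave: restore the original row-by-row order."""
    width = len(data) // depth
    return bytes(data[c * depth + r] for r in range(depth) for c in range(width))
```

With depth 4, a burst of 4 consecutive corrupted bytes in the channel lands as at most one error per row of the deinterleaved stream, which a Reed-Solomon code protecting each row can then correct.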
Keywords: |
JPEG 2000, JPWL, Burst Errors, Noise-Immunity Coding |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
CROSS-RELATIONAL STUDY BETWEEN INTELLIGENCE AND BRAIN ASYMMETRY ABILITIES USING
EEG-BASED IQ CLASSIFICATION MODEL |
Author: |
AISYAH HARTINI JAHIDIN, MEGAT SYAHIRUL AMIN MEGAT ALI, MOHD NASIR TAIB,
NOORITAWATI MD. TAHIR, AHMAD IHSAN MOHD YASSIN |
Abstract: |
EEG is an established technique that has been widely implemented in brain
research. Recent innovations in signal processing approaches have allowed
implementation of EEG beyond the clinical settings. It has been widely
acknowledged that each individual exhibits multi-facet potential which is rooted
in the brain. Hence, there is a possibility that these abilities are
inter-related with each other. This study proposes to map the relationship
between intelligence quotient (IQ) and brain asymmetry (BA) using EEG and
intelligent signal processing approach. EEG-based ANN modelling has been
previously developed and enhanced. The model is then implemented to predict
three levels of IQ (low, medium and high IQ) from 51 samples with distinct BA
indices. The indices are derived from Alpha asymmetry score. Mapping between IQ
and BA revealed that through a systematic technique; significant relationship
exists between distinct IQ levels and symmetrical brain indices. Findings have
demonstrated that the best performance occur when subjects maintain relatively
balanced control between the two hemispheres. Hence, by implementing feedforward
neural network model based on EEG power ratio features, the attempt to relate IQ
with BA has been realised with promising results. |
Keywords: |
Artificial Neural Network (ANN), Brain Asymmetry (BA), Electroencephalogram
(EEG), Intelligence Quotient (IQ), Power Ratio |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
AUTHENTICATION MECHANISM FOR CLOUD NETWORK AND ITS FITNESS WITH QUANTUM KEY
DISTRIBUTION PROTOCOL: A SURVEY |
Author: |
ROSZELINDA KHALID, ZURIATI AHMAD ZUKARNAIN, ZURINA MOHD HANAPI, MOHAMAD AFENDEE
MOHAMED |
Abstract: |
Communication in a huge network such as a cloud infrastructure is in great
demand. Almost all classified information is transferred via the communication
channel, and many types of attack can put that information in the hands of an
unauthorized party. This situation leads to information disclosure and makes
users feel insecure about using services such as those offered by cloud
infrastructure. Although we treat authentication as the primary concern, we
also pay attention to the secure communication channel. We believe that a
secure communication channel combined with secure authentication can reduce the
possibility of attacks that lead to information disclosure. This paper
summarizes the issues and challenges facing cloud authentication mechanisms. We
also review existing authentication techniques for cloud networks and discuss
important issues in this field, such as threats and insecure schemes, whose
root cause is the weakness of the authentication mechanism on the security
channel in use. Finally, we find that implementing a quantum key distribution
scheme in an enormous network such as a cloud infrastructure may resolve the
issue of interception by unauthorized parties in the network. A study is
proposed to investigate how the use of quantum key distribution can guarantee
that a message has not been modified or replaced by a dishonest party with
control of the communication line. |
Keywords: |
Cloud Authentication, Multi-Party Quantum Key Distribution, Secure Communication |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
NEW MODEL OF FRAMEWORK FOR TASK SCHEDULING BASED ON MOBILE AGENTS |
Author: |
YOUNES HAJOUI, MOHAMED YOUSSFI, OMAR BOUATTANE, ELHOCEIN ILLOUSSAMEN |
Abstract: |
Compute-intensive applications and applications with high volumes of data need
strong processing power and considerable storage resources. To reach the
required performance, multiple machines should be combined in order to handle
the distributed tasks. In this paper, we propose a new framework for task
distribution based on mobile agents. In the proposed model, a dispatcher agent
distributes parallel tasks to worker agents. Each worker agent is deployed in a
node of the distributed system according to the load balancing system. The
proposed framework is built from three layers: the user task producer, the
scheduling and load balancing layer, and the workers layer. After presenting
the architecture and structure of the proposed model, an example application in
distributed image processing is presented to demonstrate the performance of
this framework. |
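A dispatcher that balances load across workers can be sketched as a greedy least-loaded assignment (an illustrative policy only; the paper's scheduling and load balancing layer may differ):

```python
import heapq

def dispatch(tasks, n_workers):
    """Greedy load balancing: the dispatcher assigns each task to the
    currently least-loaded worker, tracked with a min-heap of
    (accumulated load, worker_id) pairs."""
    heap = [(0, w) for w in range(n_workers)]
    heapq.heapify(heap)
    assignment = {w: [] for w in range(n_workers)}
    for name, cost in tasks:
        load, w = heapq.heappop(heap)       # least-loaded worker so far
        assignment[w].append(name)
        heapq.heappush(heap, (load + cost, w))
    return assignment
```

In a mobile-agent setting, the `cost` estimate would come from the worker agents' reported loads rather than being known in advance.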
Keywords: |
Task scheduling; Parallel Computing; Distributed System; Framework; Multi-Agent
system; Load balancing. |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
AUTOMATED TRANSFORMATION APPROACH FROM USER REQUIREMENT TO BEHAVIOR DESIGN |
Author: |
NURI JAZULI KAMARUDIN, NOR FAZLIDA MOHD SANI, RODZIAH ATAN |
Abstract: |
System design is an important process in the development of any type of
computer-related system, and it builds on the user requirements gathered for
the system to be developed. System design is important in software development
because it helps the developer build a system according to what the user
requires. One of the main components of system design is the Unified Modeling
Language (UML) diagram. A UML diagram is used as a model to show all the
important functions, processes, flows, actors, and classes related to the
system to be built. This research focuses on two types of behavior diagram: the
use case diagram and the activity diagram. Many developers have produced tools
to help analysts draw these diagrams on a computer; Rational Rose, Lucidchart,
and UMLet are examples of such tools. Even so, this step is unpopular with
system analysts because drawing use case and activity diagrams takes time and
must be done manually. Model transformation is a recent development in system
design intended to ease the system analyst's work in modeling system design; it
helps the analyst reduce the time spent drawing use case and activity diagrams.
In this paper, we propose an approach to automatically transform user
requirements into behavior models (use case diagrams and activity diagrams)
and develop a tool that enables this transformation. |
Keywords: |
Model transformation (MT), UML model, System Design |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
LOCAL BINARY PATTERNS AND MODIFIED RED CHANNEL FOR OPTIC DISC SEGMENTATION |
Author: |
NUR AYUNI MOHAMED, MOHD ASYRAF ZULKIFLEY, AINI HUSSAIN, AOUACHE MUSTAPHA |
Abstract: |
Glaucoma is an ocular disease that can cause gradual vision loss and permanent
blindness if it is not treated at an early stage. Current screening tests such
as intraocular pressure (IOP) assessment are not sufficient, since elevated eye
pressure is not the only symptom of glaucoma. The most suitable assessment of
glaucoma is an analysis of the health of the optic nerve head. In order to
quantify the severity of glaucoma, an automated detection system is developed
that examines the optic disc and optic cup sizes. This paper explores two
methods for optic disc segmentation, one of the modules in the automated
detection system: local binary patterns (LBP) and a modified red channel (MRC).
Both methods use only the red channel of the RGB fundus image, as it alone
achieves good image contrast compared to the other channels. For each method,
preprocessing is first performed to enhance the quality of the input fundus
image, and post-processing is performed to smooth the segmented boundary of the
optic disc. The RIM-ONE database is used to validate the simulation results for
both methods. The results show that MRC performance is more stable over a wider
range of conditions than LBP. In conclusion, both methods segment the optic
disc boundary with high accuracy, which can be used to calculate the
cup-to-disc ratio and determine the severity of glaucoma. |
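The basic LBP operator underlying the first method can be sketched as follows (a plain 3x3 LBP on a nested-list image, not the exact variant or implementation used in the paper):

```python
def lbp_code(img, r, c):
    """Basic 3x3 local binary pattern: threshold the 8 neighbours of pixel
    (r, c) against the centre value and pack the bits into one byte."""
    centre = img[r][c]
    neighbours = [img[r - 1][c - 1], img[r - 1][c], img[r - 1][c + 1],
                  img[r][c + 1], img[r + 1][c + 1], img[r + 1][c],
                  img[r + 1][c - 1], img[r][c - 1]]
    return sum(1 << i for i, n in enumerate(neighbours) if n >= centre)
```

A histogram of these codes over an image region forms the texture descriptor that segmentation or classification then operates on.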
Keywords: |
Local Binary Pattern, Textural Classification, Glaucoma, Fundus Image and Disc
Segmentation |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
A REVIEW ON TOOLS OF RISK MITIGATION FOR INFORMATION TECHNOLOGY MANAGEMENT |
Author: |
BOKOLO ANTHONY JNR, NORAINI CHE PA |
Abstract: |
Information technology (IT) industries face operational, technical, and
strategic risks that make IT practitioners miss their planned schedule, time,
and quality targets. Hence, such risks need to be mitigated effectively and
efficiently if staff are to avoid these problems. Many studies have been
conducted that view these issues from different aspects; however, the available
IT risk mitigation tools have many weaknesses and, above all, are few in
number. Thirteen studies related to risk management and mitigation in IT are
reviewed here, following the Systematic Literature Review (SLR) method. Based
on this approach, all previous studies related to this topic are examined
systematically. This paper thus aims to review existing risk tools from 2000
to 2015 and presents the results of the systematic review of tools for IT risk
mitigation. The findings indicate that existing risk mitigation tools are not
completely effective and efficient in mitigating the risks that occur in IT
management. The review identifies the purpose, functionalities, architecture,
and limitations of the existing risk mitigation prototype tools in IT
management. |
Keywords: |
Risk Mitigation, Risk Management, IT Management, IT Risk Tools, Risk Assessment
Tools, Systematic Literature Review |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
GEOMETRY ALGORITHM ON SKELETON IMAGE BASED SEMAPHORE GESTURE RECOGNITION |
Author: |
AERI RACHMAD, MUHAMMAD FUAD |
Abstract: |
Semaphore is a way of communicating at a distance, usually practiced in
scouting activities. Information is delivered by gestures or movements using
specific tools such as flags, paddles, or rods. Learning semaphore in the
conventional way requires teachers and instructors, who give examples and make
corrections when errors occur. Motivated by the practical need for an
alternative way to learn semaphore, this research proposes the use of a
geometry algorithm to build a semaphore gesture recognition system based on
skeleton images read from a Kinect sensor. The Euclidean distance and the law
of cosines are the two formulas applied to generate the gesture parameters of
each letter. Recognition is achieved by comparing pairs of values between the
model and the real-time gesture. The accuracy of this system, measured using
RMSE with a 30° tolerance, is 90.76% for letters and 88% for words. |
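The two formulas can be combined to compute a joint angle from three skeleton points, which is one natural way to parameterise a flag pose (the coordinates in the usage below are hypothetical, not Kinect output):

```python
import math

def dist(p, q):
    """Euclidean distance between two joint coordinates."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) for the triangle a-b-c, via the law
    of cosines: cos(B) = (|ba|^2 + |bc|^2 - |ac|^2) / (2 |ba| |bc|)."""
    ba, bc, ac = dist(b, a), dist(b, c), dist(a, c)
    cos_b = (ba ** 2 + bc ** 2 - ac ** 2) / (2 * ba * bc)
    # Clamp against floating-point drift before taking the arccosine.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_b))))
```

Recognition would then compare such angles between the stored model pose and the live pose, accepting matches within the 30° tolerance.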
Keywords: |
Semaphore Gesture, Geometry Algorithm, Kinect, Skeleton Image, Alphabet |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
INTELLIGENT DECISION SUPPORT SYSTEMS (IDSS) FOR MULTI-OBJECTIVE OPTIMIZATION
PROBLEMS AT SEA SECURITY INDONESIA |
Author: |
MASROERI |
Abstract: |
Indonesian maritime security is a complex, multi-objective problem, so a
suitable method is needed to address it; one option is to develop an
intelligent decision support system using the Non-dominated Sorting Genetic
Algorithm II (NSGA-II) and Fuzzy C-Means (FCM). The NSGA-II method produces a
set of candidate optimal solutions, while the FCM method serves to reduce this
set. This study aims to maximize the coverage area and minimize operational
costs, considering vessel type, speed, radar range, and endurance. Optimization
using NSGA-II with the following parameters: population = 20, generations =
100, crossover probability = 75%, and mutation probability = 90%, produced 125
candidate optimal solutions. These candidates were then clustered into four
predetermined clusters to narrow down the prospective solutions, resulting in
four best-solution candidates: [1] Co = Rp. 4.741.205.798 and Ca = 1.915.083
Mil2; [2] Co = Rp. 3.997.582.228 and Ca = 1.560.672 Mil2; [3] Co = Rp.
4.802.314.832 and Ca = 1.962.895 Mil2; [4] Co = Rp. 4.939.637.487 and Ca =
1.982.564 Mil2. Taking into account both increased security in Indonesia's sea
area and the budget set by the Government, the intelligent decision support
system recommends the solution from the fourth cluster, with a patrol boat
combination of [647755475625136306153116502], Cost = Rp. 4.939.637.487,
Coverage = 1.982.564 Mil2, and distance = 0.006724. |
Keywords: |
IDSS, Multi-Objective Optimization Problems, NSGA-II, FCM |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
MULTI-BAND POWER SYSTEM STABILIZER MODEL FOR POWER FLOW OPTIMIZATION IN ORDER TO
IMPROVE POWER SYSTEM STABILITY |
Author: |
AGUS JAMAL, SLAMET SURIPTO, RAMADONI SYAHPUTRA |
Abstract: |
This paper presents a multi-band Power System Stabilizer model for power flow
optimization in order to improve power system stability. A Power System
Stabilizer (PSS) is equipment that can be used to enhance the damping of a
power system during low-frequency oscillations. For large-scale power systems
comprising many interconnected machines, PSS parameter tuning is a complex
exercise due to the presence of several poorly damped modes of oscillation. The
problem is further complicated by continuous variation in power system
operating conditions. In the simultaneous tuning approach, exhaustive
computational tools are required to obtain optimum parameter settings for the
PSS, while in sequential tuning, although the computational load is smaller,
evaluating the tuning sequence is an additional requirement, and there is the
further problem of eigenvalue drift. This paper presents the multi-band PSS
model for designing robust power system stabilizers for a multi-machine system.
Simulations were carried out using several fault tests on a transmission line
of a two-area multi-machine power system. As reference PSS models, the Delta w
PSS and Delta Pa PSS were used for comparison with the PSS under consideration.
The results show that the power transfer response using the proposed model is
more robust than with the Delta w PSS and Delta Pa PSS, especially for
three-phase faults and phase-to-ground faults. |
Keywords: |
Power System Stabilizer, Multi-Band, Transient Stability, Oscillation,
Multi-Machine Power System. |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
ATTENUATION CORRECTION OF PET IMAGE RECONSTRUCTED BASED ON DIRECT FILTERING OF
THE RAW DATA ACQUIRED USING MCNPX CODE |
Author: |
M. SAEED, T. EL KHOUKHI, Y. BOULAICH, E. CHAKIR, H. BOUKHAL, T. EL BARDOUNI |
Abstract: |
This paper presents a Monte Carlo simulation of the Positron Emission
Tomography (PET) scanning technique using the MCNPX code. The raw data
generated by the general MCNPX code contain a lot of information that is
unnecessary for the image reconstruction process and requires a large amount of
memory to store. In order to reduce the memory needed, we modify the MCNPX
source code so that it directly writes only the important data representing the
coincidence events detected by detector pairs along the lines of response
(LORs). These data are acquired by running two simulations of a uniform
cylindrical positron-emitting source in a water-filled environment. Thereafter,
from the object's geometry, knowledge of the materials in the object, and the
acquired simulation data, the attenuation correction map is calculated and
applied to the acquired data during the PET image reconstruction process. |
Keywords: |
PET, MCNPX, Image reconstruction, Attenuation correction. |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
WEB-BASED GIS MAPPING FOR GEOTHERMAL RESOURCES POTENTIAL IN WEST SUMATRA |
Author: |
YUHENDRA |
Abstract: |
Geothermal energy is one of the renewable power sources that can serve as an
alternative to increasingly scarce fossil fuels. West Sumatra in particular,
one of the Indonesian regions located on the Ring of Fire, has abundant
geothermal potential. This research aims to develop a web-based mapping
application and to identify potential areas for geothermal exploration. We use
a web-based GIS (also called internet GIS or online GIS) built from standard
software packages to develop an interactive web mapping portal for spatial
analysis. The resulting system serves as a resource information tool within a
decision support system (DSS), helping to manage a company's geothermal
resources for key decision making and planning, and it also provides users with
an innovative and interactive way to access spatial content over the
internet/intranet. The application provides a variety of functionalities, such
as a map browsing mode, querying, navigation, user-requested services, region
search, display of geothermal well properties (e.g. megawatt capacity), and map
editing and printing from the web interface. |
Keywords: |
West-Sumatra, Mapping, Web-based GIS, Decision Support System, Geothermal
Energy, Resources Information System |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
AN EFFICIENT GENTLE ADABOOST-BASED APPROACH FOR MAMMOGRAMS CLASSIFICATION |
Author: |
NEZHA HAMDI, KHALID AUHMANI, MOHA M'RABET HASSANI, OMAR ELKHARKI |
Abstract: |
In this work we propose and evaluate a new approach for the classification of
mammograms based on Gentle AdaBoost. Our main contribution is that the strong
classifier is constructed from weighted weak classifiers. These weak
classifiers are extracted from the sub-bands of the discrete wavelet transform.
We used Receiver Operating Characteristic (ROC) curves to evaluate our proposal
on mammograms from the MIAS database, presenting the true positive rate versus
the false positive rate for different feature types and numbers of Gentle
AdaBoost iterations. The results show that the best area under the curve (AUC),
which represents the performance of the approach, is reached for Zernike
moments: 0.98 and 0.99 for T = 10 and T = 50 respectively. The other feature
types yield an AUC between 0.7 and 0.76. The results also show that the
performance of the proposed approach improves slightly as the number of
iterations is increased. |
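A minimal Gentle AdaBoost with regression stumps as the weak learners illustrates how the strong classifier is built by summing weighted weak outputs (a generic sketch on toy 1-D data, not the authors' wavelet sub-band features):

```python
import numpy as np

def fit_stump(X, y, w):
    """Weighted least-squares regression stump: predict the weighted mean
    of y on each side of the best (feature, threshold) split."""
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            wl, wr = w[left].sum(), w[~left].sum()
            if wl == 0 or wr == 0:
                continue
            a = np.dot(w[left], y[left]) / wl       # weighted mean, left side
            b = np.dot(w[~left], y[~left]) / wr     # weighted mean, right side
            err = np.dot(w, (y - np.where(left, a, b)) ** 2)
            if err < best_err:
                best_err, best = err, (j, t, a, b)
    return best

def gentle_adaboost(X, y, T=10):
    """Gentle AdaBoost for y in {-1, +1}: accumulate weak regression
    outputs and reweight examples by exp(-y * f(x)) after each round."""
    w = np.full(len(y), 1.0 / len(y))
    stumps = []
    for _ in range(T):
        j, t, a, b = fit_stump(X, y, w)
        f = np.where(X[:, j] <= t, a, b)
        stumps.append((j, t, a, b))
        w *= np.exp(-y * f)
        w /= w.sum()
    return stumps

def predict(stumps, X):
    """Strong classifier: the sign of the summed weak outputs."""
    F = sum(np.where(X[:, j] <= t, a, b) for j, t, a, b in stumps)
    return np.sign(F)
```

In the paper's setting, each feature dimension would be a wavelet sub-band statistic rather than a raw value.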
Keywords: |
Machine learning, Classification, Gentle AdaBoost, Discrete wavelet transform,
ROC, Feature Extraction. |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
COMPARISON OF SOFTWARE RELIABILITY ANALYSIS FOR BURR DISTRIBUTION |
Author: |
Dr. G. SRIDEVI, Dr. C.M. SHEELA RANI |
Abstract: |
This paper considers a Burr-type distribution model with three parameters in
two versions: Burr type III and Burr type XII. We compare the performance of
the two versions of the suggested model on five real-time software failure
data sets. The versions perform with varying accuracy, which suggests that no
universal "best" between the two versions of the model can be identified. |
Keywords: |
Burr type III Model; Burr type XII Model; NHPP; ML Estimation |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
CONCRETE CRACK DETECTION BASED MULTI-BLOCK CLBP FEATURES AND SVM CLASSIFIER |
Author: |
RGUIG MUSTAFA, EL AROUSSI MOHAMED |
Abstract: |
Automatic concrete crack detection has recently become a real challenge for
high-performance inspection and diagnosis of images of concrete structures.
Concrete images generally contain various sources of noise, such as irregular
illumination, shading, and divots, which make it difficult to detect cracks
automatically. In this paper, a novel and efficient approach based on the
Compound Local Binary Pattern (CLBP) and support vector machines is proposed
for automatic concrete crack detection. The contributions of this paper include
the following steps: (1) the proposed system starts by pre-processing the
database images, smoothing their texture and enhancing any existing cracks,
followed by the extraction of descriptive features; each image is divided into
several non-overlapping blocks, and each block originates a feature vector.
(2) A support vector machine (SVM) is then applied to classify the concrete
crack images. The experimental results give a 97.43% classification accuracy
rate, which indicates that the proposed method is a promising tool for the
analysis of images of concrete structures. |
Keywords: |
Concrete Crack Detection, Concrete Structure, Compound Local Binary Pattern,
Support Vector Machine, Feature Extraction, Local Binary Pattern. |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
Title: |
FUZZY KERNEL C-MEANS ALGORITHM FOR INTRUSION DETECTION SYSTEMS |
Author: |
ZUHERMAN RUSTAM, AINI SURI TALITA |
Abstract: |
Intrusion Detection Systems (IDS) are used as security management systems.
There are two approaches to IDS: misuse detection (knowledge-based intrusion
detection) and anomaly detection (behavior-based intrusion detection). Misuse
detection monitors activities suspected to be intrusions based on prior
information about specific attacks, while anomaly detection observes activity
that is incompatible with acceptable behavior under normal conditions, which
makes it possible to detect new types of attacks on the system. Several
computational intelligence models have been developed to solve IDS problems,
such as neural network and neuro-fuzzy methods. They are chosen because IDS
involves large data sets with many different features, which can negatively
affect IDS accuracy and computation time. Naive Bayes, decision trees (C4.5),
and kernel matrix methods can be used to reduce the number of features in the
data sets. We propose the Fuzzy Kernel C-Means algorithm as another method for
solving IDS problems, which we claim provides better results when combined
with the kernel matrix method to reduce the number of selected data
features. |
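For reference, the plain (non-kernel) Fuzzy C-Means iteration that the proposed algorithm builds on can be sketched as follows; the kernel variant replaces the Euclidean distances with distances in a kernel-induced feature space.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=50, seed=0):
    """Plain Fuzzy C-Means: alternate centroid and membership updates.
    U[i, k] is the degree to which sample i belongs to cluster k."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)              # memberships sum to 1
    for _ in range(iters):
        W = U ** m                                  # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1))                   # u_ik proportional to d^(-2/(m-1))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U
```

For IDS data, each row of `X` would be a (possibly feature-reduced) connection record, and the fuzzy memberships indicate how strongly it resembles each behavior cluster.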
Keywords: |
Data Features, Fuzzy C-Means, Intrusion Detection Systems, Kernel Matrix, Kernel
Method |
Source: |
Journal of Theoretical and Applied Information Technology
10th November 2015 -- Vol. 81. No. 1 -- 2015 |
Full
Text |
|
|
|