|
Submit Paper / Call for Papers
The journal receives papers on a continuous basis and considers articles
from a wide range of Information Technology disciplines, from the most
basic research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF, or a compatible format so that they may be evaluated for
publication in an upcoming issue. This journal uses a blinded review process;
please remember to include all your personally identifiable information in the
manuscript before submitting it for review; we will remove the necessary
information on our side. Submissions to JATIT should be full research / review
papers (properly indicated below the main title).
|
|
|
Journal of Theoretical and Applied Information Technology
March 2016 | Vol. 85 No.3 |
Title: |
IMAGE SMOOTHENING AND MORPHOLOGICAL OPERATORS BASED JPEG COMPRESSION |
Author: |
MARLAPALLI KRISHNA, G SRINIVAS, PRASAD REDDY PVGD |
Abstract: |
Over the past decades the volume of information transmitted over the Internet
has grown exponentially. Image compression provides an effective way to reduce
image size; JPEG is the most successful still-image compression standard for
bandwidth conservation, allowing images to be stored and transmitted faster.
Mathematical morphology is a mathematical theory that can be used to process
and analyze images. In this paper, we propose a new JPEG compression algorithm
based on the fundamental morphological operators, dilation and erosion, and on
other morphological operations composed from these two basic operations. The
proposed JPEG algorithm shows improved results compared to standard JPEG
compression in terms of image quality metrics such as PSNR, MSE, and encoded
bits. It increases speed while reducing memory requirements by lowering the
number of encoded bits. The reconstructed images after decompression are on
par with the original image data. |
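The paper's exact algorithm is not reproduced in this abstract; as an illustration of the basic operators it builds on, a minimal NumPy sketch of grayscale dilation, erosion, and an opening used as a smoothing step might look like:

```python
import numpy as np

def dilate(img, k=3):
    """Grayscale dilation: each pixel becomes the max of its k x k neighborhood."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].max()
    return out

def erode(img, k=3):
    """Grayscale erosion: each pixel becomes the min of its k x k neighborhood."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].min()
    return out

def smooth(img, k=3):
    """Morphological opening (erosion then dilation), a common smoothing step."""
    return dilate(erode(img, k), k)
```

How these operators are combined with the JPEG pipeline is specific to the paper; the sketch only shows the primitives named in the abstract.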
Keywords: |
Image Compression, Morphological Operators, Smoothening, PSNR, MSE |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
PERFORMANCE AND SCHEDULING OF HPC APPLICATIONS IN CLOUD |
Author: |
PROF. FADI FOUZ, ADNAN ABI SEN |
Abstract: |
Cloud computing is a relatively new concept that has spread widely alongside
Internet connection services, complementing many other services and offering
companies and researchers a window onto large platforms at low cost and
without the need to operate them directly. One of the most important such
platforms is high-performance computing (HPC), which most research centers and
companies need. This paper therefore addresses one of the most important
problems in providing this service through the cloud: scheduling the jobs to
be executed on the HPC platform. The paper proposes a new scheduler that takes
into account the major factors related to any job (cost, time, resources, and
quality) to achieve better performance at lower cost and full quality. Through
experiments, we show that the proposed approach can outperform many previously
proposed approaches. |
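The scheduler itself is not specified in the abstract; a hypothetical weighted-sum ranking over the four factors it names (all names and the normalization convention below are assumptions, not the paper's method) could be sketched as:

```python
def rank_jobs(jobs, weights):
    """Rank jobs by a weighted sum of the four factors the paper names
    (cost, time, resources, quality), each assumed pre-normalized to [0, 1]
    with lower meaning better; the best-scoring job comes first."""
    def score(factors):
        return sum(weights[f] * factors[f] for f in weights)
    return sorted(jobs, key=lambda name: score(jobs[name]))
```

A real scheduler would also handle resource constraints and arrival order; the sketch only shows how the four factors might be combined into a single priority.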
Keywords: |
Cloud Computing, Scheduling algorithm, High Performance Computer, Classifier |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
DISTRIBUTED ANALYSIS OF BIG PERSONAL DATA SETS WITH RESPECT TO PRIVACY |
Author: |
ALEKSEY PAKHOMOV, KONSTANTIN ZAYTSEV |
Abstract: |
Monitoring of cross-boundary financial operations routinely affects the
interests of governmental and credit organizations of foreign countries, which
may not share the objectives of the financial monitoring of the country
initiating an investigation. Therefore, cross-boundary information requests
should be made in a way that eliminates the possibility of identifying the
objects of interest from requests filed against the databases of those
organizations. Usually, zero-knowledge proof technology is used. However, most
scientific papers in this field are oriented toward abstract study of the
efficiency of interactive-proof algorithms (protocols) and assessment of their
complexity, without solving concrete applied tasks. The objective of this
paper is to find an approach to anonymous comparison of personal data and to
use that approach for the financial monitoring of large organizations and in
international scientific projects. To solve the task, the well-known
Fiat-Shamir, Guillou-Quisquater (GQ), and Schnorr protocols were analyzed. The
Fiat-Shamir protocol was chosen as the basis and was modified by widening its
use to checking whether personal data are stored in a database. The solution
offered differs from known ones by combining zero-knowledge proof technology
with a cryptographic transformation via a hash function; it is more exact than
frequently used approaches in the area and has reasonable performance. The
modified protocol was tested on the example of identifying international
terrorists among the parties to financial operations. |
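As background for the protocol the paper modifies, one round of the classic Fiat-Shamir identification scheme can be sketched as follows (the toy modulus below is illustrative only; a real deployment uses a large RSA-size modulus and many rounds):

```python
import random

def fiat_shamir_round(n, s, v, e=None):
    """One round of Fiat-Shamir identification.
    The prover knows a secret s with v = s^2 mod n; the verifier knows only
    v and n, and learns nothing about s from the exchange."""
    r = random.randrange(2, n)                   # prover's ephemeral secret
    x = (r * r) % n                              # commitment sent to verifier
    if e is None:
        e = random.randrange(2)                  # verifier's challenge bit
    y = (r * pow(s, e, n)) % n                   # prover's response
    return (y * y) % n == (x * pow(v, e, n)) % n # verifier's check
```

The check always passes for an honest prover because y^2 = r^2 * s^(2e) = x * v^e (mod n); a cheater who does not know s can answer at most one of the two challenge values per round.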
Keywords: |
Zero-knowledge Proof Password, Hash Function, Data Exchange Protocol,
Anti-fraud, Money Laundering, Terrorist Financing, Scientific Collaboration |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
APPLICABILITY OF PROCESS CAPABILITY ANALYSIS IN METRIFYING QUALITY OF SOFTWARE |
Author: |
POOJA JHA, K S PATNAIK |
Abstract: |
Defect occurrence during the software development process is a common
phenomenon, and its consequences for software organizations are inevitable: it
can result in software failure or delayed delivery of the product. Defect
removal is obviously an option, but it can strain the organization's budget.
Organizations demand timely delivery of defect-free, high-quality software, so
besides defect removal, defect prevention techniques must be encouraged during
development and treated as an ongoing process that enhances quality and
testing efficiency. This paper is an empirical work demonstrating the
relevance of several process-enhancing metrics. These metrics are useful in
monitoring ongoing software development and in meeting quality standards. The
paper explores the applicability of capability analysis metrics for defect
prevention and shows how they can be used as a technique for upgrading
software quality. |
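The abstract does not define its metrics; process capability analysis conventionally rests on the Cp and Cpk indices, which, given process measurements and specification limits, could be computed as:

```python
import statistics

def process_capability(samples, lsl, usl):
    """Compute the Cp and Cpk capability indices for a measured process.
    Cp compares the specification width (usl - lsl) to the process spread
    (6 sigma); Cpk additionally penalizes a process mean that drifts
    off-center toward either specification limit."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk
```

For a perfectly centered process Cp equals Cpk; a Cpk below about 1.33 is commonly read as a process needing improvement, which is how such indices connect to defect prevention.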
Keywords: |
Metrics, Organization Baselines, Process Capability Analysis, AD GoF tests,
Quality |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
A NEW MODIFICATION FOR MENEZES-VANSTONE ELLIPTIC CURVE CRYPTOSYSTEM |
Author: |
ZIAD E. DAWAHDEH, SHAHRUL N. YAAKOB, ROZMIE RAZIF BIN OTHMAN |
Abstract: |
Information security algorithms are widely used nowadays to protect data and
messages over the Internet. Elliptic Curve Cryptography (ECC) is one of the
most efficient techniques used for this purpose, because it is difficult for
an adversary to solve the elliptic curve discrete logarithm problem and
recover the secret key used in the encryption and decryption processes. A new,
efficient method is proposed in this paper to improve the Menezes-Vanstone
Elliptic Curve Cryptosystem (MVECC). The modification reduces the running time
needed for the encryption and decryption processes compared with the original
method and two other methods. The modified method uses only addition and
subtraction operations, with no inversion or multiplication, because those
operations take far longer than addition and subtraction; this makes the
proposed algorithm faster in computation and running time than the original
and the other methods. Moreover, the modified method encodes each character of
the message by its hexadecimal ASCII value before encryption, which makes the
algorithm more secure and harder for adversaries to break. |
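For context on the cost the modification targets: standard affine point addition on an elliptic curve y^2 = x^3 + ax + b over F_p requires a modular inversion per operation, as the textbook sketch below shows (this is the conventional formula, not the paper's inversion-free variant):

```python
def ec_add(P, Q, a, p):
    """Affine point addition on y^2 = x^3 + ax + b over F_p.
    Points are (x, y) tuples; None is the point at infinity. The
    pow(..., -1, p) modular inversion in the slope is the dominant cost
    that inversion-free variants seek to avoid."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                        # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)
```

The example curve below (a = 2, b = 2, p = 17) is a small standard teaching curve; doubling P = (5, 1) gives (6, 3).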
Keywords: |
Elliptic Curve Cryptography, Menezes-Vanstone Elliptic Curve Cryptosystem,
Encryption, Decryption, Hexadecimal ASCII. |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
ENRICHING PROCESS OF ICE-CREAM RECOMMENDATION USING COMBINATORIAL RANKING OF AHP
AND MONTE CARLO AHP |
Author: |
AKASH RAMESHWAR LADDHA, RAHUL RAGHVENDRA JOSHI, DR. PEETI MULAY |
Abstract: |
Diabetes is a curse on human life and one of everyone's worst nightmares.
Patients need to watch their sugar level every time they consume food;
sugar-free ice creams are not spared from this, yet ice cream is a favorite
dessert for many. Very few systems exist that can guide diabetic and
non-diabetic persons about the contents of sugar-free ice creams. Many
methodologies exist for recommending ice creams based on traditional
techniques such as collaborative filtering and content-based recommendation,
but their results are often not up to the mark, and they may wrongly recommend
an ice cream that adversely disturbs a person's blood sugar level. The
proposed method therefore puts forward the idea of recommending sugar-free ice
creams based on ingredients such as carbohydrates, fats, proteins, and dietary
fibers, which play a vital role in affecting the blood glucose levels of
diabetics and non-diabetics. The proposed system uses techniques such as the
Analytic Hierarchy Process (AHP) and Monte Carlo AHP (MCAHP), which can be
combined with goal programming and classification. It is observed that the
rankings obtained from these two techniques are the same for the ice creams
under consideration. This paper mainly focuses on AHP and MCAHP for ranking
the considered sugar-free ice creams. |
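The paper's actual comparison matrices are not given here; a common way to derive AHP priority weights from a pairwise comparison matrix is the geometric-mean (row) approximation, which could be sketched as:

```python
import math

def ahp_weights(M):
    """Approximate AHP priority weights from a square pairwise comparison
    matrix M (M[i][j] says how strongly criterion i is preferred over j)
    using the geometric-mean row method, normalized to sum to 1."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in M]
    total = sum(gm)
    return [g / total for g in gm]
```

Monte Carlo AHP repeats such a computation with the matrix entries perturbed by random draws, then aggregates the resulting weight distributions; the deterministic step above is the core of both.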
Keywords: |
Analytic Hierarchy Process (AHP), Sugar Free, Ice-creams, Monte Carlo AHP (MCAHP),
Ranking |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
APPLICATION IN PRACTICE AND OPTIMIZATION OF INDUSTRIAL INFORMATION SYSTEMS |
Author: |
LIDIYA ALEKSEEVNA BONDARENKO, AFANASY VLADIMIROVICH ZUBOV, VYACHESLAV BORISOVICH
ORLOV, VALENTINA ALEKSANDROVNA PETROVA,
NIKITA SERGEEVICH UGEGOV |
Abstract: |
The article is concerned with research into the optimization and practical
application of industrial information systems in the management of complex
organizational systems. The article contains an example of applying the
developed methods to train experts in the use of industrial information
systems for automated bookkeeping and management accounting, salary
calculation, and personnel records, implemented at St. Petersburg State
University. |
Keywords: |
Industrial Information Systems, Manufacturing Execution Systems, Enterprise
Resource Planning, On-Line Analytical Processing, “1C:Enterprise 8” |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
CUSTOMIZED INSTRUCTIONAL PEDAGOGY IN LEARNING PROGRAMMING – PROPOSED MODEL |
Author: |
MUHAMMED YOUSOOF, MOHD SAPIYAN |
Abstract: |
Computer programming is a highly cognitive skill that requires mastery of many
domains. In reality, however, many learners cannot cope with the mental
demands of learning programming, which leads to rote learning and
memorization. There are many reasons for this situation, but one of the main
ones is that novice learners experience high cognitive load while learning
programming. Given that novice learners lack well-defined schemas, and given
the limitations of working memory, such students cannot assimilate the
knowledge required for learning. Learning support in the form of visualization
may help, as teachers are always reminded that visual aids can enhance student
learning; yet the effect of visualization on learning is not clearly tangible.
This paper addresses the issue by employing the NASA TLX rating scale to
measure cognitive load while learning programming with visualizations.
Measuring cognitive load can help in understanding learners' difficulty
levels. Learners differ from one another in learning style and capability, and
hence the load experienced while learning programming may differ significantly
even within a homogeneous group. This paper proposes a model that optimizes
instruction for each learner based on their background profile and employs a
neural network to suggest the best visualization tool for each learner. |
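The abstract does not restate the scale's arithmetic; in the standard NASA TLX procedure, the overall workload is a weighted mean of six subscale ratings, with each weight being the number of times that subscale was chosen across the 15 pairwise comparisons. A minimal sketch:

```python
def nasa_tlx(ratings, tally):
    """Overall NASA TLX workload score: each subscale rating (0-100) is
    weighted by how many of the 15 pairwise comparisons that subscale won,
    then the weighted sum is divided by 15."""
    assert sum(tally.values()) == 15, "pairwise tallies must sum to 15"
    return sum(ratings[d] * tally[d] for d in ratings) / 15
```

The six conventional subscales are mental demand, physical demand, temporal demand, performance, effort, and frustration; the dictionary keys in the usage below are just those names.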
Keywords: |
Programming, Visualization, Cognitive Load, NASA TLX scale, Neural Network |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
PARALLEL IMPLEMENTATION OF APRIORI ALGORITHMS ON THE HADOOP-MAPREDUCE PLATFORM-
AN EVALUATION OF LITERATURE |
Author: |
A.L. SAYETH SAABITH, ELANKOVAN SUNDARARAJAN, AND AZURALIZA ABU BAKAR |
Abstract: |
Data mining is the extraction of useful, prognostic, interesting, and unknown
information from massive transaction databases and other repositories. Data
mining tools predict potential trends and actions, allowing various fields to
make proactive, knowledge-driven decisions. Recently, with the rapid growth of
information technology, the amount of data has exponentially increased in
various fields. Big data mostly comes from people’s day-to-day activities and
Internet-based companies. Mining frequent itemsets and association rule mining
(ARM) are well-analysed techniques for revealing attractive correlations among
variables in huge datasets. The Apriori algorithm is one of the most broadly
used algorithms in ARM, and it collects the itemsets that frequently occur in
order to discover association rules in massive datasets. The original Apriori
algorithm is for sequential (single node or computer) environments. This Apriori
algorithm has many drawbacks for processing huge datasets, such as that a single
machine’s memory, CPU and storage capacity are insufficient. Parallel and
distributed computing is the better solution to overcome the above problems.
Many researchers have parallelized the Apriori algorithm. This study performs a
survey on several well-enhanced and revised techniques for the parallel Apriori
algorithm in the Hadoop-MapReduce environment. The Hadoop-MapReduce framework is
a programming model that efficiently and effectively processes enormous
databases in parallel. It can handle large clusters of commodity hardware in a
reliable and fault-tolerant manner. This survey will provide an overall view of
the parallel Apriori algorithm implementation in the Hadoop-MapReduce
environment and briefly discuss the challenges and open issues of big data in
the cloud and Hadoop-MapReduce. Moreover, this survey not only covers the
existing improved Apriori methods on Hadoop-MapReduce but also provides future
research directions for upcoming researchers. |
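Before parallelization, the sequential algorithm the surveyed works start from can be sketched as follows (a minimal version that omits the full candidate-pruning step of the original Apriori):

```python
def apriori(transactions, min_support):
    """Plain single-node Apriori: iteratively grow frequent itemsets of
    size k from the frequent itemsets of size k-1, keeping only candidates
    whose support count meets min_support. `transactions` is a list of
    sets of items; the result maps frozenset -> support count."""
    items = {frozenset([i]) for t in transactions for i in t}
    frequent, k_sets = {}, items
    while k_sets:
        counts = {c: sum(1 for t in transactions if c <= t) for c in k_sets}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(survivors)
        prev = list(survivors)
        # join step: unions of two frequent k-itemsets that form a (k+1)-itemset
        k_sets = {a | b for a in prev for b in prev if len(a | b) == len(a) + 1}
    return frequent
```

The MapReduce variants surveyed distribute exactly the counting step (the `counts` dictionary) across mappers and reducers, since it is the pass over the full transaction database that dominates the cost.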
Keywords: |
Data Mining, Big Data, ARM, Hadoop-MapReduce, Cloud, Apriori |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
A NEW APPROACH TO DESIGN AN ATTRACTIVE GAME BASED LEARNING IN VARIOUS DOMAINS |
Author: |
LAMYAE BENNIS, SAID BENHLIMA |
Abstract: |
Serious Games have been extensively utilized in diverse domains such as the
military, education, marketing, and advertising. In this article, we are
particularly interested in Serious Games (SG) for education, called Learning
Games (LG). Currently, Learning Game consumers still suffer from the high
price and complexity of designing an effective learning game without being a
developer or an informatics designer. In addition, there is a huge lack of
authoring tools that allow the generation of LGs linked to various cultures,
ethnicities, and languages (for example, our Moroccan culture, ethnicity, and
native language). Therefore, the main aim of this research work is to develop
and design an authoring tool, the Serious Game Generator (S.G.G), which
addresses all the problems above using the generic DICE model to ease the
design of the Game-Based Learning (GBL) it generates. |
Keywords: |
Serious Games, Learning Games, Design, Method DICE, Authoring Tool, Game Based
Learning |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
AN EFFICIENT TECHNIQUE USING LIFTING BASED 3-D DWT FOR BIO-MEDICAL IMAGE
COMPRESSION |
Author: |
M. KALAIARASI, T. VIGNESWARAN |
Abstract: |
Compression techniques are significant in day-to-day scenarios for the smooth
transmission of data: they reduce bandwidth utilization and help use memory
efficiently, making overall data transmission better. In many cases the data
size is quite large and hard to send without compression. In the biomedical
arena compression is applicable because of the large image sizes, but it
brings its own challenges in terms of data loss: when reconstructing the
image, the possibility of data or quality loss arises. Though many techniques
promise lossless compression and decompression, refinement is still required.
Several techniques use the discrete wavelet transform for lossless image
compression; a recent one is the Three-Dimensional Discrete Wavelet Transform
(3-D DWT). In this research work, a lifting-based Discrete Wavelet Transform
architecture for three-dimensional images is presented. The proposed
architecture has been implemented on a Xilinx Virtex-6 series
Field-Programmable Gate Array (FPGA). Implementation results show the
efficiency of the proposed system in terms of power consumption and operating
frequency: the proposed Discrete Wavelet Transform architecture achieves a
maximum operating frequency of 298 MHz with a power consumption of 7 mW. |
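The architecture itself is hardware, but the lifting scheme it implements can be illustrated in software. A one-level integer Haar lifting step, a simpler relative of the filters typically used in 3-D DWT codecs, shows the predict/update structure and the exact invertibility that lossless compression relies on:

```python
def haar_lifting_forward(x):
    """One level of the Haar wavelet via lifting on an even-length list:
    predict the odd samples from the even ones (detail coefficients),
    then update the evens (approximation). Integer-only and invertible."""
    evens, odds = x[0::2], x[1::2]
    detail = [o - e for e, o in zip(evens, odds)]          # predict step
    approx = [e + d // 2 for e, d in zip(evens, detail)]   # update step
    return approx, detail

def haar_lifting_inverse(approx, detail):
    """Undo the lifting steps in reverse order, recovering the signal exactly."""
    evens = [a - d // 2 for a, d in zip(approx, detail)]
    odds = [d + e for e, d in zip(evens, detail)]
    out = []
    for e, o in zip(evens, odds):
        out.extend([e, o])
    return out
```

A 3-D transform applies such a step along the rows, columns, and slices of a volume in turn; in hardware the same predict/update adders are what the lifting architecture maps to.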
Keywords: |
Discrete Wavelet Transform (DWT), Discrete Cosine Transform (DCT),
Peak-Signal-to-Noise Ratio (PSNR), Very-Large-Scale Integration (VLSI), Lifting
scheme. |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
K-TIER SEPARATION BASED ABSTRACTION REFINEMENT SCHEDULERS FOR PARALLEL JOB IN
MULTIPLE CLOUD CENTERS |
Author: |
MR.C.ANTONY, DR.C.CHANDRASEKAR, DR.S.NITHYA REKHA |
Abstract: |
Current approaches to employing cloud computing for complex applications
processed in remote data centers are based on parallel processing
capabilities. Under such approaches, parallel applications lose CPU
utilization on the cloud whenever communication and synchronization between
different parallel processes take place, and the data centers thus incur extra
workload. A better approach should take responsiveness as the top priority,
minimizing the overhead at the data centers while still demanding nontrivial
effort from the data centers' side. At the same time, such an approach not
only increases the communication cost but also leads to improper utilization
of the nodes in a data center, with poor responses to parallel workloads when
there are many data centers in the cloud. In this paper we propose an approach
that improves node utilization and responds to parallel workloads efficiently:
abstraction refinement scheduling for parallel jobs in multiple cloud centers.
Under our approach, the large-scale scheduling problem is solved using
abstraction refinement schedulers. A challenging issue is how to solve the
scheduling problem under K-tier separation so that scheduling is done quickly
with small abstraction refinements. On the basis of abstraction refinement
schedulers, different schedulers are developed for various computing domains
on simulated data centers. Extensive experiments, implemented with CloudSim in
Java, validate our approach on parametric factors such as the number of data
centers, jobs assigned, time intervals, memory rate, and CPU cycles for the
abstraction scheduler. |
Keywords: |
Parallel Processing, Communication, Synchronization, Parallel Workload,
Abstraction Refinement Scheduling |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
PILOT STUDY OF EHRS ACCEPTANCE IN JORDAN HOSPITALS BY UTAUT2 |
Author: |
MALIK BADER. ALAZZAM, ABD. SAMAD HASAN BASARI, ABDUL SAMAD SIBGHATULLAH, MOHAMAD
RAZIFF RAMLI, MUSTAFA MUSA JABER, MOHD HARIZ NAIM |
Abstract: |
Purpose: Electronic health record (EHR) exchange improves hospital quality and
reduces health costs. However, few studies address the antecedent factors of
healthcare professionals' intention to use an EHR system. We examine the
factors affecting EHR acceptance using the Unified Theory of Acceptance and
Use of Technology (UTAUT2) model, a new methodology for evaluating EHR
acceptance, and propose a theoretical model to explain healthcare
professionals' behavior in accepting an EHR system.
The goal of this study is to investigate the factors that affect the
acceptance of electronic health record systems by healthcare professionals.
The study was applied in Jordanian hospitals that use an EHR system. Our
objective is to build a clear vision of the factors that affect user
acceptance of the system through a pilot test, as the start of an in-depth
study to be expanded in the future on the basis of this preliminary study.
Methods: We conducted a pilot survey in Jordanian hospitals to collect data
from healthcare professionals who had experience using EHR systems. A valid
sample of 22 responses from 70 questionnaires was collected for data analysis
in the pilot test. |
Keywords: |
EHRs, Healthcare Professionals, UTAUT1, UTAUT2, Acceptance, Trust Factors |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
AN INVESTIGATION ON THE VIABILITY OF USING IOT FOR STUDENT SAFETY AND ATTENDANCE
MONITORING IN IRAQI PRIMARY SCHOOLS |
Author: |
AMMAR KHALEEL, SALMAN YUSSOF |
Abstract: |
In Iraq, many student abduction cases are reported due to the lack of safety
mechanisms and of law enforcement. Educational institutions such as primary
schools are looking for a better mechanism to monitor student attendance so
that the safety of the students can be better ensured. Currently, student
attendance in schools is taken in a traditional manner: teachers manually
check and record the attendance of the students in their class. This
traditional method has many drawbacks; for example, attendance can only be
taken at certain intervals, so students cannot be monitored in real time. The
main aim of this study is to investigate the viability of using an Internet of
Things (IoT) approach to monitor student attendance and their presence in the
school compound in real time in order to ensure their safety. A quantitative
data collection using a questionnaire, with 113 working staff from Iraqi
primary schools, was conducted to identify the current challenges of student
monitoring and the viability of using IoT to address them. The analysis of the
questionnaire results shows that the use of IoT could improve the safety
environment of primary schools by enabling accurate, real-time monitoring of
student attendance. |
Keywords: |
IoT, Primary Schools, Students, Attendance Monitoring, Safety. |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
ENTERPRISE RESOURCE PLANNING ADOPTION LIFECYCLE: A SYSTEMATIC LITERATURE REVIEW |
Author: |
MARINA HASSAN, MARZANAH A. JABAR, FATIMAH SIDI, YUSMADI YAH JUSOH, SALFARINA
HASSAN |
Abstract: |
An Enterprise Resource Planning (ERP) system is a tool for managing business
functions across operational, tactical, and strategic management, emerging
from the field of Information Systems (IS). Awareness of ERP technology is
developing slowly among industrial and non-industrial people. The need for ERP
software is often not properly discussed within organizations; some do not
even know why they need ERP in their industry, and shallow knowledge of ERP
leads to failed ERP adoptions. This paper was conducted following systematic
literature review methods. A literature search related to ERP methodology
revealed 345 papers; after further reading, 133 papers were identified as key
papers for the chosen topics. To obtain a detailed understanding of the ERP
lifecycle, we identified 19 ERP lifecycle papers in the literature from 2009
to 2015. We tabulated ERP needs, objectives, and the factors affecting the
success of ERP adoption in order to capture the expected usage of ERP. A
conceptual model of the ERP lifecycle will be developed to explain the stages
of choosing quality ERP software based on organizational objectives. The paper
details the stages in the ERP adoption lifecycle so that users will know
specifically what they need to do before and during ERP implementation. This
paper is expected to raise awareness of the ERP lifecycle model itself and to
show that following it can lead to successful ERP adoption in developing a
business. |
Keywords: |
ERP models, ERP success factors, ERP adoption |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
PATTERN BASED EVALUATION FOR EXTRACTING PERSONALIZED PROFILES |
Author: |
P. SATHVIK NAGA SAI, DR. P. SIVA KUMAR |
Abstract: |
In the client-server model, we present a detailed architecture and design
implementation of PMSE. In this design, the client collects and stores the
search data locally to preserve privacy, while heavy tasks such as concept
extraction, training, and re-ranking are performed on the PMSE server. PMSE
significantly improves accuracy compared to the baseline. Any system presented
for enhancing efficiency is examined with reference to the patterns, including
travel patterns, that it accesses. In this paper, we propose CPHC
(Classification by Pattern-based Hierarchical Clustering), a semi-supervised
classification algorithm that uses a pattern-based cluster hierarchy as a
direct means of classification. All training and test instances are first
clustered together using an instance-driven, pattern-based hierarchical
clustering algorithm that allows each instance to "vote" for its
representative size-2 patterns in a way that balances local pattern
significance and global pattern interestingness. These patterns form the
initial clusters, and the rest of the cluster hierarchy is obtained through a
distinctive iterative cluster-refinement process that exploits local
information. The resulting cluster hierarchy is then used directly to classify
test instances, eliminating the need to train a classifier on an augmented
training set. Our experimental results show efficient training for each query
enhancement in the training data set. |
Keywords: |
PMSE, CPHC, Cluster Hierarchy, Cluster Refinement, Semi-Supervised
Classification |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
STRENGTHENING USER AUTHENTICATION FOR BETTER PROTECTION OF MOBILE APPLICATION
SYSTEMS |
Author: |
KARTINI MOHAMED, FATIMAH SIDI, MARZANAH A. JABAR, ISKANDAR ISHAK, NORAHANA
SALIMIN, NOR SAFWAN AMIRUL SALLEH, ABDUL QAIYUM HAMZAH, AHMAD DAHARI JARNO,
MUHAMAD FAEEZ PAUZI |
Abstract: |
For most of us now, life is incomplete without a mobile phone, because mobile
phones have become a necessity for many people. Statistics show that more than
seven billion people in the world had these devices in 2015, meaning that
roughly 97% of the world's population are mobile phone users. More than 50% of
mobile phone users use smartphones, which are capable of downloading many
mobile application systems (apps). More than 200 million apps were estimated
to have been downloaded in 2007, and this number is believed to be growing.
Unfortunately, many of these apps involve the transfer of important and
confidential personal data or business information. How can we ensure this
sensitive information is well protected from being stolen or misused by
unauthorized parties? One way to secure this communication is to properly
control access to the system by strengthening user authentication. This paper
therefore focuses on one technique to enhance the protection of mobile apps
against intrusion by unpermitted users. The enhancement improves the
multi-factor elements and the text-ciphering technique of user authentication:
in this study, a random number and the time are added to the existing
text-based multi-factor user authentication, and encryption and hashing are
used as the text-ciphering technique to improve protection. To measure how
secure the proposed enhancement is, an independent testing body was appointed
to perform a Vulnerability Test and a Functionality Test on the apps; if all
these tests pass, the proposed enhancement can be said to be strong enough to
protect the apps from intrusion. Based on the test results provided by the
testing body, CyberSecurity Malaysia, the apps passed all Vulnerability and
Functionality Tests. This shows that access control for these apps is strong
and able to prevent access by unpermitted users, meaning that the proposed
enhancement gives better protection and ensures the mobile apps cannot easily
be broken into by unauthorized mobile phone users. |
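The paper's exact scheme is not given in the abstract; one plausible sketch of "a random number and the time added, with encryption and hash" is an HMAC token computed over the credentials plus a nonce and timestamp. All names and the message layout below are illustrative assumptions, not the paper's protocol:

```python
import hashlib
import hmac
import secrets
import time

def make_auth_token(username, password, secret_key):
    """Client side (hypothetical): mix a fresh random nonce and the current
    time into the credentials, then HMAC-hash, so every login token differs
    even for the same user and password."""
    nonce = secrets.token_hex(16)
    timestamp = str(int(time.time()))
    msg = f"{username}:{password}:{nonce}:{timestamp}".encode()
    digest = hmac.new(secret_key, msg, hashlib.sha256).hexdigest()
    return nonce, timestamp, digest

def verify_auth_token(username, password, secret_key, nonce, timestamp,
                      digest, max_age=60):
    """Server side: reject stale timestamps, recompute the HMAC, and compare
    in constant time so a replayed or forged token fails."""
    if int(time.time()) - int(timestamp) > max_age:
        return False
    msg = f"{username}:{password}:{nonce}:{timestamp}".encode()
    expected = hmac.new(secret_key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, digest)
```

The nonce defeats token reuse and the timestamp bounds the replay window, which is the security benefit the abstract attributes to adding a random number and the time.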
Keywords: |
User authentication, Data Protection, Data Transmission, Wireless Communication,
Mobile Application Systems |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
Title: |
DESIGN OF A MONITOR FOR DETECTING MONEY LAUNDERING AND TERRORIST FINANCING |
Author: |
TAMER HOSSAM HELMY, MOHAMED ZAKI, TAREK SALAH, KHALED BADRAN |
Abstract: |
Money laundering is a global problem that affects all countries to various
degrees. Although many countries benefit from money laundering by accepting
the laundered money while keeping the crime abroad, in the long run "money
laundering attracts crime": criminals come to know a country, create networks,
and eventually locate their criminal activities there. Most financial
institutions have been implementing anti-money-laundering (AML) solutions to
fight financial fraud. The key pillar of a strong anti-money-laundering system
for any financial institution is a well-designed and effective monitoring
system, whose main purpose is to identify potential suspicious behaviors
embedded in legitimate transactions. This paper presents a monitoring
framework that uses various techniques to enhance monitoring capabilities; the
framework relies on rule-based monitoring, behavior-detection monitoring,
cluster monitoring, and link-analysis-based monitoring. The monitor's
detection processes are based on money laundering deterministic finite
automata obtained from their corresponding regular expressions. |
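The paper's automata are not reproduced in the abstract. As an illustration of deriving a monitor from a regular expression over transaction events, the hypothetical pattern "deposit (transfer)+ withdraw" can be compiled by hand into a small DFA table (in a real system the table would be generated from the regex):

```python
def make_dfa(transitions, start, accepting):
    """Build an event-sequence matcher from a DFA transition table.
    `transitions` maps (state, event) -> next state; an event with no
    entry rejects the sequence immediately."""
    def matches(events):
        state = start
        for e in events:
            state = transitions.get((state, e))
            if state is None:
                return False
        return state in accepting
    return matches

# Hypothetical suspicious pattern: a deposit, one or more transfers,
# then a withdrawal, i.e. the regex  deposit (transfer)+ withdraw .
suspicious = make_dfa(
    transitions={
        (0, "deposit"): 1,
        (1, "transfer"): 2,
        (2, "transfer"): 2,
        (2, "withdraw"): 3,
    },
    start=0,
    accepting={3},
)
```

A transaction stream that matches such an automaton would be flagged for review; the event alphabet and the pattern here are purely illustrative.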
Keywords: |
Anti Money Laundering system, Money laundering monitoring and detecting, Cycle
detection monitoring, Suspected Link monitoring |
Source: |
Journal of Theoretical and Applied Information Technology
31st March 2016 -- Vol. 85. No. 3 -- 2016 |
Full Text |
|
|
|