|
Submit Paper / Call for Papers
The journal receives papers in a continuous flow and will consider articles
from a wide range of Information Technology disciplines, encompassing the most
basic research to the most innovative technologies. Please submit your papers
electronically to our submission system at http://jatit.org/submit_paper.php in
MS Word, PDF or a compatible format so that they may be evaluated for
publication in the upcoming issue. This journal uses a blinded review process;
please include all personally identifiable information in the manuscript when
submitting it for review, and we will remove the necessary information on our
side. Submissions to JATIT should be full research / review papers (properly
indicated below the main title).
|
|
|
Journal of Theoretical and Applied Information Technology
January 2016 | Vol. 83 No.2 |
Title: |
A NOVEL FILTER BASED PARTITIONING DECISION TREE MODEL FOR REAL-TIME NETWORK
SECURITY |
Author: |
SITA RAMA MURTY PILLA, R KIRAN KUMAR, M SAILAJA |
Abstract: |
Due to the exponential rise in network attacks and the rapid development of
software tools and techniques for intrusion detection, rule-based intrusion
detection systems have become an essential solution for real-time anomaly
detection. Traditional data-mining-based intrusion detection methods generate a
large set of predefined patterns, most of which have high false rates and are
inaccurate. Detection of real-time network attacks needs to be optimized
because attack types, instance sets and attributes keep changing. To address
the issues of high false rates and dynamic data integration, a new anomaly
detection system based on a data mining model is proposed to find real-time
DoS/DDoS patterns by integrating network packets captured from different
systems on the network with the KDD99 dataset. The system generates intrusion
patterns by integrating predefined attacks and new attacks as early as possible
with a low false rate. Experimental results show that the proposed dynamic
model identifies real-time true positive patterns with higher accuracy than
traditional models. |
Keywords: |
DDOS, Attribute Selection, Decision Tree, Intrusion Detection, Kddcup 99
Intrusion Dataset |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
EDWARD SNOWDEN DISCLOSURES TURN THE FEARS OF SURVEILLANCE INTO REALITY: THE
IMPACT AND TRANSFORMATION IN INFORMATION SECURITY |
Author: |
FATIMETOU ZAHRA MOHAMED MAHMOUD, AKRAM M ZEKI |
Abstract: |
More than two years have passed since the biggest event in information security
and privacy: the disclosure of highly sensitive documents about the National
Security Agency (NSA) in the United States. Those disclosures had considerable
resonance and impact, sparking debate between people who supported the act and
others who consider it a crime and a breach of trust. The impact was huge not
only for information technology but also extended to the economy and politics.
This paper provides a holistic view and analysis of the current state of
information technology security and privacy, especially given the limited
research that has studied the transformation in information security
strategies, policies and law after the Edward Snowden disclosures, and how
those transformations have affected technology, business and politics in
various countries. The paper describes in detail how the disclosures made by
Edward Snowden happened, the most dangerous programs used by the NSA to violate
people's privacy, the reaction of countries such as the USA and Canada to the
revelations, and the resulting changes in policies, strategies and law
regarding information security and privacy. It further discusses how the
revelations complicated the relationship between China and the USA, and how
many countries now focus on developing laws to protect their citizens' privacy
and security from foreign surveillance. Finally, it examines the lessons the
revelations taught the NSA about strengthening its internal protection, and the
lessons they taught companies about not entrusting their business data to any
internet company and about improving their own security systems. |
Keywords: |
Edward Snowden, NSA, Information security, PRISM, Business companies |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
HOW THE SMART CARD MAKES CERTIFICATE VERIFICATION EASY |
Author: |
L. JAGAJEEVAN RAO, M. VENKATA RAO, T. VIJAYA SARADHI |
Abstract: |
The main idea of this paper is to create a fraud-free, paperless environment
using well-established smart card technology, which can not only reduce the
effort of maintaining certificates but also drive technological innovation in
the educational field. In this paper, we first cover the basics of the smart
card and then apply them to implement our proposed system. To migrate the
existing system to the proposed one, we follow several phases: requirements
gathering, design, analysis and implementation. Introducing smart cards into
the educational field can prevent fraudulent and miscellaneous certificates.
These pocket-sized cards are unique and authenticated for each individual, easy
to carry and maintenance-free. |
Keywords: |
Authentication, Smart card Technology, Application protocol data unit (APDU),
Public Key Infrastructure (PKI), Certificate Management Protocol (CMP), unique
Certificate Identification Number(UCIN) |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
PERFORMANCE EVALUATION OF ENERGY-EFFICIENT CLUSTERING ALGORITHMS IN WIRELESS
SENSOR NETWORKS |
Author: |
AZIZ MAHBOUB, MOUNIR ARIOUA, EL MOKHTAR EN-NAIMI, IMAD EZ-ZAZI |
Abstract: |
One of the wireless sensor network issues that heavily affects the network's
lifetime is the energy limitation. Minimizing energy dissipation and maximizing
network lifetime are important challenges in the design of routing protocols
for sensor networks. This is why many current works address WSN energy
management, taking into account communications and data routing algorithms. The
clustering approach is one of the techniques used to minimize energy
consumption and extend the system's lifetime. In this paper, we present a
performance study of the LEACH, SEP and DEEC protocols through a solid
comparison of key performance parameters of wireless sensor networks, such as
the instability and stability periods, the network lifetime, the number of
cluster heads per round and the number of alive nodes. We evaluate the
technical capability of each protocol with respect to the studied parameters. |
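For readers unfamiliar with LEACH's round-based cluster-head election: each node self-elects as head when a uniform random draw falls below a threshold that grows over the epoch, so every node serves once per epoch. A small sketch of that standard threshold formula (illustrative only, not code from the paper):

```python
def leach_threshold(p, r):
    """LEACH election threshold T(n) for round r: a node that has not yet
    served as cluster head in the current epoch self-elects when a uniform
    draw in [0, 1) falls below T(n). p is the desired head fraction."""
    return p / (1 - p * (r % int(1 / p)))

# With p = 0.1 the epoch is 10 rounds; the threshold climbs to 1.0 by the
# last round, guaranteeing every remaining node its turn as head.
thresholds = [round(leach_threshold(0.1, r), 3) for r in range(10)]
```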
Keywords: |
Wireless Sensor Network; Energy-efficiency; Clustering protocols; LEACH; DEEC;
SEP. |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
DATA INTEGRATION SYSTEM FOR RDF DATA SOURCES |
Author: |
YASSINE LAADIDI, MOHAMMED BAHAJ |
Abstract: |
Decision-making systems aim to transform the data stream circulating through an
organization into relevant information and knowledge published in the form of
dashboards and reports. The Semantic Web (SW) is full of data sources
serialized in various formats (e.g. RDF, OWL, XML), created either from scratch
or by transforming other existing sources (e.g. relational databases), and it
has therefore become one of the major data sources that can be used to fulfill
the analysis needs of a decision-making system. The OWL 2 ontology language, a
W3C recommendation, is built on the RDF data model and provides the means for
defining and creating structured web ontologies. The purpose of this paper is
to propose a new system architecture for a data integration process that
populates an existing data warehouse using linked data (i.e. data from the
Semantic Web) as sources. |
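The "extract" step of such an ETL pipeline over linked data can be pictured as pulling one measure out of subject-predicate-object triples and keying it by subject, ready for a warehouse fact table. A toy sketch (the prefixes, predicates and values are hypothetical; a real system would query SPARQL endpoints with a library such as rdflib):

```python
# Toy linked-data triples: (subject, predicate, object literal as string)
triples = [
    ("ex:order1", "ex:amount",   "250"),
    ("ex:order1", "ex:customer", "ex:cust7"),
    ("ex:order2", "ex:amount",   "90"),
    ("ex:order2", "ex:customer", "ex:cust7"),
]

def extract_measure(triples, predicate):
    """ETL 'extract' step: pull one numeric measure out of an RDF-style
    graph and key it by subject, ready to load into a fact table."""
    return {s: float(o) for s, p, o in triples if p == predicate}

facts = extract_measure(triples, "ex:amount")
```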
Keywords: |
Data Integration, ETL, OWL, RDF, Semantic Web, Ontology, Decision-making System,
Data Warehouse, Data Sources, Linked Data |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
ENERGY CONSUMPTION IN FAILURE OF BUSINESS PROCESS SERVICES |
Author: |
WARTIKA, HUSNI SASTRAMIHARDJA, KRIDANTO SURENDRO, IPING SUPRIANA S. |
Abstract: |
Business processes are an integral part of modern organizations. Linkages and
interactions between resources and business processes can have a significant
effect on energy use. Because of environmental regulation policies, an
organization needs to know and evaluate the performance of its business process
services so that it can minimize energy consumption. These services can fail,
and failures can increase the intensity of resource utilization. This paper
uses a case study design; the research methods are experiment and interview. By
knowing how energy consumption grows when a business process service fails, an
organization can minimize its environmental impact by running a recovery
process on the failure. |
Keywords: |
Service, Business Process, Failure, Energy, Consumption |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
MULTI STAGE PHISHING EMAIL CLASSIFICATION |
Author: |
AMMAR YAHYA DAEEF, R. BADLISHAH AHMAD, YASMIN YACOB, NAIMAH YAAKOB, KU NURUL
FAZIRA KU AZIR |
Abstract: |
The risk of phishing emails is increasing steadily; they pose a real threat to
computer users and organizations and lead to significant financial losses.
Fighting zero-day phishing emails using content-based server-side classifiers
is considered the best method to detect such attacks. This technique, based on
machine learning algorithms, is trained on a set of phishing email features,
and the statistical classifier is then applied to the email stream to determine
the class of each freshly received email. The false positive rate (FPR) and
false negative rate (FNR) are critical factors for these classifiers and should
be as small as possible to increase their overall accuracy. Using the available
ham and phishing datasets, this paper focuses on reducing the FPR and FNR and
increasing the overall accuracy of the proposed classification system. The
multi-stage phishing email detection system (MSPEDS) shows very promising
results compared with previous work in terms of FPR, FNR and accuracy. |
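A multi-stage classifier of this kind typically scores cheap features first and reserves expensive content analysis for borderline mail. The sketch below illustrates only the staging idea; the features, weights and thresholds are invented and this is not MSPEDS:

```python
import re

def url_features(email_body):
    """Stage 1: lightweight URL cues, cheap enough to run on every email."""
    links = re.findall(r"https?://[^\s\"'>]+", email_body)
    return {
        "n_links": len(links),
        "has_ip_host": any(re.match(r"https?://\d+\.\d+\.\d+\.\d+", u)
                           for u in links),
        "has_at_sign": any("@" in u for u in links),
    }

def classify(email_body):
    """Score the cheap cues; confident mail is labelled immediately, while
    borderline mail would fall through to a heavier content-based stage
    (not shown here). Weights and thresholds are invented."""
    f = url_features(email_body)
    score = (0.6 * f["has_ip_host"] + 0.3 * f["has_at_sign"]
             + 0.05 * min(f["n_links"], 4))
    if score >= 0.5:
        return "phishing"
    return "suspect" if score >= 0.2 else "ham"
```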
Keywords: |
Phishing, Emails, Email Features, Machine Learning, Classifiers |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
TOWARD A NEW TREATMENT APPROACH OF LEARNING CONTENT IN CLOUD ERA |
Author: |
B. FAQIHI, N. DAOUDI, R. AJHOUN |
Abstract: |
Nowadays, the technological development of human beings is intimately tied to
information, as shown by the ways and areas in which life has been
computerized: trade, government services, medicine, education, learning, etc.
Nevertheless, the fast pace of information system design has created several
sub-systems in multiple contexts, conceived by different communities and
geographically dispersed, yet all covering the same area. Neither the contents
nor the services these subsystems produce necessarily live in the same
technology environments. In our research, the learning field goes through many
key steps. New, revolutionary practices have been implemented thanks to
technological innovation, so the transition from classical learning to distance
learning (d-learning) is more than possible; it is desired. This phenomenon has
created more opportunities for learners and teachers, but also several
challenges: in many cases, the multitude of standards hinders a learner's
migration from one learning environment to another and thus hampers their
learning development. In this paper, we propose a three-level interoperability
framework. Since we are interested in the semantic level, we propose a process
for interoperability of learning content in the cloud era based on a global
ontology. As in recommendation systems, our process starts with acquisition,
then validation, and finally structuring of the learning content. This
structuring gives both actors of the learning environment a certain flexibility
and access to other resources in the cloud environment. The basic principle is
to collect content, enrich it and make it interoperable using a unified
star-based approach built on a comprehensive ontology. Our work is part of the
MADAR project (Learning Architecture Adapted to Mobile Technology). |
Keywords: |
Learning Content, Interoperability, Semantic Interoperability, Structuring,
Ontology, MADAR Learning |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
RECONSTRUCTION OF THE HUMAN RETINAL BLOOD VESSELS BY FRACTAL INTERPOLATION |
Author: |
H. GUEDRI, J. MALEK, H. BELMABROUK |
Abstract: |
This paper presents a fractal interpolation method for human retina images.
First, we focus on segmenting the image with skeletonization and identifying
the different types of pixels. Second, we use the Douglas-Peucker algorithm to
reduce the number of pixels in the image while keeping a shape close to the
original. Then, we use fractal interpolation (IFS) to decompress the encoded
image. The image quality of the methodology is evaluated using the peak
signal-to-noise ratio (PSNR). The results obtained show that the
Douglas-Peucker method reduces the size of the image by 92 to 96 percent, and
the PSNR values of the fractal interpolation are between 27 and 36.9 dB. We
conclude that fractal interpolation can yield a better-quality image. |
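The Douglas-Peucker step mentioned above drops polyline points whose distance to the chord between the endpoints falls below a tolerance, recursing around the farthest point otherwise. A compact reference implementation of the standard algorithm (illustrative, not the paper's code):

```python
import math

def perpendicular_distance(pt, start, end):
    """Distance from pt to the infinite line through start and end."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, eps):
    """Keep the farthest point if it exceeds eps and recurse on both halves;
    otherwise collapse the whole run to its two endpoints."""
    if len(points) < 3:
        return points
    dists = [perpendicular_distance(p, points[0], points[-1])
             for p in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] > eps:
        left = douglas_peucker(points[:i + 1], eps)
        right = douglas_peucker(points[i:], eps)
        return left[:-1] + right
    return [points[0], points[-1]]

# A nearly straight vessel-like segment collapses to its endpoints
line = [(0, 0), (1, 0.05), (2, -0.04), (3, 0.02), (4, 0)]
simplified = douglas_peucker(line, eps=0.1)
```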
Keywords: |
Douglas-Peuker Algorithm, Fractal Interpolation, Retinal Blood Vessel Image |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
APPLICABILITY OF AN ADAPTIVE HANDOVER MECHANISM ON PRACTICAL 4G NETWORKS |
Author: |
BHASKAR S, DR G A E SATISH KUMAR, DR P RAMANA REDDY |
Abstract: |
Fourth generation (4G) and beyond networks define the future of mobile
communication. Handover management in 4G networks based on the IPv6 core
remains an open problem, and co-channel interference in wireless channels
affects network and handover performance. In this paper, the proposed Noise
Resilient Reduced Registration Time Care-of Mobile IP protocol is modelled
under practical network conditions. A threshold-based handover decision-making
algorithm is incorporated into the proposed protocol. The results presented in
this paper clearly show the effects of co-channel interference on handover
management and network performance in the presence of multiple users. The
comparative study proves that the proposed protocol is robust and minimizes
network performance degradation under practical conditions. |
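A threshold-based handover decision of the kind described usually pairs a received-signal-strength threshold with a hysteresis margin to damp ping-pong handovers under interference-induced RSS noise. A minimal sketch with invented default values (not the paper's algorithm):

```python
def handover_decision(serving_rss_dbm, target_rss_dbm,
                      threshold_dbm=-85.0, hysteresis_db=3.0):
    """Trigger a handover only when the serving cell drops below the RSS
    threshold AND the target beats the serving cell by a hysteresis margin.
    The margin suppresses ping-pong handovers caused by RSS fluctuations."""
    weak_serving = serving_rss_dbm < threshold_dbm
    better_target = target_rss_dbm > serving_rss_dbm + hysteresis_db
    return weak_serving and better_target
```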
Keywords: |
Mobile Internet Protocol, Ipv6, Handover Latency, Mobility Management, Received
Signal Strength, Mobile Ipv6 (Mipv6), Co-Channel Interference, Handover Request
Rate. |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
INVESTIGATION OF MONEY LAUNDERING METHODS THROUGH CRYPTOCURRENCY |
Author: |
DIANA MERGENOVNA SAT, GRIGORY OLEGOVICH KRYLOV, KIRILL EVGENYEVICH BEZVERBNYI,
ALEXANDER BORISOVICH KASATKIN, IVAN ALEKSANDROVICH KORNEV |
Abstract: |
The main goal of this work is to search for suspicious operations carried out
with cryptocurrency. The tasks set are: creating a database from the collected
information; visualizing the results; and analysing them and drawing
conclusions. The object of the research is money laundering and terrorism
financing by means of cryptocurrency. This is currently a relevant research
topic, as cryptocurrency is a new means of payment and each country decides
differently how to deal with it. New technologies give us new possibilities in
our lives (especially anonymous transactions, such as payments for goods and
other purposes), and of course they can also be used for illegal activity such
as money laundering and the financing of terrorism. Moreover, anonymity is one
of the main features of cryptocurrency that helps hide the source of income.
This is a problem for countries, because they have to combat threats such as
money laundering and terrorism financing. It is therefore natural to look for
ways of detecting suspicious operations that may be directed at money
laundering or the financing of terrorism. |
Keywords: |
Bitcoin, Transaction, Bitcoin Address Of The Recipient (Receiver), Bitcoin
Address Of The Sender (Addresser), Anti-Money Laundering, Combating Financing
Of Terrorism, Financial Monitoring. |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
OPERATING SYSTEM INTEGRITY CHECK FRAMEWORK ALGORITHM FOR THREAT POSED BY
ROOTKITS |
Author: |
DAVID MUGENDI, PROF. WAWERU MWANGI (PHD), DR. MICHAEL KIMWELE |
Abstract: |
Kernel-mode rootkits (KMRs) have gained considerable success in the blackhat
community, raising much alarm among systems and system defenders. The danger
posed by these rootkits has, to some extent, led to calls for universal
attention to the means of handling and dealing with them. Rootkits have become
far more complicated and stealthy, making it difficult to even detect their
presence in a system using conventional methods. Bearing in mind the danger
these rootkits pose to the operating system and to other computer systems at
large, getting crucial information from an already compromised system proves to
be an uphill task. This work focuses on addressing that problem. It examines
techniques such as an intelligent algorithm using neural network technology to
enable integrity checking against kernel-mode rootkits. The research also
describes, to some extent, operating systems such as the Linux kernel and some
of the areas that are a common target of kernel rootkits. Virtualization
technology is also introduced to help readers understand some of the critical
concepts. A number of requirements to be satisfied while addressing this issue
are outlined, and a framework implementing the model is set up to show how the
integrity check was achieved at the end of the research. |
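One common way to realize such an integrity check, independent of the neural-network layer the abstract describes, is to hash a trusted baseline of kernel entry points and flag entries whose hashes later diverge. An illustrative sketch (the "syscall table" and addresses below are fabricated):

```python
import hashlib

def snapshot(table):
    """Trusted baseline: hash each entry-point name/address pair."""
    return {name: hashlib.sha256(f"{name}:{addr}".encode()).hexdigest()
            for name, addr in table.items()}

def tampered_entries(baseline, table):
    """Entries whose hash no longer matches the baseline: the kind of
    hook a kernel-mode rootkit installs in a syscall table."""
    current = snapshot(table)
    return [name for name in baseline if current.get(name) != baseline[name]]

# Fabricated 'syscall table' and a simulated rootkit hook on sys_read
clean = {"sys_read": 0xFFFF0001, "sys_write": 0xFFFF0002}
baseline = snapshot(clean)
hooked = dict(clean, sys_read=0xDEADBEEF)
```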
Keywords: |
Artificial Neural Network (ANN), Loadable kernel module (LKM), common object
file format (COFF), Kernel mode rootkits, (KMR), probability mass functions
(PMFs) |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
PERFORMANCE MEASUREMENT OF PROJECT MANAGEMENT BY USING FANP BALANCED SCORECARD |
Author: |
HERMAWAN, AHMAD FAUZI, MOHAMMAD ANSHARI |
Abstract: |
Project management (PM) is the application of the knowledge, processes, tools
and techniques needed to manage business processes in order to create a unique
product or service. PM involves complex business processes, and monitoring the
performance of all PM activity completely, while keeping it aligned with
business strategy, requires high effort. For this reason, we propose a method
to measure project performance using the Balanced Scorecard (BSC) strategy,
where the key performance sources of the BSC are produced by Project
Integration Management (PIM). PIM is the PM domain that consolidates the
activities of the other domains: cost, time, procurement, communication and
risk management. In order to compute Key Performance Indicators (KPIs) in the
various perspective quadrants of the BSC, we need a computation method to
measure the priority and weighting value of PIM performance; in this study we
use Fuzzy Analytic Network Processing (FANP) to measure qualitative and
quantitative values by comparing the weighting priorities of the various KPIs
in the BSC. In the testing phase, we use data from the SAP Rollout Project at
PT. Semen Indonesia Tbk., from which we obtain the weighting priorities in the
various BSC perspective quadrants to support a decision support system for this
project. This study contributes a strategy for assessing KPIs in the BSC
perspectives of PM based on the Project Management Body of Knowledge (PMBOK),
and a method for applying FANP to measure the weighting priority of these KPIs
by modifying the input method from quantitative values to qualitative grades. |
Keywords: |
Project Management (PM), Project Integration Management (PIM), Balanced
Scorecard (BSC), Fuzzy Analytic Network Process (FANP), Decision Support System
(DSS) |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
DESIGN AND IMPLEMENTATION OF EMBEDDED THIN CLIENT FOR VNC |
Author: |
SAMAN FATAHPOUR |
Abstract: |
In this paper, a general method to design and implement a thin client for VNC
(Virtual Network Computing) is presented. To connect the device to the network
and a monitor, a 16-bit network interface called WIZ830MJ is used. Moreover, a
solution for a 24-bit graphics interface, using the SSD1963 chip and a DAC
(Digital-to-Analog Converter) known as the ADV7125, is introduced, which
provides users with a resolution of up to 800x480 pixels. To improve the
device's efficiency, translator software is placed between the VNC server and
client, which allows the FastLZ compression algorithm to be applied. By
modifying the RFB (Remote Frame Buffer) protocol, the designed translator
avoids problems such as "TCP Window Full" and "TCP Retransmission" and reduces
network bandwidth consumption. The device needs neither a fast processor nor an
operating system; it performs well using an ARM Cortex-M4 processor running at
100 MHz. This matters because VNC software normally needs an operating system
like Linux, which itself requires a more powerful processor and consumes more
energy than an embedded device. Finally, the designed device is tested in a
laboratory and the results are presented. |
Keywords: |
Thin Client, Embedded Device, VNC, RFB Protocol, Virtual Network Computing |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
OPTICAL CHARACTER RECOGNITION TECHNIQUE ALGORITHMS |
Author: |
N. VENKATA RAO, DR. A.S.C.S.SASTRY, A.S.N.CHAKRAVARTHY, KALYANCHAKRAVARTHI P |
Abstract: |
In this paper, we present a new neural network (NN) based method for optical
character recognition (OCR) as well as handwritten character recognition (HCR).
Experimental results show that our proposed method achieves increased accuracy
in both optical and handwritten character recognition. We also present an
overview of existing handwritten character recognition techniques, each
algorithm described more or less on its own. Handwritten character recognition
is a very popular and computationally expensive task, so we describe advanced
approaches for it. In the present work, we compare the most important of the
many advanced existing techniques and systematize them by their characteristic
considerations, showing where the behaviour of the algorithms reaches the
expected similarities. |
Keywords: |
OCR, HCR, Neural Network, Recognition Technique |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
IMPROVING MULTILEVEL THRESHOLDING ALGORITHM USING ULTRAFUZZINESS OPTIMIZATION
BASED ON TYPE-II GAUSSIAN FUZZY SETS |
Author: |
SHOFWATUL UYUN |
Abstract: |
Image thresholding is one of the image processing techniques that supports the
subsequent analysis phase; consequently, choosing a precise method for this
step is essential. Image blur and bad illumination are common constraints that
often reduce the effectiveness of a thresholding method. Fuzzy sets are one of
several frameworks for scoring an image, and various fuzzy thresholding
techniques have been developed to eliminate those constraints. This paper
proposes an improvement of multilevel thresholding techniques using type-II
fuzzy sets with a Gaussian membership function to locate objects in mammograms,
in particular fibroglandular tissue areas. The results show that the proposed
technique performs very well, with an average score of 97.86% on the
misclassification error parameter. This proves that the proposed algorithm
functions well on images with low contrast and high blur. |
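The ultrafuzziness criterion behind such type-II methods scores a candidate threshold by the histogram-weighted width of the band between upper and lower memberships derived from a Gaussian primary membership. A toy sketch (the histogram, sigma and alpha values are invented; this is not the paper's algorithm verbatim):

```python
import math

def ultrafuzziness(hist, t, sigma=30.0, alpha=2.0):
    """Histogram-weighted area between the upper (mu**(1/alpha)) and lower
    (mu**alpha) type-II memberships of a Gaussian centred on threshold t.
    Multilevel schemes maximise this score for each threshold in turn."""
    n = sum(hist)
    score = 0.0
    for g, h in enumerate(hist):
        mu = math.exp(-((g - t) ** 2) / (2 * sigma ** 2))
        score += h * (mu ** (1 / alpha) - mu ** alpha)
    return score / n

# Toy bimodal grey-level histogram: spikes near 50-60 and 190-200
hist = [0] * 256
for g, h in [(50, 100), (60, 80), (190, 90), (200, 100)]:
    hist[g] = h

# Pick the threshold that maximises ultrafuzziness
best = max(range(256), key=lambda t: ultrafuzziness(hist, t))
```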
Keywords: |
Multilevel Thresholding, Ultrafuzziness, Fuzzy Sets, Type II, Gaussian |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
CONCEPTUAL SIMILARITY MEASURE |
Author: |
LABRIJI AMINE, ABDELBAKI ISSAM, REDDAHI NABIL, ABDELOUHED NAMIR, ABOUDOU
ABDERRAOUF |
Abstract: |
Similarity is a problem that has been the subject of several research projects,
particularly in the field of semantic information retrieval. The latter relies
on ontologies for modeling knowledge, so a similarity measure over ontology
concepts is necessary in the main phases of information retrieval (indexing,
weighting, searching, etc.).
In this article we present an approach for computing concept similarity based
on the arcs of the ontology. We compared the similarity values obtained with
those of the most widely used approaches, namely the measure of Wu and Palmer
and the measure of Rada et al. The comparison shows that our measure is
beneficial and provides a solution to the limitations of the existing
approaches. |
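The Wu and Palmer baseline the authors compare against is easy to state: similarity is twice the depth of the two concepts' least common subsumer, divided by the sum of their depths. A small sketch over a hypothetical mini-ontology (depth conventions vary between implementations; here the root has depth 0):

```python
def ancestors(taxonomy, node):
    """Path from node up to the root, inclusive."""
    chain = [node]
    while taxonomy[node] is not None:
        node = taxonomy[node]
        chain.append(node)
    return chain

def wu_palmer(taxonomy, a, b):
    """Wu-Palmer similarity: 2 * depth(LCS) / (depth(a) + depth(b))."""
    depth = lambda n: len(ancestors(taxonomy, n)) - 1
    b_anc = set(ancestors(taxonomy, b))
    lcs = next(n for n in ancestors(taxonomy, a) if n in b_anc)
    return 2 * depth(lcs) / (depth(a) + depth(b))

# Hypothetical mini-ontology: concept -> parent (root maps to None)
taxonomy = {"entity": None, "vehicle": "entity", "car": "vehicle",
            "bicycle": "vehicle", "animal": "entity", "dog": "animal"}
```

With this taxonomy, `wu_palmer(taxonomy, "car", "bicycle")` gives 0.5 via their common subsumer "vehicle".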
Keywords: |
Ontologies, Semantic Web, Arcs Distances, Similarity Measure, Information
Retrieval. |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
DENIAL OF SERVICE LOG ANALYSIS USING DENSITY K-MEANS METHOD |
Author: |
ARDYMULYA ISWARDANI, IMAM RIADI |
Abstract: |
Denial of service attacks launched by flooding data at an FTP server make the
server unable to handle requests from legitimate users. One technique for
detecting these attacks is monitoring, but it raises several problems,
including the difficulty of distinguishing attack traffic from normal data
traffic. Field studies in triage forensics are therefore needed to obtain, at
the scene, the vital information that supports the overall digital forensics
investigation. Triage forensics begins with the log databases, which are then
grouped using the density k-means algorithm into three danger levels (low,
medium and high).
The proposed density k-means algorithm uses three groups that represent the
danger levels. The minimum, median and maximum values of the dataset serve as
initial centroids; each data point joins the centroid at minimum distance to
form a cluster. The clustered data are then evaluated for density around their
centroids using the Davies-Bouldin index (DBI).
Clustering the dataset produced three clusters, but only two danger levels were
successfully identified, namely medium and high. The DBI value obtained was
0.082, which indicates that the data used are homogeneous; the DBI result is
also influenced by the choice of the initial centroid values at the start of
the clustering process. |
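The seeding described above (minimum, median and maximum as initial centroids, one per danger level) and the Davies-Bouldin evaluation can be sketched in a few lines for one-dimensional traffic rates. The data below are invented; this is an illustration of the standard formulas, not the paper's implementation:

```python
import statistics

def kmeans_1d(data, iters=20):
    """k = 3 clustering of per-source request rates; centroids are seeded
    with the minimum, median and maximum, one per danger level."""
    centroids = [min(data), statistics.median(data), max(data)]
    clusters = [[] for _ in centroids]
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for x in data:
            i = min(range(len(centroids)), key=lambda j: abs(x - centroids[j]))
            clusters[i].append(x)
        centroids = [statistics.mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

def davies_bouldin(centroids, clusters):
    """DBI: mean over clusters of the worst (scatter_i + scatter_j) / separation;
    lower values mean tighter, better-separated clusters."""
    scatter = [statistics.mean(abs(x - c) for x in cl) if cl else 0.0
               for c, cl in zip(centroids, clusters)]
    k = len(centroids)
    return sum(
        max((scatter[i] + scatter[j]) / abs(centroids[i] - centroids[j])
            for j in range(k) if j != i)
        for i in range(k)
    ) / k

# Invented packets-per-second readings: quiet hosts, busy hosts, a flood
data = [2, 3, 4, 50, 55, 60, 900, 950, 1000]
centroids, clusters = kmeans_1d(data)
dbi = davies_bouldin(centroids, clusters)
```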
Keywords: |
Clustering, Triage Forensic, Log, Analysis, Density K-Means |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
DYNAMIC PARTIAL PATH-LOSS COMPENSATION-BASED POWER CONTROL TECHNIQUE IN LTE-A
FEMTOCELL NETWORKS |
Author: |
SAWSAN ALI SAAD, MAHAMOD ISMAIL, ROSDIADEE NORDIN |
Abstract: |
Femtocells overlaid onto the macrocell form a two-tier architecture comprising
a mixture of planned and arbitrarily deployed nodes. This increases the data
rate for indoor environments and fulfils the dream of high-speed wireless and
mobile broadband services. However, this potential for significant data rate
growth can be severely diminished by cross- and co-tier interference,
especially in dense femtocell deployment scenarios. In this paper, a dynamic
power control scheme is proposed to mitigate downlink interference in order to
reduce the outage probability of the macro user equipment (MUE) while
maintaining good QoS for the home user equipment (HUE). The femtocell adjusts
its transmit power subject to HUE measurements. The minimum transmit power
level is constrained by the target Signal to Interference and Noise Ratio
(SINR) of the HUE that satisfies its required Quality of Service (QoS).
System-level simulations confirm that the proposed power control scheme reduces
the outage probability of nearby MUEs by up to 23% compared to a fixed power
setting, while maintaining the spectral efficiency of the HUEs. Furthermore,
the transmit power can be reduced by 50%, which yields an effective power
solution for the interference scenario. |
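A power-control update of this general shape backs the femtocell's transmit power off by the HUE's SINR surplus over its QoS target, clamped between the minimum that still meets the target and the hardware maximum. The formula, parameter names and values below are an illustrative sketch, not the paper's scheme:

```python
def femto_tx_power(p_max_dbm, p_min_dbm, sinr_target_db,
                   sinr_measured_db, p_current_dbm):
    """Reduce transmit power by the HUE's SINR surplus over its target
    (raise it when the target is missed), clamped to [p_min, p_max].
    All quantities in dB/dBm; every value here is hypothetical."""
    p_new = p_current_dbm - (sinr_measured_db - sinr_target_db)
    return max(p_min_dbm, min(p_max_dbm, p_new))
```

For example, a femtocell at 15 dBm whose HUE measures 18 dB against a 10 dB target would back off to 7 dBm, cutting the interference seen by nearby MUEs.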
Keywords: |
Co/Cross-Tier Interference, Femtocells, Macrocells, Power Control, Heterogeneous
Network. |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
Title: |
HETEROGENEOUS CROWD-SOURCING AND DATA FUSION MODEL FOR DISASTER MANAGEMENT
SERVICES |
Author: |
GAURAV TRIPATHI, DHANANJAY SINGH |
Abstract: |
Heterogeneous crowdsourcing is a novel field that promises marked improvement
in decision support systems. The world is evolving technologically, and we have
started to treat human beings as specific physical sensors as far as sensing
capabilities are concerned. Many humans are Subject Matter Experts (SMEs); we
need to tap this potential of the human brain and use such people as sensors
for our decision support system. Turning humans into sensors is thus a new way
of obtaining data attributes that can feed a decision support system. With
humans being utilized as sensors, the concept of crowdsourcing has evolved, and
the focus has now shifted to a new paradigm of crowd sensing (C-SENSE) of
events alongside the traditional sensors. This paper presents a conceptual
communication mechanism for heterogeneous crowdsourcing that fuses a
hierarchical IMS (ID-mapping server) with a data fusion model to support
disaster management services hand in hand. |
Keywords: |
Crowd sourcing, Data Fusion, Heterogeneous sensors, ID-Mapping Server |
Source: |
Journal of Theoretical and Applied Information Technology
20th January 2016 -- Vol. 83 No. 2 |
Full Text |
|
|
|