Search results for: data communication security.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8968


7768 The Design of English Materials to Communicate the Identity of Mueang District, Samut Songkram for Ecotourism

Authors: Kitda Praraththajariya

Abstract:

The main purpose of this research was to study how to communicate the identity of the Mueang district, Samut Songkram province, for ecotourism. Qualitative data were collected by studying related materials, exploring the area, and conducting in-depth interviews with three groups of people: three directly responsible officers who were key informants of the district, twenty foreign tourists, and five Thai tourist guides. A content analysis was used to analyze the qualitative data. The two main findings of the study were as follows: 1. The identity of Amphur (district) Mueang, Samut Songkram province: the district is located near the mouth of the Maekong River and serves local people and tourists, offering rest accommodations and restaurants where food and drinks are served, as well as rich mangrove forests and Hoy Lod (razor clams). Don Hoy Lod, characterized by muddy beaches, is a coastal wetland listed as the 1099th Ramsar Site, where the greatest number of Hoy Lod (razor clams) can be seen from March to May each year. 2. The communication of the identity of Amphur Mueang, Samut Songkram province, which the researcher designed and presented in English materials, can be summed up in four items: 1) the history of Amphur Mueang, Samut Songkram province; 2) Wat Phet Samut Worrawihan; 3) the learning sources for ecotourism: Don Hoy Lod and the mangrove forest; and 4) how to preserve Amphur Mueang, Samut Songkram province for ecotourism.

Keywords: Foreign tourists, signified, semiotics, ecotourism.

7767 Hybrid Authentication System Using QR Code with OTP

Authors: Salim Istyaq

Abstract:

The number of Internet users is increasing drastically. People now use different online services provided by banks, colleges/schools, hospitals, online utilities, bill payment and online shopping sites. To access these services, text-based authentication is commonly in use, but text-based schemes suffer from usability and security drawbacks that bring trouble to users. The core element of computational trust is identity. The aim of this paper is to make the system harder for impostors and more reliable for legitimate users by adopting a graphical authentication approach. The proposed scheme encodes the authentication options in a graphical QR format, and an acknowledgment is sent to the user's mobile for final verification. The methodology relies on encrypting the options and on final verification by confirming a set of passphrases for legitimate users; the outcome is only produced once the whole process completes successfully. All processes are chained serially, so the output of the first process is the input of the second, and so on. The system combines recognition-based and pure recall-based techniques. The presented scheme is useful for devices such as PDAs, iPods and phones, which are handier and more convenient to use than traditional desktop computer systems.
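A minimal sketch of the OTP half of such a hybrid flow, assuming a random six-digit code delivered out of band; the `send_to_mobile` stub and phone number are hypothetical placeholders, and the QR step (encoding the login challenge, e.g. with a library such as `qrcode`) is only noted in a comment rather than implemented:

```python
import hmac
import secrets

OTP_DIGITS = 6

def generate_otp() -> str:
    """Generate a random one-time password to be pushed to the user's mobile."""
    return str(secrets.randbelow(10 ** OTP_DIGITS)).zfill(OTP_DIGITS)

def send_to_mobile(phone: str, otp: str) -> None:
    """Stub for the out-of-band delivery step (SMS/push); purely illustrative."""
    print(f"[demo] would send OTP {otp} to {phone}")

def verify_otp(expected: str, submitted: str) -> bool:
    """Constant-time comparison so timing does not leak how many digits matched."""
    return hmac.compare_digest(expected, submitted)

if __name__ == "__main__":
    # Step 1: user passes the graphical/QR challenge (not shown), then an OTP is issued.
    otp = generate_otp()
    send_to_mobile("+00-000-0000", otp)
    # Step 2: the OTP typed by the user is the input of the final verification stage.
    print("authenticated" if verify_otp(otp, otp) else "rejected")
```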

Keywords: Graphical Password, OTP, QR Codes, Recognition based graphical user authentication, usability and security.

7766 Pilot-Assisted Direct-Current Biased Optical Orthogonal Frequency Division Multiplexing Visible Light Communication System

Authors: Ayad A. Abdulkafi, Shahir F. Nawaf, Mohammed K. Hussein, Ibrahim K. Sileh, Fouad A. Abdulkafi

Abstract:

Visible light communication (VLC) is a new approach to optical wireless communication proposed to relieve the congested radio frequency (RF) spectrum. VLC systems are combined with orthogonal frequency division multiplexing (OFDM) to achieve high-rate transmission and high spectral efficiency. In this paper, we investigate pilot-assisted channel estimation for DC-biased optical OFDM (PACE-DCO-OFDM) systems to reduce the effects of distortion on the transmitted signal. Least-squares (LS) and linear minimum mean-squared error (LMMSE) estimators are implemented in MATLAB/Simulink to improve the bit-error-rate (BER) of PACE-DCO-OFDM. Results show that a DCO-OFDM system based on the PACE scheme achieves better BER performance than the conventional system without pilot-assisted channel estimation. Simulation results also show that the proposed PACE-DCO-OFDM based on the LMMSE algorithm estimates the channel more accurately and achieves better BER performance than LS-based PACE-DCO-OFDM and the traditional system without PACE. At the same signal-to-noise ratio (SNR) of 25 dB, the achieved BER is about 5×10⁻⁴ for LMMSE-PACE and 4.2×10⁻³ for LS-PACE, while it is about 2×10⁻¹ for the system without the PACE scheme.
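A minimal numpy sketch of the two estimators on a single all-pilot OFDM symbol, assuming a frequency-domain model, BPSK pilots and an exponential power-delay profile; the parameters are illustrative, not the paper's simulation settings:

```python
import numpy as np

rng = np.random.default_rng(1)
N, L, snr_db = 64, 8, 25                      # subcarriers, channel taps, SNR in dB
snr = 10 ** (snr_db / 10)

# Random multipath channel with an exponential power-delay profile
pdp = np.exp(-np.arange(L) / 3.0); pdp /= pdp.sum()
h = np.sqrt(pdp / 2) * (rng.standard_normal(L) + 1j * rng.standard_normal(L))
H = np.fft.fft(h, N)                          # true frequency response

# Received pilot symbol: Y = H * X + noise (BPSK pilots on every subcarrier)
X = rng.choice([-1.0, 1.0], N)
noise = np.sqrt(1 / (2 * snr)) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
Y = H * X + noise

# Least-squares estimate: simply divide out the known pilots
H_ls = Y / X

# LMMSE estimate: smooth H_ls using the channel correlation across subcarriers,
# R[k, l] = sum_t pdp[t] * exp(-j*2*pi*(k-l)*t/N), assuming known channel statistics
k = np.arange(N)
R = np.array([[np.sum(pdp * np.exp(-2j * np.pi * (ki - li) * np.arange(L) / N))
               for li in k] for ki in k])
H_lmmse = R @ np.linalg.solve(R + np.eye(N) / snr, H_ls)

print("LS    MSE:", np.mean(np.abs(H_ls - H) ** 2))
print("LMMSE MSE:", np.mean(np.abs(H_lmmse - H) ** 2))
```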

Keywords: Channel estimation, OFDM, pilot-assist, VLC.

7765 Speech Encryption and Decryption Using Linear Feedback Shift Register (LFSR)

Authors: Tin Lai Win, Nant Christina Kyaw

Abstract:

This paper considers the problem of cryptanalysis of stream ciphers. Attempts are made to improve existing attacks on stream ciphers and to distinguish the portions of ciphertext obtained by encrypting plaintext in which some parts of the text are random and the rest are non-random. The paper presents a tutorial introduction to symmetric cryptography. The basic information-theoretic and computational properties of classic and modern cryptographic systems are presented, followed by an examination of the application of cryptography to the security of VoIP systems in computer networks using the LFSR algorithm. The implementation is developed in Java 2. The LFSR algorithm is appropriate for the encryption and decryption of online streaming data, e.g. VoIP (voice chatting over IP). The paper implements an encryption module that converts speech signals to ciphertext and a decryption module that converts ciphertext back to speech signals.
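A minimal sketch of keystream encryption with a Fibonacci LFSR, assuming a 16-bit register with the well-known maximal-length tap set (16, 14, 13, 11) and 8-bit speech samples; decryption is the same XOR operation with the same seed:

```python
def lfsr_bits(seed: int, taps=(16, 14, 13, 11), nbits: int = 16):
    """Fibonacci LFSR: yields one keystream bit per step (x^16 + x^14 + x^13 + x^11 + 1)."""
    state = seed & ((1 << nbits) - 1)
    assert state != 0, "the all-zero state is a fixed point"
    while True:
        fb = 0
        for t in taps:                     # feedback bit = XOR of the tapped bits
            fb ^= (state >> (t - 1)) & 1
        out = state & 1
        state = (state >> 1) | (fb << (nbits - 1))
        yield out

def crypt(samples: bytes, seed: int) -> bytes:
    """XOR each 8-bit speech sample with 8 keystream bits (encryption == decryption)."""
    gen = lfsr_bits(seed)
    out = bytearray()
    for s in samples:
        k = 0
        for _ in range(8):
            k = (k << 1) | next(gen)
        out.append(s ^ k)
    return bytes(out)

speech = bytes([0, 17, 34, 51, 200, 255])       # stand-in for PCM speech samples
cipher = crypt(speech, seed=0xACE1)
assert crypt(cipher, seed=0xACE1) == speech      # the same LFSR stream recovers the speech
```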

Keywords: Linear Feedback Shift Register.

7764 Blind Non-Minimum Phase Channel Identification Using 3rd and 4th Order Cumulants

Authors: S. Safi, A. Zeroual

Abstract:

In this paper we propose a family of algorithms based on 3rd and 4th order cumulants for blind single-input single-output (SISO) non-minimum phase (NMP) finite impulse response (FIR) channel estimation driven by a non-Gaussian signal. The input signal represents the signal used in 10GBASE-T (IEEE 802.3an-2006), a Tomlinson-Harashima precoded (THP) version of random pulse-amplitude modulation with 16 discrete levels (PAM-16). The proposed algorithms are tested on three non-minimum phase channels for different signal-to-noise ratios (SNR) and different input data lengths. Numerical simulation results are presented to illustrate the performance of the proposed algorithms.
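A minimal numpy sketch of the sample fourth-order cumulant estimator such algorithms start from, using the standard zero-mean formula c4(t1, t2, t3) = m4(t1, t2, t3) − R(t1)R(t2−t3) − R(t2)R(t3−t1) − R(t3)R(t1−t2); the PAM-16 input here is plain (not Tomlinson-Harashima precoded) and the FIR channel is an illustrative non-minimum phase example, not one of the paper's test channels:

```python
import numpy as np

def autocorr(x, lag):
    """Biased sample autocorrelation R(lag) of a zero-mean real sequence."""
    lag = abs(lag)
    return np.dot(x[:len(x) - lag], x[lag:]) / len(x)

def moment4(x, t1, t2, t3):
    """Biased sample estimate of E[x(n) x(n+t1) x(n+t2) x(n+t3)] for t1, t2, t3 >= 0."""
    n = len(x) - max(t1, t2, t3)
    return np.mean(x[:n] * x[t1:t1 + n] * x[t2:t2 + n] * x[t3:t3 + n])

def cum4(x, t1, t2, t3):
    """Fourth-order cumulant of a zero-mean sequence at lags (t1, t2, t3)."""
    return (moment4(x, t1, t2, t3)
            - autocorr(x, t1) * autocorr(x, t2 - t3)
            - autocorr(x, t2) * autocorr(x, t3 - t1)
            - autocorr(x, t3) * autocorr(x, t1 - t2))

# Non-Gaussian input: zero-mean PAM-16 levels through a non-minimum phase FIR channel
rng = np.random.default_rng(0)
s = rng.choice(np.arange(-15, 16, 2).astype(float), size=20000)
y = np.convolve(s, [1.0, -2.5, 1.0], mode="same")   # zeros at 0.5 and 2 (one outside |z|=1)
print(cum4(y, 0, 0, 0), cum4(y, 1, 2, 0))
```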

Keywords: Higher Order Cumulants, Channel identification, Ethernet communication.

7763 Risk Management and Security Practice in Customs Supply Chain: Application of Cross ABC Method to the Moroccan Customs

Authors: Lamia Hammadi, Abdellah Ait Ouhman, Aomar Ibourk

Abstract:

The customs supply chain is widely regarded as a complex system, owing not only to the variety and large number of actors but also to their complex structural links and interactions, which exposes the system to various types of risks. The economic, political and social impacts of those risks are highly detrimental to countries, businesses and the public; for this reason, risk management in the customs supply chain is becoming a crucial issue for ensuring sustainability, security and safety. The main questions in a customs risk management approach are which goods and means of transport should be examined, to what extent, and where future compliance resources should be directed. The purposes of this article are, firstly, to deal with the concept of the customs supply chain; secondly, to present our risk management approach based on the cross Activity Based Costing (ABC) method as an interactive tool to support decision-making in customs risk management; and finally, to analyse a case study of the Moroccan customs in order to put the theory into practice and draw together the various elements of a structured and efficient risk management approach.

Keywords: Cross ABC Method, Customs Supply Chain, Risk, Risk Management.

7762 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-

Authors: Nieto Bernal Wilson, Carmona Suarez Edgar

Abstract:

Organizations hold structured and unstructured information in different formats, sources and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; at the OLAP processing level, however, these organizations show some deficiencies, partly because there is little interest in extracting knowledge from their data sources and partly because they lack the operational capability to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are the repositories on which models or patterns are built (behavior of customers, suppliers, products, social networks and genomics) and they facilitate corporate decision-making and research. This paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP and AUP, together with object-relational models, spatial data models, and a baseline for data modeling under UML and big data. In this way it seeks to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally incorporates processes for information analysis, visualization and data mining, particularly for generating patterns and models derived from structured fact objects.

Keywords: Data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse.

7761 Automatic Light Control in Domotics using Artificial Neural Networks

Authors: Carlos Machado, José A. Mendes

Abstract:

Home automation is a field concerned with, among other subjects, the comfort, security and energy requirements of private homes. The configuration of automatic functions in this type of house is not always simple for its inhabitants, requiring an initial setup and regular adjustments. In this work, the ubiquitous computing vision is used: the users' action patterns are captured, recorded and used to create the context awareness that allows the home automation system to configure itself. The system tries to free the users from setup adjustments as the home adapts to its inhabitants' real habits. This paper describes a completely automated process to determine the light state and act on the lights, taking the users' daily habits into account. An artificial neural network (ANN) is used as the pattern recognition method, classifying the light state for each moment. The work presented uses data from a real house where a family is actually living.
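A minimal sketch of the pattern-recognition step, assuming the captured habits are reduced to simple features (hour of day, ambient light level, occupancy) and the target is the light state; a small scikit-learn MLP stands in for the paper's ANN, and the synthetic data merely imitates an evening-occupancy habit rather than the real house data:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic "habit" data: [hour of day (scaled), ambient light 0-1, occupancy 0/1]
n = 2000
hours = rng.integers(0, 24, n)
ambient = rng.random(n)
occupied = rng.integers(0, 2, n)
X = np.column_stack([hours / 24, ambient, occupied])

# Hypothetical habit: lights are ON when the room is occupied, it is dark,
# and the hour is in the evening or early morning.
y = ((occupied == 1) & (ambient < 0.3) & ((hours >= 18) | (hours <= 6))).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X[:1500], y[:1500])                        # learn from recorded user actions

acc = clf.score(X[1500:], y[1500:])                # classify the light state per moment
print(f"held-out accuracy: {acc:.2f}")
print("now (20h, dark, occupied) ->", clf.predict([[20 / 24, 0.1, 1]])[0])
```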

Keywords: ANN, Home Automation, Neural Systems, Pattern Recognition, Ubiquitous Computing.

7760 Distributed Data-Mining by Probability-Based Patterns

Authors: M. Kargar, F. Gharbalchi

Abstract:

In this paper a new method is suggested for distributed data mining using probability patterns. These patterns use decision trees and decision graphs and are designed to be valid, novel, useful and understandable. Considering a set of objective functions, the system converges on a good pattern or on better objectives. Using the suggested method, useful information can be extracted from massive and multi-relational databases.

Keywords: Data-mining, Decision tree, Decision graph, Pattern, Relationship.

7759 K-Means for Spherical Clusters with Large Variance in Sizes

Authors: A. M. Fahim, G. Saake, A. M. Salem, F. A. Torkey, M. A. Ramadan

Abstract:

Data clustering is an important data exploration technique with many applications in data mining. The k-means algorithm is well known for its efficiency in clustering large data sets. However, this algorithm is suitable for spherical clusters of similar sizes and densities, and the quality of the resulting clusters decreases when the data set contains spherical clusters with large variance in sizes. In this paper, we introduce a procedure to overcome this problem. The proposed method is based on shifting the center of the large cluster toward the small cluster and recomputing the membership of the small cluster's points. The experimental results reveal that the proposed algorithm produces satisfactory results.
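A minimal numpy sketch: plain k-means (Lloyd's algorithm) followed by a rough rendering of the corrective idea described above, shifting the centre of the largest cluster toward the smallest one and recomputing the memberships of the small cluster's points; the shift factor is an illustrative guess, not the authors' rule:

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=100):
    """Plain Lloyd's k-means."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return centers, labels

# Two spherical clusters with very different sizes (2000 vs 50 points)
X = np.vstack([rng.normal([0, 0], 1.0, (2000, 2)),
               rng.normal([4, 0], 0.5, (50, 2))])

centers, labels = kmeans(X, k=2)
sizes = np.bincount(labels, minlength=2)
big, small = np.argmax(sizes), np.argmin(sizes)

# Corrective step (rough interpretation): pull the big cluster's centre toward the
# small one, then recompute the membership of the points currently in the small cluster.
centers[big] += 0.5 * (centers[small] - centers[big])        # illustrative shift factor
idx = np.where(labels == small)[0]
labels[idx] = np.argmin(((X[idx, None] - centers[None]) ** 2).sum(-1), axis=1)

print("cluster sizes after correction:", np.bincount(labels, minlength=2))
```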

Keywords: K-Means, Data Clustering, Cluster Analysis.

7758 Representing Data without Lost Compression Properties in Time Series: A Review

Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan

Abstract:

Uncertain data is believed to be an important issue in building a prediction model. The main objective of time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to the prediction task. This paper discusses the performance of a number of techniques for dealing with uncertain data, specifically those that handle the uncertain data condition by minimizing the loss of compression properties.

Keywords: Compression properties, uncertainty, uncertain time series, mining technique, weather prediction.

7757 Are XBRL-based Financial Reports Better than Non-XBRL Reports? A Quality Assessment

Authors: Zhenkun Wang, Simon S. Gao

Abstract:

Using a scoring system, this paper provides a comparative assessment of data quality between XBRL-formatted financial reports and non-XBRL financial reports. It shows a major improvement in the data quality of XBRL-formatted financial reports. Although XBRL-formatted financial reports did not show much quality advantage at the beginning, they have lately displayed a large improvement in data quality in almost all aspects. With improved XBRL web data management, presentation and analysis applications, XBRL-formatted financial reports have much better accessibility, are more accurate, and are more timely.

Keywords: Data Quality, Financial Report, Information, XBRL.

7756 Performance Analysis of Multiuser Diversity in Multiuser Two-Hop Decode-and-Forward Cooperative Multi-Relay Wireless Networks

Authors: Mamoun F. Al-Mistarihi, Rami Mohaisen

Abstract:

Cooperative diversity (CD) has been adopted in many communication systems because it helps improve the performance of wireless communication systems with the help of relays that emulate multiple antenna terminals. This work provides the derivation of performance analysis expressions for multiuser diversity (MUD) in two-hop cooperative multi-relay wireless networks (TCMRNs). We analytically derive closed-form expressions for the two most commonly used performance metrics, namely the outage probability and the symbol error probability (SEP), for the fixed decode-and-forward (FDF) protocol with MUD.
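A minimal Monte Carlo sketch of the system model under common simplifying assumptions (i.i.d. Rayleigh fading, fixed DF so a relay path's end-to-end SNR is the minimum of its two hops, best-relay selection per user, and MUD scheduling of the user with the largest end-to-end SNR); it estimates the outage probability numerically rather than reproducing the closed-form expressions derived in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def outage_mud(n_users, n_relays, snr_db, gamma_th_db=3.0, trials=200_000):
    """Monte Carlo outage probability of MUD in a two-hop DF multi-relay network."""
    snr = 10 ** (snr_db / 10)
    gamma_th = 10 ** (gamma_th_db / 10)
    # Exponentially distributed instantaneous SNRs (Rayleigh fading), per hop
    g1 = snr * rng.exponential(1.0, (trials, n_users, n_relays))   # source -> relay
    g2 = snr * rng.exponential(1.0, (trials, n_users, n_relays))   # relay -> user
    path = np.minimum(g1, g2)              # fixed DF: bottleneck of the two hops
    per_user = path.max(axis=2)            # each user served via its best relay
    scheduled = per_user.max(axis=1)       # multiuser diversity: schedule the best user
    return np.mean(scheduled < gamma_th)

for k in (1, 2, 4):
    print(f"K={k} users: P_out ~ {outage_mud(k, n_relays=2, snr_db=5):.4f}")
```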

Keywords: Cooperative diversity (CD), fixed decode-and-forward (FDF), multiuser diversity (MUD), two-hop cooperative multi-relay wireless networks (TCMRN).

7755 Enhancement of Performance Utilizing Low Complexity Switched Beam Antenna

Authors: P. Chaipanya, R. Keawchai, W. Sombatsanongkhun, S. Jantaramporn

Abstract:

To manage the dramatically increasing demand for wireless communication, switched beam antennas in smart antenna systems are considered. Implementing switched beam antennas at mobile terminals such as notebooks or handsets is a preferable way to increase the performance of wireless communication systems. This paper proposes a low-complexity switched beam antenna using a single antenna element, which is suitable for implementation at a mobile terminal. The main beam direction is switched by changing the position of a short circuit on the radiating patch. There are four switching cases, providing four different main beam directions. Moreover, the performance in terms of signal to interference ratio when utilizing the proposed antenna is compared with that of an omni-directional antenna to confirm the performance improvement.

Keywords: Switched beam, short circuit, single element, signal to interference ratio.

7754 Modeling of Random Variable with Digital Probability Hyper Digraph: Data-Oriented Approach

Authors: A. Habibizad Navin, M. Naghian Fesharaki, M. Mirnia, M. Kargar

Abstract:

In this paper we introduce Digital Probability Hyper Digraph for modeling random variable as the hierarchical data-oriented model.

Keywords: Data-Oriented Models, Data Structure, Digital Probability Hyper Digraph, Random Variable, Statistics and Probability.

7753 Fade Dynamics Investigation Applying Statistics of Fade Duration and Level Crossing Rate

Authors: Balázs Héder, Róbert Singliar, János Bitó

Abstract:

The impact of rain attenuation on wireless communication signals is predominant because of the high frequencies used (above 10 GHz). Knowledge of the attenuation statistics is very important for planning point-to-point microwave links operating in high frequency bands. The statistics of attenuation can be described, for instance, by the fade duration or the level crossing rate. In our examination we determine these statistics from one year of measured data for a given microwave link, and we attempt to transform the level crossing rate statistic into a fade duration statistic.
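A minimal numpy sketch of how the two statistics relate on a sampled attenuation record: the level crossing rate counts upward crossings of a threshold per unit time, and the average fade duration is the total time spent above the threshold divided by the number of crossings; synthetic data stands in for the measured link:

```python
import numpy as np

def lcr_afd(attenuation_db, threshold_db, sample_period_s):
    """Level crossing rate (crossings/s) and average fade duration (s) at a threshold."""
    above = attenuation_db > threshold_db
    # upward crossings: a sample below the threshold followed by a sample above it
    crossings = np.count_nonzero(~above[:-1] & above[1:])
    duration = above.sum() * sample_period_s            # total time in fade
    lcr = crossings / (len(attenuation_db) * sample_period_s)
    afd = duration / crossings if crossings else float("inf")
    return lcr, afd

# Synthetic attenuation record standing in for one year of measured link data
rng = np.random.default_rng(0)
atten = np.convolve(rng.gamma(2.0, 1.5, 100_000), np.ones(50) / 50, mode="same")

for thr in (3.0, 5.0, 8.0):
    lcr, afd = lcr_afd(atten, thr, sample_period_s=1.0)
    # Note: AFD * LCR equals the fraction of time spent above the threshold, which is
    # what links the fade duration statistic to the exceedance (attenuation) statistic.
    print(f"threshold {thr} dB: LCR = {lcr:.5f} 1/s, AFD = {afd:.1f} s")
```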

Keywords: Rain attenuation measurement, fade duration, level crossing rate.

7752 Registration Management System for the First Access to a Public Moroccan Institution: Case Sultan Moulay Slimane University, Beni Mellal

Authors: Khalid Ghoulam, Belaid Bouikhalene, Zakaria Harmouch, Hicham Mouncif

Abstract:

One of the essential topics in information systems is registration management. The objective of this project is to create a web portal designed to help new students with their first access to Sultan Moulay Slimane University (SMSU) (practical information, pre-registration, placement test, terms of use, etc.), while creating a secure space protecting both the data of the university's institutions and student information. This portal is accessible from any computer connected to the Internet, inside and outside the campus. In this work, we present a platform for the first access to the SMSU, which is essential for authentication in the digital workspace of the university. This platform allows the university to make better decisions for student clustering, to avoid the traditional manual method, and to reduce the cost in human and material resources.

Keywords: Registration, SMSU, Security, FAUSMS, digital work space, Placement test.

7751 Study of Efficiency and Capability LZW++ Technique in Data Compression

Authors: Yusof. Mohd Kamir, Mat Deris. Mohd Sufian, Abidin. Ahmad Faisal Amri

Abstract:

The purpose of this paper is to show the efficiency and capability of LZW++ in data compression. The LZW++ technique is an enhancement of the existing LZW technique; modifying the existing LZW is needed to produce LZW++. LZW reads one character at a time, whereas LZW++ reads three characters at a time. This paper focuses on data compression and tests the efficiency and capability of LZW++ on different data formats such as doc, pdf and text files. Several experiments have been carried out on the different types of data format. The results show that the LZW++ technique is better than the existing LZW technique in terms of file size.
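A minimal sketch of the baseline LZW compressor referred to above (dictionary initialised with the 256 single-byte strings, longest-match lookup, codes emitted as integers); the LZW++ variant described in the abstract, which consumes three characters per step, is only noted in a comment rather than implemented:

```python
def lzw_compress(data: bytes) -> list[int]:
    """Classic LZW: emit integer codes; the dictionary grows with every new phrase.
    (The paper's LZW++ variant would advance three input characters per step.)"""
    dictionary = {bytes([i]): i for i in range(256)}
    next_code = 256
    w = b""
    codes = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                          # keep extending the current phrase
        else:
            codes.append(dictionary[w])     # emit the longest known phrase
            dictionary[wc] = next_code      # register the new phrase
            next_code += 1
            w = bytes([byte])
    if w:
        codes.append(dictionary[w])
    return codes

sample = b"TOBEORNOTTOBEORTOBEORNOT"
codes = lzw_compress(sample)
print(len(sample), "input bytes ->", len(codes), "codes:", codes)
```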

Keywords: Data Compression, Huffman Encoding, LZW, LZW++, RLL, Size.

7750 Impact of Stack Caches: Locality Awareness and Cost Effectiveness

Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang

Abstract:

Treating data according to its location in memory has received much attention in recent years because different data have different properties that matter for cache utilization. Stack data and non-stack data may interfere with each other's locality in the data cache, and one important property of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into a stack cache and a non-stack cache in order to keep stack data and non-stack data in separate caches. We observe that the overall hit rate of the non-unified cache design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially since over 99% of accesses are directed to the stack cache. The results show that, on average, more than 99% stack cache accuracy is achieved with 2KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when a small, fixed-size stack cache is added at level 1 to a unified cache architecture. The results show that the overall hit rate of the unified cache design with an added 1KB stack cache improves by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.

Keywords: Hit rate, Locality of program, Stack cache, and Stack data.

7749 Cross Project Software Fault Prediction at Design Phase

Authors: Pradeep Singh, Shrish Verma

Abstract:

Software fault prediction models are created using the source code, metrics computed from the same or a previous version of the code, and the related fault data. Some companies do not store and keep track of all the artifacts required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects can be a potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can serve as an initial guideline for projects where no previous fault data are available. We analyze seven datasets from the NASA Metrics Data Program, which offer design as well as code metrics. Overall, the results of cross-project prediction are comparable to learning from within-company data.
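A minimal scikit-learn sketch of the evaluation setup, assuming each project is a table of design-phase metrics with a binary fault label; synthetic stand-ins are used here instead of the NASA MDP datasets, and the metric names are hypothetical:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(0)

def synthetic_project(n, shift):
    """Stand-in for a project's design metrics (e.g. fan-in/fan-out, branch count)."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 4))
    y = (X.sum(axis=1) + rng.normal(0, 1.5, n) > 4 * shift + 1).astype(int)  # fault label
    return X, y

X_train, y_train = synthetic_project(800, shift=1.0)    # "project A": training source
X_test, y_test = synthetic_project(400, shift=1.3)      # "project B": a different project

model = GaussianNB().fit(X_train, y_train)              # cross-project: no B data used
pred = model.predict(X_test)
print("accuracy:", round(accuracy_score(y_test, pred), 3),
      "| recall (faulty modules found):", round(recall_score(y_test, pred), 3))
```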

Keywords: Software Metrics, Fault prediction, Cross project, Within project.

7747 Power System Damping Using Hierarchical Fuzzy Multi-Input PSS and Communication Lines Active Power Deviations Input and SVC

Authors: Mohammad Hasan Raouf, Ahmad Rouhani, Mohammad Abedini, Ebrahim Rasooli Anarmarzi

Abstract:

In this paper the application of a hierarchical fuzzy system (HFS) based on an MPSS and an SVC in a multi-machine environment is studied. The effect of the communication-line active power variance signal between two regions (ΔPTie-line), as one of the inputs of the hierarchical fuzzy multi-input PSS and SVC (HFMPSS & SVC), on increasing the damping of low-frequency oscillations is also examined. In the MPSS, an auxiliary reactive power deviation signal (ΔQ) is added to the ΔP + Δω input type PSS for better efficiency. The number of rules grows exponentially with the number of variables in a classic fuzzy system; to reduce the number of rules, the HFS consists of a number of low-dimensional fuzzy systems in a hierarchical structure. A phasor model of the SVC is described and used in this paper. The performance of the MPSS and of the ΔPTie-line-based HFMPSS, as well as of the proposed method, in damping the inter-area mode of oscillation is examined in response to disturbances. The efficiency of the proposed model is evaluated by simulating a four-machine power system. Results show that the proposed method performs satisfactorily over the whole range of disturbances and reduces the cost of the system.

Keywords: Communication lines active power variance signal, Hierarchical fuzzy system (HFS), Multi-input power system stabilizer (MPSS), Static VAR compensator (SVC).

7747 Secure Secret Recovery by using Weighted Personal Entropy

Authors: Leau Y. B., Dinna Nina M. N., Habeeb S. A. H., Jetol B.

Abstract:

Authentication plays a vital role in many secure systems. Most of these systems require the user to log in with a secret password or passphrase before entry is granted. This is to ensure that all valuable information is kept confidential while also guaranteeing its integrity and availability. To achieve this goal, however, users are required to memorize high-entropy passwords or passphrases, and this sometimes makes it difficult for users to remember meaningless strings of data. This paper presents a new scheme that assigns a weight to each personal question given to the user when revealing the encrypted secret or password. The focus of this scheme is to offer fault tolerance to users by allowing them to forget the answers to a subset of questions and still recover the secret and achieve successful authentication. A comparison of the level of security of weight-based and weightless secret recovery schemes is also discussed. The paper concludes with the few areas that require more investigation in this research.
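One plausible way to realise a weighted personal-entropy recovery is classic Shamir secret sharing over a prime field, giving each question a number of shares equal to its weight so that any subset of correctly answered questions whose weights sum to the threshold recovers the secret. The sketch below illustrates that idea under those assumptions; it is not necessarily the authors' construction, and the weights, threshold and secret are made up for the example:

```python
import secrets

PRIME = 2 ** 255 - 19            # a prime field large enough for a 16-byte secret

def split_secret(secret: int, weights: list[int], threshold: int):
    """Give question i a block of weights[i] Shamir shares of a degree-(threshold-1) polynomial."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    f = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    blocks, x = [], 1
    for w in weights:
        blocks.append([(x + j, f(x + j)) for j in range(w)])
        x += w
    return blocks

def recover_secret(points):
    """Lagrange interpolation at x = 0; works with any set of >= threshold shares."""
    s = 0
    for i, (xi, yi) in enumerate(points):
        num = den = 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        s = (s + yi * num * pow(den, -1, PRIME)) % PRIME
    return s

secret = int.from_bytes(b"vault master key", "big")
weights = [3, 2, 2, 1]                     # per-question weights (more personal = heavier)
blocks = split_secret(secret, weights, threshold=5)

# The user forgets the answer to question 4 (weight 1) but answers the rest correctly:
answered = blocks[0] + blocks[1] + blocks[2]            # total weight 7 >= threshold 5
assert recover_secret(answered) == secret               # secret recovered despite the gap
```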

Keywords: Secret Recovery, Personal Entropy, Cryptography, Secret Sharing and Key Management.

7746 Video Quality Control Using a ROI and Two-Component Weighted Metrics

Authors: Petra Heribanová, Jaroslav Polec, Michal Martinovič

Abstract:

In this paper we propose a new content-weighted method for full-reference (FR) video quality control using a region of interest (ROI) and two-component weighted metrics for deaf people video communication. In our approach, an image is partitioned into the region of interest and a "dry-as-dust" region, and the region of interest is then partitioned into two parts: edges and background (smooth regions). Other methods (metrics) combine and weight three or more parts, such as edges, edge errors, texture, smooth regions, blur, block distance, etc. We also use the idea that different image regions in deaf people video communication have different perceptual significance relative to quality; intensity edges certainly contain considerable image information and are perceptually significant.
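A minimal numpy sketch of a two-component weighted MSE in this spirit: the ROI is split into an edge part and a smooth background part with a simple gradient test, the region outside the ROI ("dry-as-dust") is ignored, and the two per-region MSEs are combined with weights; the weights and edge threshold are illustrative, not the paper's values:

```python
import numpy as np

def two_component_wmse(ref, dist, roi_mask, w_edge=0.7, w_back=0.3, grad_thresh=20.0):
    """Weighted MSE over an ROI split into edge pixels and smooth background pixels."""
    ref = ref.astype(float)
    dist = dist.astype(float)
    gy, gx = np.gradient(ref)
    edges = (np.hypot(gx, gy) > grad_thresh) & roi_mask     # crude edge detector
    background = roi_mask & ~edges                          # smooth part of the ROI
    se = (ref - dist) ** 2
    mse_edge = se[edges].mean() if edges.any() else 0.0
    mse_back = se[background].mean() if background.any() else 0.0
    return w_edge * mse_edge + w_back * mse_back            # pixels outside the ROI ignored

# Toy frame: a bright square (e.g. the signer's hands/face region) plus noise
rng = np.random.default_rng(0)
ref = np.zeros((144, 176)); ref[40:100, 60:120] = 200.0
dist = ref + rng.normal(0, 5, ref.shape)
roi = np.zeros_like(ref, dtype=bool); roi[30:110, 50:130] = True

print("two-component weighted MSE:", round(two_component_wmse(ref, dist, roi), 2))
```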

Keywords: Video quality assessment, weighted MSE.

7745 Extreme Temperature Forecast in Mbonge, Cameroon through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by applying the block maxima method of the generalized extreme value (GEV) distribution to temperature data from the Cameroon Development Corporation (C.D.C). Considering two data sets (raw data and simulated data) and two models of the GEV distribution (stationary and non-stationary), a return level analysis is carried out. It was found that in the stationary model the return values are constant over time for the raw data, while for the simulated data the return values show an increasing trend but with an upper bound. In the non-stationary model, the return levels of both the raw data and the simulated data show an increasing trend, again with an upper bound. This clearly shows that, even though temperatures in the tropics show signs of increasing in the future, there is a maximum temperature that is not exceeded. The results of this paper are vital for agricultural and environmental research.
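A minimal SciPy sketch of block-maxima return-level analysis for the stationary case: fit a GEV to annual temperature maxima and read the T-year return level off the fitted quantile function; the data below are synthetic stand-ins for the C.D.C. series:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

# Synthetic annual maximum temperatures (deg C), standing in for the C.D.C. record
annual_maxima = 33 + 1.2 * rng.gumbel(size=40)

# Fit the stationary GEV by maximum likelihood (scipy's shape c equals -xi)
c, loc, scale = genextreme.fit(annual_maxima)

for T in (5, 10, 25, 50, 100):
    # T-year return level = the quantile exceeded on average once every T years
    z_T = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {z_T:.2f} deg C")

# With c > 0 (xi < 0, Weibull domain) the fitted distribution has a finite upper
# end point loc + scale / c, i.e. a maximum temperature with no exceedance.
if c > 0:
    print("upper bound of fitted GEV:", round(loc + scale / c, 2), "deg C")
```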

Keywords: Return level, Generalized extreme value (GEV), Meteorology, Forecasting.

7744 Evaluating the Feasibility of Magnetic Induction to Cross an Air-Water Boundary

Authors: Mark Watson, J.-F. Bousquet, Adam Forget

Abstract:

A magnetic induction based underwater communication link is evaluated using an analytical model and a custom finite-difference time-domain (FDTD) simulation tool. The analytical model is based on the Sommerfeld integral, and the full-wave simulation tool evaluates Maxwell's equations using the FDTD method in cylindrical coordinates. The analytical model and the FDTD simulation tool are then compared and used to predict the system performance for various transmitter depths and optimum frequencies of operation. To this end, the system bandwidth, the signal to noise ratio, and the magnitude of the induced voltage are used to estimate the expected channel capacity. The models show that in seawater a relatively low-power system with small coils may be capable of achieving a throughput of 40 to 300 kbps when the transmitter is at a depth of 1 to 3 m and the receiver is at a height of 1 m.
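A back-of-the-envelope sketch of the throughput estimate: once the model gives the received SNR and usable bandwidth at a given coil depth, the capacity bound follows from the Shannon formula C = B·log2(1 + SNR); the depth/bandwidth/SNR triples below are illustrative placeholders, not the paper's simulated values:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Upper bound on error-free throughput for a given bandwidth and SNR."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical (depth, bandwidth, SNR) triples for a small magnetic-induction coil pair
scenarios = [(1.0, 20e3, 18.0), (2.0, 12e3, 10.0), (3.0, 8e3, 4.0)]
for depth_m, bw, snr in scenarios:
    c = shannon_capacity_bps(bw, snr)
    print(f"Tx depth {depth_m:.0f} m: B = {bw/1e3:.0f} kHz, SNR = {snr:.0f} dB "
          f"-> capacity bound {c/1e3:.0f} kbps")
```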

Keywords: Magnetic Induction, FDTD, Underwater Communication, Sommerfeld.

7743 Mining Multicity Urban Data for Sustainable Population Relocation

Authors: Xu Du, Aparna S. Varde

Abstract:

In this research, we propose to conduct diagnostic and predictive analysis about the key factors and consequences of urban population relocation. To achieve this goal, urban simulation models extract the urban development trends as land use change patterns from a variety of data sources. The results are treated as part of urban big data with other information such as population change and economic conditions. Multiple data mining methods are deployed on this data to analyze nonlinear relationships between parameters. The result determines the driving force of population relocation with respect to urban sprawl and urban sustainability and their related parameters. This work sets the stage for developing a comprehensive urban simulation model for catering to specific questions by targeted users. It contributes towards achieving sustainability as a whole.

Keywords: Data Mining, Environmental Modeling, Sustainability, Urban Planning.

7742 An Ant-based Clustering System for Knowledge Discovery in DNA Chip Analysis Data

Authors: Minsoo Lee, Yun-mi Kim, Yearn Jeong Kim, Yoon-kyung Lee, Hyejung Yoon

Abstract:

Biological data has several characteristics that strongly differentiate it from typical business data. It is much more complex, usually large in size, and continuously changing. Until recently, business data has been the main target for discovering trends, patterns or future expectations. However, with the recent rise of biotechnology, the powerful technology that was used for analyzing business data is now being applied to biological data. With this advanced technology at hand, the main trend in biological research is rapidly changing from structural DNA analysis to understanding the cellular functions of DNA sequences. DNA chips are now being used to perform experiments, and DNA analysis processes are being used by researchers. Clustering is one of the important processes used for grouping together similar entities. There are many clustering algorithms, such as hierarchical clustering, self-organizing maps, k-means clustering and so on. In this paper, we propose a clustering algorithm that imitates the ecosystem, taking into account the features of biological data. We implemented the system using an ant colony clustering algorithm, and the system decides the number of clusters automatically. The system processes the input biological data, runs the ant colony algorithm, draws the topic map, assigns clusters to the genes and displays the output. We tested the algorithm with test data of 100 to 1000 genes and 24 samples and show promising results for applying this algorithm to clustering DNA chip data.

Keywords: Ant colony system, biological data, clustering, DNA chip.

7741 The Resource Description Framework (RDF) as a Modern Structure for Medical Data

Authors: Gabriela Lindemann, Danilo Schmidt, Thomas Schrader, Dietmar Keune

Abstract:

The amount and heterogeneity of data in biomedical research, notably in interdisciplinary fields, require new methods for the collection, presentation and analysis of information. Important data from laboratory experiments as well as patient trials are available but come from distributed resources. The Charité - University Hospital Berlin has established, together with the German Research Foundation (DFG), a new information service centre for kidney diseases and transplantation (Open European Nephrology Science Centre - OpEN.SC). Besides the collaborative aspect of creating new research groups, every partner or institution of this science information centre that makes its own data available is allowed to search the whole data pool of the various centres involved. A core task is the implementation of a non-restricting, open data structure for the various different data sources. We decided to use a modern RDF model and, in a first phase, transformed original data coming from the web-based electronic patient record database TBase©.
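A minimal rdflib sketch of how heterogeneous nephrology records can be expressed as RDF triples and merged into one queryable pool; the namespace, identifiers and properties are invented for illustration and are not the OpEN.SC schema:

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/opensc/")       # hypothetical namespace

g = Graph()
patient = EX["patient/p001"]
transplant = EX["transplant/t001"]

# Patient record coming from one source (e.g. the electronic patient record)
g.add((patient, RDF.type, EX.Patient))
g.add((patient, EX.yearOfBirth, Literal(1968, datatype=XSD.integer)))

# Transplant event coming from another source, linked to the same patient URI
g.add((transplant, RDF.type, EX.KidneyTransplant))
g.add((transplant, EX.patient, patient))
g.add((transplant, EX.date, Literal("2006-05-17", datatype=XSD.date)))

# Because everything is triples, a third centre's data is just more g.add(...) calls
print(g.serialize(format="turtle"))
```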

Keywords: Medical databases, Resource Description Framework (RDF), metadata repository.

7740 IEEE 802.11 b and g WLAN Propagation Model using Power Density Measurements at ESPOL

Authors: E. E. Mantilla, C. R. Reyes, B. G. Ramos

Abstract:

This paper describes the development of a WLAN propagation model using spectrum analyzer measurements. The signal is generated by two access points (APs) on the base floor of the administrative Communication School building at ESPOL. In general, users do not have a quality-of-service reference for a wireless network; however, quality depends on the signal level as a function of frequency, distance and other path conditions between receiver and transmitter. The power density of the signal decreases as it propagates through space, and the data transfer rate is affected. This document evaluates and implements an empirical mathematical formulation for the characterization of WLAN radio wave propagation in two aisles of the building's base floor.
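A minimal numpy sketch of fitting the empirical log-distance model P(d) = P(d0) − 10·n·log10(d/d0) to spectrum-analyzer power readings by least squares; the measurement points below are made-up placeholders for the values collected in the two aisles:

```python
import numpy as np

# Hypothetical measurements: distance from the AP (m) and received power (dBm)
d = np.array([1, 2, 4, 6, 8, 12, 16, 20], dtype=float)
p_rx = np.array([-38, -44, -51, -55, -58, -63, -66, -69], dtype=float)

d0 = 1.0                                     # reference distance for the model
x = 10 * np.log10(d / d0)                    # model: P(d) = P(d0) - n * x

# Least-squares fit of the path-loss exponent n and the reference power P(d0)
slope, p_d0 = np.polyfit(x, p_rx, 1)
n = -slope
print(f"path-loss exponent n = {n:.2f}, P(d0) = {p_d0:.1f} dBm")

# Use the fitted model to predict the received power level at a new distance
d_new = 10.0
print(f"predicted power at {d_new:.0f} m: {p_d0 - 10 * n * np.log10(d_new / d0):.1f} dBm")
```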

Keywords: frequency, Spectral Analyzer, transmitter, WLAN.

7739 Comparison of BER Performances for Conventional and Non-Conventional Mapping Schemes Used in OFDM

Authors: Riddhi Parmar, Shilpi Gupta, Upena Dalal

Abstract:

Orthogonal frequency division multiplexing (OFDM) is one of the techniques for high-speed data rate communication, with main consideration for 4G and 5G systems. In OFDM, there are several mapping schemes that provide a way of parallel transmission. In this paper, the mapping schemes used by some standards are compared, and the performance of a non-conventional modulation technique is also discussed. Comparisons of bit error rate (BER) performance for conventional and non-conventional modulation schemes have been made using MATLAB software. The schemes used in an OFDM system can be selected on the basis of power or spectrum efficiency requirements and BER analysis.
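A minimal numpy sketch of the kind of comparison described: a Monte Carlo BER estimate for Gray-mapped QPSK carried over OFDM in AWGN, checked against the theoretical Q-function result; other mappings (e.g. π/4-DQPSK) would slot in by replacing the mapper/demapper. The block sizes and SNR points are illustrative:

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(0)
N_SC, N_SYM = 64, 2000                        # subcarriers, OFDM symbols per SNR point

for ebn0_db in (0, 2, 4, 6, 8):
    bits = rng.integers(0, 2, (N_SYM, N_SC, 2))
    # Gray-mapped QPSK with unit symbol energy: (+-1 +- j) / sqrt(2)
    syms = ((1 - 2 * bits[..., 0]) + 1j * (1 - 2 * bits[..., 1])) / np.sqrt(2)
    tx = np.fft.ifft(syms, axis=1, norm="ortho")          # OFDM modulation

    ebn0 = 10 ** (ebn0_db / 10)
    n0 = 1 / (2 * ebn0)                                    # Eb = Es/2 = 1/2
    noise = np.sqrt(n0 / 2) * (rng.standard_normal(tx.shape)
                               + 1j * rng.standard_normal(tx.shape))
    rx = np.fft.fft(tx + noise, axis=1, norm="ortho")      # OFDM demodulation

    rx_bits = np.stack([(rx.real < 0).astype(int), (rx.imag < 0).astype(int)], axis=-1)
    ber = np.mean(rx_bits != bits)
    theory = 0.5 * erfc(np.sqrt(ebn0))                     # Gray-coded QPSK in AWGN
    print(f"Eb/N0 = {ebn0_db} dB: simulated BER = {ber:.4f}, theory = {theory:.4f}")
```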

Keywords: BER, π/4 differential quadrature phase shift keying (Pi/4 DQPSK), OFDM, phase shift keying, quadrature phase shift keying.
