Search results for: spatial data analysis tools.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14241

13851 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests

Authors: Julius Onyancha, Valentina Plekhanova

Abstract:

One of the significant issues facing web users is the amount of noise in web data, which hinders them from finding useful information related to their dynamic interests. Current research treats noise as any data that does not form part of the main web page and proposes noise reduction tools that focus mainly on eliminating noise related to the content and layout of web data. This paper argues that not all data forming part of the main web page are of interest to a user, and that not all data classified as noise are actually noise to a given user. Therefore, learning the noise web data associated with user requests ensures not only a reduction of the noise level in a web user profile but also a decrease in the loss of useful information, and hence improves the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate its performance, an experimental design is presented, and the results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.

Keywords: Web log data, web user profile, user interest, noise web data learning, machine learning.

13850 Model Discovery and Validation for the QSAR Problem Using Association Rule Mining

Authors: Luminita Dumitriu, Cristina Segal, Marian Craciun, Adina Cocu, Lucian P. Georgescu

Abstract:

There are several approaches to solving the Quantitative Structure-Activity Relationship (QSAR) problem. These approaches are based either on statistical methods or on predictive data mining. Among the statistical methods, one should consider regression analysis, pattern recognition (such as cluster analysis, factor analysis, and principal components analysis) or partial least squares. Predictive data mining techniques use neural networks, genetic programming, or neuro-fuzzy knowledge. These approaches have low explanatory capability or none at all. This paper attempts to establish a new approach to solving QSAR problems using descriptive data mining. In this way, the relationship between the chemical properties and the activity of a substance can be modeled comprehensibly.
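
As a rough illustration of the descriptive, association-rule view of QSAR advocated above, the sketch below mines frequent property/activity itemsets from a toy table of discretized molecular descriptors. The descriptor names, thresholds, and data are assumptions for illustration only, not the authors' data or tool.

```python
# A minimal sketch, assuming binary (discretized) descriptors per compound.
from itertools import combinations

compounds = [
    {"logP_high", "ring_aromatic", "active"},
    {"logP_high", "donor_many", "active"},
    {"logP_low", "ring_aromatic", "inactive"},
    {"logP_high", "ring_aromatic", "donor_many", "active"},
    {"logP_low", "donor_many", "inactive"},
]
min_support, min_confidence = 0.4, 0.8

def support(itemset):
    """Fraction of compounds containing every item of the itemset."""
    return sum(itemset <= c for c in compounds) / len(compounds)

items = sorted(set().union(*compounds))
frequent = [frozenset(s) for k in (1, 2, 3)
            for s in combinations(items, k) if support(set(s)) >= min_support]

# Rules of the form {descriptors} -> {activity class}
for itemset in frequent:
    for target in ("active", "inactive"):
        lhs = itemset - {target}
        if target in itemset and lhs:
            conf = support(itemset) / support(lhs)
            if conf >= min_confidence:
                print(set(lhs), "->", target,
                      f"(support={support(itemset):.2f}, confidence={conf:.2f})")
```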

Keywords: association rules, classification, data mining, Quantitative Structure-Activity Relationship.

13849 Self-Supervised Pretraining on Paired Sequences of fMRI Data for Transfer Learning to Brain Decoding Tasks

Authors: Sean Paulsen, Michael Casey

Abstract:

In this work, we present a self-supervised pretraining framework for transformers on functional Magnetic Resonance Imaging (fMRI) data. First, we pretrain our architecture on two self-supervised tasks simultaneously to teach the model a general understanding of the temporal and spatial dynamics of human auditory cortex during music listening. Our pretraining results are the first to suggest a synergistic effect of multitask training on fMRI data. Second, we finetune the pretrained models and train additional fresh models on a supervised fMRI classification task. We observe significantly improved accuracy on held-out runs with the finetuned models, which demonstrates the ability of our pretraining tasks to facilitate transfer learning. This work contributes to the growing body of literature on transformer architectures for pretraining and transfer learning with fMRI data, and serves as a proof of concept for our pretraining tasks and multitask pretraining on fMRI data.
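
For readers who want a concrete picture, the sketch below shows the general shape of such multitask pretraining: one shared transformer encoder, two task heads, and a summed loss. The architecture, the two illustrative tasks (masked-TR reconstruction and sequence-order prediction), and all dimensions are assumptions for illustration, not the authors' model.

```python
# A minimal multitask pretraining sketch under assumed dimensions and tasks.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    def __init__(self, n_voxels=512, d_model=128, n_layers=2, seq_len=10):
        super().__init__()
        self.embed = nn.Linear(n_voxels, d_model)
        self.pos = nn.Parameter(torch.zeros(1, seq_len, d_model))
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, x):                      # x: (batch, time, voxels)
        return self.encoder(self.embed(x) + self.pos)

encoder = SharedEncoder()
head_recon = nn.Linear(128, 512)               # task 1: reconstruct the input TRs
head_order = nn.Linear(128, 1)                 # task 2: are the paired sequences in order?

x = torch.randn(8, 10, 512)                    # fake fMRI sequences (batch, TRs, voxels)
z = encoder(x)
loss_recon = nn.functional.mse_loss(head_recon(z), x)
loss_order = nn.functional.binary_cross_entropy_with_logits(
    head_order(z[:, 0]), torch.randint(0, 2, (8, 1)).float())
loss = loss_recon + loss_order                 # multitask objective: sum of task losses
loss.backward()
```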

Keywords: Transfer learning, fMRI, self-supervised, brain decoding, transformer, multitask training.

13848 Solving Inhomogeneous Wave Equation Cauchy Problems using Homotopy Perturbation Method

Authors: Mohamed M. Mousa, Aidarkhan Kaltayev

Abstract:

In this paper, He's homotopy perturbation method (HPM) is applied to one- and three-space-dimensional inhomogeneous wave equation Cauchy problems in order to obtain exact solutions. HPM is used for the analytic handling of these equations. The results reveal that the HPM is a very effective, convenient, and quite accurate method for such types of partial differential equations (PDEs).
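
A minimal sympy sketch of one common HPM-style recursion for u_tt = u_xx + f(x, t) with Cauchy data u(x, 0) = g(x), u_t(x, 0) = h(x) is shown below. The particular choice of zeroth-order term and correction integrals is an assumption (a standard series-type scheme), not necessarily the authors' exact formulation.

```python
# u0 = g + t*h + double time integral of f; u_{k+1} = double time integral of d^2 u_k / dx^2
import sympy as sp

x, t = sp.symbols("x t")

def hpm_partial_sum(f, g, h, n_terms=6):
    """Return the n-term series approximation for the Cauchy problem."""
    def ii(expr):  # integrate twice with respect to t, from 0 to t
        inner = sp.integrate(expr, (t, 0, t))
        return sp.integrate(inner, (t, 0, t))
    u = g + t * h + ii(f)          # zeroth-order term absorbs data and source
    total = u
    for _ in range(n_terms - 1):
        u = ii(sp.diff(u, x, 2))   # next correction term
        total = sp.simplify(total + u)
    return total

# Example: u_tt = u_xx, u(x,0) = sin(x), u_t(x,0) = 0 -> exact solution sin(x)*cos(t)
approx = hpm_partial_sum(f=sp.Integer(0), g=sp.sin(x), h=sp.Integer(0))
print(sp.series(approx - sp.sin(x) * sp.cos(t), t, 0, 8))   # residual vanishes to O(t**8)
```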

Keywords: Homotopy perturbation method, exact solution, Cauchy problem, inhomogeneous wave equation.

13847 Information Retrieval in Domain Specific Search Engine with Machine Learning Approaches

Authors: Shilpy Sharma

Abstract:

As the web continues to grow exponentially, the idea of crawling the entire web on a regular basis becomes less and less feasible, so domain-specific search engines, which index information on a specific domain, were proposed. As more information becomes available on the World Wide Web, it becomes more difficult to provide effective search tools for information access. Today, people access web information through two main kinds of search interfaces: browsers (clicking and following hyperlinks) and query engines (queries in the form of a set of keywords showing the topic of interest) [2]. Web search tools need better support for expressing a user's information need and for returning high-quality search results. There appears to be a need for systems that reason under uncertainty and are flexible enough to recover from the contradictions, inconsistencies, and irregularities that such reasoning involves. In a multi-view problem, the features of the domain can be partitioned into disjoint subsets (views) that are sufficient to learn the target concept. Semi-supervised, multi-view algorithms, which reduce the amount of labeled data required for learning, rely on the assumptions that the views are compatible and uncorrelated. This paper describes the use of a semi-supervised machine learning approach with active learning for domain-specific search engines. A domain-specific search engine is "an information access system that allows access to all the information on the web that is relevant to a particular domain." The proposed work shows that, with the help of this approach, relevant data can be extracted with the minimum number of queries fired by the user. It requires a small number of labeled data and a pool of unlabeled data to which the learning algorithm is applied in order to extract the required data.
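
The pool-based, minimum-labeling idea can be sketched generically as uncertainty-sampling active learning; the classifier, synthetic data, and query budget below are illustrative assumptions standing in for the paper's system, not its actual implementation.

```python
# A minimal pool-based active learning loop with least-confident sampling.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
labeled = list(range(10))                       # small seed of labeled pages
pool = [i for i in range(len(y)) if i not in labeled]

clf = LogisticRegression(max_iter=1000)
for _ in range(20):                             # 20 labeling rounds
    clf.fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])
    uncertainty = 1.0 - proba.max(axis=1)       # least-confident sampling
    pick = pool[int(np.argmax(uncertainty))]    # query the most uncertain page
    labeled.append(pick)                        # oracle (the user) supplies the label
    pool.remove(pick)

print("accuracy with", len(labeled), "labels:", clf.score(X, y))
```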

Keywords: Search engines, machine learning, information retrieval, active logic.

13846 Imputation Technique for Feature Selection in Microarray Data Set

Authors: Younies Mahmoud, Mai Mabrouk, Elsayed Sallam

Abstract:

Analyzing DNA microarray data sets is a great challenge for bioinformaticians due to the complications of applying statistical and machine learning techniques. The challenge is doubled when the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing data. One of the most important data analysis processes on microarray data sets is feature selection, which finds the most important genes that affect a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
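
A generic version of this pipeline (impute, then select genes) can be sketched as follows. The imputer, scoring function, and synthetic data are assumptions that stand in for whichever imputation and selection methods the paper actually combines.

```python
# A minimal impute-then-select sketch on synthetic microarray-like data.
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))                        # 60 samples x 500 genes
y = rng.integers(0, 2, size=60)                       # disease / control labels
X[rng.random(X.shape) < 0.05] = np.nan                # 5% missing expression values

X_imputed = KNNImputer(n_neighbors=5).fit_transform(X)
selector = SelectKBest(score_func=f_classif, k=20).fit(X_imputed, y)
top_genes = np.flatnonzero(selector.get_support())
print("indices of the 20 most informative genes:", top_genes)
```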

Keywords: DNA microarray, feature selection, missing data, bioinformatics.

13845 Estimation of Forest Fire Emission in Thailand by Using Remote Sensing Information

Authors: A. Junpen, S. Garivait, S. Bonnet, A. Pongpullponsak

Abstract:

Forest fires in Thailand are an annual occurrence and a cause of air pollution. This study estimates the emissions from forest fires during 2005-2009 using the MODerate-resolution Imaging Spectroradiometer (MODIS) sensor aboard the Terra and Aqua satellites, experimental data, and statistical data. The forest fire emissions are estimated using the equation established by Seiler and Crutzen in 1982. The spatial and temporal variation of forest fire emissions is analyzed and displayed in the form of a grid density map. The satellite data analysis suggests that 86,877 fire hotspots occurred between 2005 and 2009, with the significant majority (more than 80% of fire hotspots) in deciduous forest. The peak period of forest fires is January to May. The estimation of emissions from forest fires during 2005 to 2009 indicates that the amounts of CO, CO2, CH4, and N2O were about 3,133,845 tons, 47,610.337 tons, 204,905 tons, and 6,027 tons, respectively, or about 6,171,264 tons of CO2eq. The fires also emitted 256,132 tons of PM10. The year 2007 was found to have the largest emissions, and within a year, March has the maximum forest fire emissions. The areas with a high density of forest fire emissions were the forests situated in the northern, western, and upper northeastern parts of the country.
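
The Seiler and Crutzen approach referenced above estimates the mass of biomass burned and multiplies it by species-specific emission factors. The sketch below shows that general form with purely hypothetical coefficients; the paper's land-cover-specific values are not reproduced here.

```python
# Seiler & Crutzen-type estimate:
#   M = A * B * alpha * beta   (mass of biomass burned)
#   E_i = M * EF_i             (emission of species i)
def burned_biomass(area_ha, fuel_load_t_per_ha, frac_aboveground, burn_efficiency):
    """Mass of biomass burned (tons)."""
    return area_ha * fuel_load_t_per_ha * frac_aboveground * burn_efficiency

# Hypothetical inputs for one deciduous-forest grid cell
M = burned_biomass(area_ha=1_000, fuel_load_t_per_ha=55.0,
                   frac_aboveground=0.9, burn_efficiency=0.4)

# Hypothetical emission factors in g of species per kg of dry matter burned
emission_factors = {"CO2": 1580.0, "CO": 104.0, "CH4": 6.8, "PM10": 8.5}
emissions_t = {gas: M * 1000 * ef / 1e6 for gas, ef in emission_factors.items()}
print(emissions_t)   # tons of each species for this cell
```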

Keywords: Emissions, Forest fire, Remote sensing information.

13844 A Comparative Study of Regional Climate Models and Global Coupled Models over Uttarakhand

Authors: Sudip Kumar Kundu, Charu Singh

Abstract:

As a great physiographic divide, the Himalayas affect a large system of water and air circulation that helps determine the climatic conditions of the Indian subcontinent to the south and the mid-Asian highlands to the north. They act as a barrier, blocking cold continental air from the north from entering India in winter, and force the rain-bearing southwesterly monsoon to give up maximum precipitation in that area during the monsoon season. Nowadays, extreme weather events such as heavy precipitation, cloudbursts, flash floods, landslides, and extreme avalanches are regular occurrences in the North Western Himalayan (NWH) region. The present study investigates which model(s) are suitable for capturing the rainfall pattern over that region. For this investigation, selected models from the Coordinated Regional Climate Downscaling Experiment (CORDEX) and the Coupled Model Intercomparison Project Phase 5 (CMIP5) have been utilized in a consistent framework for the period 1976 to 2000 (historical). The ability of these driving models from the CORDEX domain and CMIP5 has been examined in terms of the spatial distribution and the time series of rainfall over NWH in the rainy season, and compared with the ground-based India Meteorological Department (IMD) gridded rainfall data set. The analysis shows that models such as MIROC5 and MPI-ESM-LR from both CORDEX and CMIP5 provide the best spatial distribution of rainfall over the NWH region. However, the driving models from CORDEX underestimate the daily rainfall amount compared to the CMIP5 driving models, failing to capture daily rainfall properly when plotted as time series (TS) individually for the states of Uttarakhand (UK) and Himachal Pradesh (HP). Finally, the CMIP5 driving models are better suited than the CORDEX domain models for investigating the rainfall pattern over the NWH region.
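
A model-versus-observation comparison of this kind typically reduces each gridded product to an area-averaged rainfall time series before plotting. The sketch below shows that step with xarray; the file name, the variable name "pr", the coordinate names, and the lat/lon window are assumptions, not the study's actual data handling.

```python
# A minimal area-mean rainfall time-series extraction over a rough NWH box.
import xarray as xr

ds = xr.open_dataset("model_precip.nc")                    # hypothetical NetCDF file
box = ds["pr"].sel(lat=slice(28, 33), lon=slice(77, 81))   # assumes ascending lat/lon
daily_mean = box.mean(dim=["lat", "lon"])                  # area-averaged daily rainfall
monsoon = daily_mean.where(daily_mean.time.dt.season == "JJA", drop=True)
monsoon.to_dataframe().to_csv("nwh_pr_timeseries.csv")     # series for comparison with IMD
```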

Keywords: Global warming, rainfall, CMIP5, CORDEX, North Western Himalayan region.

13843 Sustainability Assessment of Agriculture and Biodiversity Issues through an Innovative Knowledge Mediation System Using Deliberation Support Tools and INTEGRAAL Method Based on Stakeholder Involvement

Authors: Ashiquer Rahman

Abstract:

The cutting-edge knowledge mediation system called 'ePLANETe' provides a framework for building knowledge, tools, and methods for education, research, and sustainable practices, as well as deliberative assessment support for higher education, research institutions, and elsewhere, e.g., collaborative learning and research on the sustainability and biodiversity issues of territorial development sectors. This paper presents an analytical perspective on the 'ePLANETe' concept and its functionalities as an experimental platform for contributing to sustainability assessment. 'ePLANETe' can now be seen as an experiment in the challenges of "ICT for Green". Its digital technologies are exploited (i) to facilitate collaborative research, learning tools, and knowledge for sustainability challenges, and (ii) as deliberation support tools in the pursuit of sustainability performance and practices in territorial governance, public policy, and business strategy, as well as in the higher education sector itself. The paper investigates the capacity for qualitative and quantitative assessment of agricultural sustainability through stakeholder-based integrated assessment. Specifically, it focuses on integrating system methodologies with Deliberation Support Tools (DST) and the INTEGRAAL method for collective assessment and decision-making in implementing regional plans. The aim is to identify effective knowledge and tools that enable deliberation methodologies regarding practices on the sustainability of agriculture and biodiversity issues, societal responsibilities, and regional planning, concentrating on the question of how to effectively mobilize resources (knowledge, tools, and methods) from different sources and at different scales, regarding agriculture and biodiversity issues, in order to address sustainability challenges. Answering this question will create scope for qualitative and quantitative assessments of sustainability as a new landmark for the agriculture sector.

Keywords: Biodiversity, Deliberation Support Tools, INTEGRAAL, stakeholder.

13842 Tidal Data Analysis using ANN

Authors: Ritu Vijay, Rekha Govil

Abstract:

The design of a complete expansion that allows a compact representation of certain relevant classes of signals is a central problem in signal processing applications. Achieving such a representation means knowing the signal features for the purposes of denoising, classification, interpolation, and forecasting. Multilayer neural networks are a relatively new class of techniques that are mathematically proven to approximate any continuous function arbitrarily well. Radial Basis Function (RBF) networks, which make use of Gaussian activation functions, have also been shown to be universal approximators. In this age of ever-increasing digitization in the storage, processing, analysis, and communication of information, there are numerous applications where one needs to construct a continuously defined function or numerical algorithm to approximate, represent, and reconstruct the given discrete data of a signal. Often one wishes to manipulate the data in a way that requires information not included explicitly in the data, which is done through interpolation and/or extrapolation. Tidal data are a perfect example of a time series, and many statistical techniques have been applied to tidal data analysis and representation; ANNs are a recent addition to these techniques. In the present paper, we describe the time series representation capabilities of a special type of ANN, the Radial Basis Function network, and present the results of tidal data representation using RBF networks. Tidal data analysis and representation is one of the important requirements in marine science for forecasting.
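
The RBF representation described above amounts to fitting a weighted sum of Gaussian basis functions to the observed series. A minimal numpy sketch, with a synthetic tide standing in for real observations, looks like this (centers, width, and constituents are illustrative assumptions).

```python
# Fit a Gaussian RBF expansion to a tidal-height series by linear least squares.
import numpy as np

def rbf_design(t, centers, width):
    """Gaussian RBF design matrix, shape (len(t), len(centers))."""
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2.0 * width**2))

# Synthetic "tide": two constituents plus noise (time in hours)
t = np.linspace(0, 96, 500)
y = 1.2 * np.sin(2 * np.pi * t / 12.42) + 0.4 * np.sin(2 * np.pi * t / 24.0)
y += 0.05 * np.random.default_rng(0).normal(size=t.size)

centers = np.linspace(t.min(), t.max(), 60)       # RBF centers along the time axis
Phi = rbf_design(t, centers, width=2.0)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)       # output-layer weights

y_hat = Phi @ w                                   # reconstructed series
print("RMS error:", np.sqrt(np.mean((y - y_hat) ** 2)))
```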

Keywords: ANN, RBF, Tidal Data.

13841 Complementing Assessment Processes with Standardized Tests: A Work in Progress

Authors: Amparo Camacho

Abstract:

ABET-accredited programs must assess the development of student learning outcomes (SOs) in engineering programs. Different institutions implement different strategies for this assessment, usually designed “in house.” This paper presents a proposal for including standardized tests to complement the ABET assessment model in an engineering college made up of six distinct engineering programs. The engineering college formulated a model of quality assurance in education, to be implemented throughout the six engineering programs, to regularly assess and evaluate the achievement of SOs in each program offered. The model uses diverse techniques and sources of data to assess student performance and to implement improvement actions based on the results of this assessment. The model is called the “Assessment Process Model” and it includes SOs A through K, as defined by ABET. SOs can be divided into two categories: “hard skills” and “professional skills” (soft skills). The first includes abilities such as applying knowledge of mathematics, science, and engineering, and designing and conducting experiments, as well as analyzing and interpreting data. The second category, “professional skills”, includes communicating effectively and understanding professional and ethical responsibility. Within the Assessment Process Model, various tools were used to assess SOs related to both hard and soft skills. The assessment tools designed included rubrics, surveys, questionnaires, and portfolios. In addition to these instruments, the engineering college decided to use tools that systematically gather consistent quantitative data. For this reason, an in-house exam was designed and implemented based on the curriculum of each program. Even though this exam was administered during various academic periods, it is not currently considered standardized. In 2017, the engineering college included three standardized tests: one to assess mathematical and scientific reasoning, and two more to assess reading and writing abilities. With these exams, the college hopes to obtain complementary information that can help better measure the development of both hard and soft skills of students in the different engineering programs. In the first semester of 2017, the three exams were given to three sample groups of students from the six engineering programs, drawn from the first, fifth, and tenth semester cohorts. At the time of submission of this paper, the engineering college has descriptive statistical data and is working with statisticians on a more in-depth and detailed analysis of the sample groups' achievement on the three exams. The overall objective of including standardized exams in the assessment model is to identify more precisely the least developed SOs, in order to define and implement the educational strategies necessary for students to achieve them in each engineering program.

Keywords: Assessment, hard skills, soft skills, standardized tests.

13840 2D Graphical Analysis of Wastewater Influent Capacity Time Series

Authors: Monika Chuchro, Maciej Dwornik

Abstract:

The extraction of meaningful information from an image could be an alternative method for time series analysis. In this paper, we propose a graphical analysis of time series grouped into a table with a colour scale adjusted to the numerical values. The advantages of this method are also discussed. The proposed method is easy to understand and flexible enough to accommodate standard methods of pattern recognition and verification, especially for noisy environmental data.
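
The table-plus-colour-scale idea can be reproduced in a few lines: reshape the series into a calendar-like grid and render it as a heatmap. The daily influent series below is synthetic, and the week-by-weekday layout is an assumption about how one might group the data.

```python
# Reshape a 1-D daily series into a (week x weekday) grid and show it as a heatmap.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
days = 52 * 7
flow = 1000 + 150 * np.sin(2 * np.pi * np.arange(days) / 7) + rng.normal(0, 40, days)

grid = flow.reshape(52, 7)                     # rows = weeks, columns = weekdays
fig, ax = plt.subplots(figsize=(4, 8))
im = ax.imshow(grid, aspect="auto", cmap="viridis")
ax.set_xlabel("day of week")
ax.set_ylabel("week of year")
fig.colorbar(im, ax=ax, label="influent flow [m$^3$/d]")
plt.savefig("influent_heatmap.png", dpi=150)
```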

Keywords: graphical analysis, time series, seasonality, noisy environmental data.

13839 Kinematic Analysis of an Assistive Robotic Leg for Hemiplegic and Hemiparetic Patients

Authors: M.R. Safizadeh, M. Hussein, K. F. Samat, M.S. Che Kob, M.S. Yaacob, M.Z. Md Zain

Abstract:

The aim of this paper is to present the kinematic analysis and mechanism design of an assistive robotic leg for hemiplegic and hemiparetic patients. In this work, the priority is to design and develop a lightweight, effective, single-driver mechanism on the basis of experimental hip and knee angle data for a walking speed of 1 km/h. A cam-follower mechanism with three links is suggested for this purpose. The kinematic analysis is carried out and the results are analysed using the commercial MATLAB software, based on the prototype's link sizes and kinematic relationships. In order to verify the kinematic analysis of the prototype, the kinematic analysis data are compared with the experimental data. The good agreement between them shows that the anthropomorphic design of the lower extremity exoskeleton follows the human walking gait.

Keywords: Kinematic analysis, assistive robotic leg, lower extremity exoskeleton, cam-follower mechanism.

13838 Toward a Use of Ontology to Reinforce Semantic Classification of Messages Based on LSA

Authors: S. Lgarch, M. Khalidi Idrissi, S. Bennani

Abstract:

For better collaboration, asynchronous tools, and particularly discussion forums, are the most widely used thanks to their flexibility in terms of time. To convey only the messages that belong to a theme of interest to the tutor, and thereby help him during his tutoring work, a tool for classifying these messages is indispensable. For this purpose, we have proposed a semantic classification tool for discussion forum messages based on Latent Semantic Analysis (LSA), which includes a thesaurus to organize the vocabulary. The benefits offered by a formal ontology can overcome the insufficiencies that a thesaurus exhibits during its use, which encouraged us to use an ontology in our semantic classifier. In this work we propose the use of some of the functionalities that an OWL ontology offers. We then explain how functionalities like the “ObjectProperty”, “SubClassOf” and “Datatype” properties make our classification more intelligent by integrating new terms. The new terms are generated based on the first terms introduced by the tutor and on the semantic relations described by the OWL formalism.
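
At its core, the LSA step projects TF-IDF message vectors into a low-dimensional latent space and compares them there. The sketch below shows that step with scikit-learn, using toy messages and theme descriptions as stand-ins; the thesaurus/ontology enrichment of terms described above is not reproduced.

```python
# TF-IDF -> truncated SVD (latent space) -> cosine similarity to theme descriptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

messages = ["the integral of x squared", "derivative rules for exams",
            "when is the homework due", "room change for the lecture"]
themes = {"calculus": ["integral derivative limit function"],
          "logistics": ["homework deadline room schedule lecture"]}

vec = TfidfVectorizer()
X = vec.fit_transform(messages + sum(themes.values(), []))
Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

msg_vecs, theme_vecs = Z[: len(messages)], Z[len(messages):]
sims = cosine_similarity(msg_vecs, theme_vecs)
labels = [list(themes)[i] for i in sims.argmax(axis=1)]
print(list(zip(messages, labels)))   # each message mapped to its closest theme
```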

Keywords: Classification of messages, collaborative communication tools, discussion forum, e-learning, formal description, latent semantic analysis, ontology, OWL, semantic relations, semantic web, thesaurus, tutoring.

13837 A Research on the Coordinated Development of Chengdu-Chongqing Economic Circle Under the Background of New Urbanization

Authors: Deng Tingting

Abstract:

The coordinated and integrated development of regions is an inevitable requirement for China to move towards high-quality, sustainable development. As one of the regions with the best economic foundation and the strongest economic strength in western China, the Chengdu-Chongqing economic circle is an area of national importance with strong network connection characteristics, linking the inland hinterland and connecting the western and national urban networks. Its integrated development is therefore of great strategic significance for the rapid and high-quality development of the western region. In the context of new urbanization, this paper takes 16 urban units within the economic circle as the research object and, based on five-year panel data on population, regional economy, and spatial construction and development from 2016 to 2020, uses the entropy method and the Theil index to analyze the three target layers and the causes of the observed differences. The research shows that there are temporal and spatial differences within the Chengdu-Chongqing economic circle, and that there are significant differences between the core cities and the surrounding cities. Therefore, reforming and innovating the regional coordinated development mechanism, breaking administrative barriers, and strengthening the “polar nucleus” radiation function to release the driving force for economic development, especially in the gully areas of the economic development belts, will not only promote coordinated development within the region, but also promote the coordinated and sustainable development of the western region and move it toward a high-quality development path.
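
The Theil index used above quantifies disparity across the urban units; a minimal sketch of its computation on illustrative (hypothetical) per-capita figures follows.

```python
# Theil T index: 0 = perfect equality, larger values = more regional disparity.
import numpy as np

def theil_t(x):
    x = np.asarray(x, dtype=float)
    share = x / x.mean()
    return np.mean(share * np.log(share))

# Hypothetical per-capita GDP (10,000 yuan) for 16 urban units
gdp_per_capita = np.array([9.8, 8.1, 6.5, 6.2, 5.9, 5.7, 5.5, 5.3,
                           5.1, 4.9, 4.8, 4.6, 4.4, 4.2, 4.0, 3.7])
print(f"Theil index: {theil_t(gdp_per_capita):.4f}")
```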

Keywords: Chengdu-Chongqing economic circle, new urbanization, coordinated regional development, Theil Index.

13836 Online Control of Knitted Fabric Quality: Loop Length Control

Authors: Dariush Semnani, Mohammad Sheikhzadeh

Abstract:

A circular knitting machine produces fabric with more than two knitting tools. Variation of yarn tension between different knitting tools causes different loop lengths of stitches during the knitting process. In this research, a new intelligent method is applied to control the loop length of stitches in the various tools based on the ideal shape of the stitches and the real angle of the stitch direction, since differing loop lengths cause stitch deformation and deviation of that angle. To measure the deviation of stitch direction caused by tension variation, an image processing technique was applied to pictures of different fabrics taken under constant front light. The rate of deformation is then translated into the compensation of the loop length cam degree needed to correct the stitch deformation. A fuzzy control algorithm was applied to modify the loop length in the knitting tools. The presented method was tested on knitted fabrics of various structures and yarns. The results show that the presented method is usable for controlling loop length variation between different knitting tools based on stitch deformation, for knitted fabrics with different structures, densities, and yarn types.
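
The last step, mapping a measured stitch-angle deviation to a cam correction through fuzzy rules, can be sketched as a tiny rule base with weighted-average defuzzification. The membership ranges and output corrections below are illustrative assumptions, not the authors' tuned controller.

```python
# A minimal fuzzy rule base: stitch-angle deviation (deg) -> loop-length cam correction.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def cam_correction(deviation_deg):
    # Fuzzify the deviation into three linguistic terms
    neg = tri(deviation_deg, -10, -5, 0)
    zero = tri(deviation_deg, -3, 0, 3)
    pos = tri(deviation_deg, 0, 5, 10)
    # Hypothetical cam-degree corrections associated with each term
    outputs = np.array([-2.0, 0.0, +2.0])
    weights = np.array([neg, zero, pos])
    return float((weights * outputs).sum() / (weights.sum() + 1e-12))  # weighted-average defuzzification

for d in (-6.0, -1.0, 0.0, 2.5, 7.0):
    print(d, "->", round(cam_correction(d), 3), "cam degrees")
```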

Keywords: Circular knitting, Radon transformation, Knitted fabric, Regularity, Fuzzy control.

13835 International Marketing in Business Practice of Small and Medium-Sized Enterprises

Authors: K. Matušínská, Z. Bednarčík, M. Klepek

Abstract:

This paper examines international marketing in the business practice of Czech exporting small and medium-sized enterprises (SMEs) with regard to strategic perspectives. The research focused on Czech exporting SMEs from the Moravian-Silesian region and their behavior on international markets. For the purpose of collecting data, a questionnaire was given to 262 SMEs involved in international business. The statistics utilized in this research included frequency, mean, percentage, and the chi-square test. Data were analyzed using the Statistical Package for the Social Sciences (SPSS) software. The analysis disclosed that there is certain space for improvement in strategic marketing, especially in marketing research, the perception of cultural and social differences, product adaptation, and the usage of marketing communication tools.

Keywords: International Marketing, Marketing Mix, Marketing Research, Small and Medium-sized Enterprises (SMEs), Strategic Marketing.

13834 Multi-Dimensional Concerns Mining for Web Applications via Concept-Analysis

Authors: Carlo Bellettini, Alessandro Marchetto, Andrea Trentini

Abstract:

Web applications have become very complex and crucial, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has therefore focused its attention on Web application design, development, analysis, and testing, by studying and proposing methodologies and tools. This paper proposes an approach to automatic multi-dimensional concern mining for Web applications, based on concept analysis, impact analysis, and token-based concern identification. This approach lets the user analyse and traverse the Web software relevant to a particular concern (concept, goal, purpose, etc.) via multi-dimensional separation of concerns, in order to document, understand and test Web applications. The technique was developed in the context of the WAAT (Web Applications Analysis and Testing) project. A semi-automatic tool to support it is currently under development.

Keywords: Concepts Analysis, Concerns Mining, Multi-Dimensional Separation of Concerns, Impact Analysis.

13833 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories

Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos

Abstract:

Due to the high reliability reached by DNA tests since the 1980s, this kind of test has supported the resolution of a growing number of criminal cases, including old unsolved cases that now have a chance to be solved with this technology. Currently, the use of genetic profiling databases is a typical method to increase the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles for a growing number of samples, which requires time and great storage capacity. Therefore, it is essential to develop methodologies, supported by software tools, capable of organizing the workflow and minimizing the time spent on both biological sample processing and the analysis of genetic profiles. Thus, the present work aims at the development of a software system for forensic genetics laboratories that allows the management of samples, criminal cases, and a local database, minimizes the time spent in the workflow, and helps to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, the workflows, and the requirements of the system have been considered. The system uses HTML, CSS, and JavaScript as Web technologies, with the Node.js platform as the server, which has great efficiency in data input and output. In addition, the data are stored in a relational database (MySQL), which is free, allowing better acceptance by users. The software system developed here brings more agility to the workflow and the analysis of samples, contributing to the rapid insertion of genetic profiles into the national database and to an increased resolution of crimes. The next step of this research is its validation, in order to operate in accordance with current Brazilian national legislation.

Keywords: Database, forensic genetics, genetic analysis, sample management, software solution.

13832 Analysis on the Development and Evolution of China's Territorial Spatial Planning

Authors: He YuanYan

Abstract:

In recent years, China has implemented the reform of land and space planning. As an important public policy, land and space planning plays a vital role in the construction and development of cities. Land and space planning throughout the country is in full swing, but there are still many disputes from all walks of life. The content, scope, and specific implementation process of land and space planning also remain ambiguous, leading to problems in integrating multiple regulations, such as unclear authority, unclear responsibilities, and poor planning results during implementation. Therefore, it is necessary to sort out the development and evolution of domestic and foreign land and space planning, clarify the problems and cruxes of the current situation of China's land and space planning, and identify the obstacles to, and countermeasures for, the implementation of this policy, so as to deepen the understanding of the connotation of land and space planning. It is of great practical significance for planners to correctly understand and clarify the specific contents and methods of land and space planning and to smoothly promote its implementation at all levels.

Keywords: Territorial spatial planning, public policy, land space, overall planning.

13831 Application of Data Envelopment Analysis to Assess Quality Management Efficiency

Authors: Chuen Tse Kuah, Kuan Yew Wong, Farzad Behrouzi

Abstract:

This paper aims to illustrate the application of Data Envelopment Analysis (DEA) as a tool to assess Quality Management (QM) efficiency. A variant of DEA, the slack-based measure (SBM), is used for this purpose. The study finds that DEA is suitable for measuring QM efficiency and for giving improvement suggestions to inefficient QM units.
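
For reference, the slack-based measure is usually written as the following fractional program for a unit o with input vector x_o and output vector y_o (m inputs, q outputs). This is the textbook form commonly attributed to Tone (2001) and may differ in notation from the exact variant used in the paper.

```latex
\[
\rho^{*} \;=\; \min_{\lambda,\ s^{-},\ s^{+}}\;
\frac{1 - \dfrac{1}{m}\sum_{i=1}^{m} s_i^{-}/x_{io}}
     {1 + \dfrac{1}{q}\sum_{r=1}^{q} s_r^{+}/y_{ro}}
\quad \text{s.t.} \quad
x_o = X\lambda + s^{-}, \qquad
y_o = Y\lambda - s^{+}, \qquad
\lambda \ge 0,\; s^{-} \ge 0,\; s^{+} \ge 0 .
\]
```

A unit is SBM-efficient when rho* = 1, which happens exactly when all input slacks s^- and output slacks s^+ are zero; nonzero slacks indicate the improvement suggestions mentioned above.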

Keywords: Quality Management, Data Envelopment Analysis, Slack Based Measure, Efficiency Measurement.

13830 Statistical Analysis of Interferon-γ for the Effectiveness of an Anti-Tuberculous Treatment

Authors: Shishen Xie, Yingda L. Xie

Abstract:

Tuberculosis (TB) is a potentially serious infectious disease that remains a health concern. The Interferon Gamma Release Assay (IGRA) is a blood test used to find out whether an individual is positive or negative for TB infection. This study applies statistical analysis to the clinical interferon-gamma levels of seventy-three subjects diagnosed with pulmonary TB and undergoing anti-tuberculous treatment. Data analysis is performed to determine whether there is a significant decline in the subjects' interferon-gamma levels over a period of six months, and to infer whether the anti-tuberculous treatment is effective.
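
Since the abstract says only "statistical analysis", the sketch below shows one reasonable way to test for a decline between baseline and month six: a paired, non-parametric Wilcoxon signed-rank test on simulated interferon-gamma values. Both the data and the choice of test are assumptions, not the authors' procedure.

```python
# Paired non-parametric test for a decline over six months on simulated values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
baseline = rng.lognormal(mean=1.0, sigma=0.6, size=73)      # IU/mL, hypothetical
month_6 = baseline * rng.uniform(0.5, 1.1, size=73)         # simulated decline

stat, p = stats.wilcoxon(baseline, month_6, alternative="greater")
print(f"Wilcoxon signed-rank: statistic={stat:.1f}, p={p:.4g}")
```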

Keywords: Data analysis, interferon gamma release assay, statistical methods, tuberculosis infection.

13829 Urban Areas Management in Developing Countries: Analysis of the Urban Areas Crossed with Risk of Storm Water Drains, Aswan-Egypt

Authors: Omar Hamdy, Schichen Zhao, Hussein Abd El-Atty, Ayman Ragab, Muhammad Salem

Abstract:

One of the riskiest areas in Aswan is Abouelreesh, which suffers from flood disasters as heavy deluges inundate urban areas, causing considerable damage to buildings and infrastructure. Moreover, the main problem is urban sprawl towards this risky area. This paper aims to identify the urban areas located in the risk zones prone to flash floods. Analyzing this phenomenon needs a lot of data to ensure satisfactory results; however, in this case the official data and field data were limited, and therefore free sources of satellite data were used. The study used ArcGIS tools to obtain the storm water drain network by analyzing DEM files. Additionally, historical imagery in Google Earth was studied to determine the age of each building. The last step was to overlay the urban area layer and the storm water drain layer to identify the vulnerable areas. The results of this study would help urban planners and government officials estimate disaster risk and develop preliminary plans to recover the risky area, especially the urban areas located in torrent paths.
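
The final overlay step can also be done outside ArcGIS; the sketch below buffers the extracted drainage lines and intersects them with building footprints using geopandas. The layer names, CRS, and buffer distance are assumptions, not the study's actual parameters.

```python
# Flag buildings that fall within an assumed 50 m corridor around the drainage lines.
import geopandas as gpd

buildings = gpd.read_file("abouelreesh_buildings.shp")     # hypothetical layer
drains = gpd.read_file("storm_water_drains.shp")           # e.g. lines derived from the DEM analysis

risk = drains.to_crs(epsg=32636).buffer(50)                # 50 m corridor, UTM zone 36N
risk_zone = gpd.GeoDataFrame(geometry=risk, crs="EPSG:32636")

at_risk = gpd.sjoin(buildings.to_crs(epsg=32636), risk_zone, predicate="intersects")
print(len(at_risk), "buildings intersect the drainage risk corridor")
```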

Keywords: Risk area, DEM, storm water drains, GIS.

13828 Digital Marketing Maturity Models: Overview and Comparison

Authors: Elina Bakhtieva

Abstract:

The variety of available digital tools, strategies and activities might confuse and disorient even an experienced marketer. This applies in particular to B2B companies, which are usually less flexible in taking up digital technology than B2C companies. B2B companies lack a framework that corresponds to the specifics of B2B business and that helps to evaluate a company's capabilities and to choose an appropriate path. A B2B digital marketing maturity model helps to fill this gap. However, modern marketing offers no widely approved digital marketing maturity model, and thus some marketing institutions provide their own tools. The purpose of this paper is to build an optimized B2B digital marketing maturity model based on a SWOT (strengths, weaknesses, opportunities, and threats) analysis of existing models. The study provides an analytical review of the existing digital marketing maturity models with open access. The results of the research are twofold. First, the SWOT analysis outlines the main advantages and disadvantages of the existing models. Second, the strengths of the existing digital marketing maturity models help to identify the main characteristics and the structure of an optimized B2B digital marketing maturity model. The findings indicate that only one out of the three analyzed models could be used as a stand-alone tool. This study is among the first to examine the use of maturity models in digital marketing. It helps businesses to choose the most effective of the existing digital marketing maturity models, and it creates a base for future research on digital marketing maturity models. The study contributes to the emerging B2B digital marketing literature by providing a SWOT analysis of the existing digital marketing maturity models and by suggesting the structure and main characteristics of an optimized B2B digital marketing maturity model.

Keywords: B2B digital marketing strategy, digital marketing, digital marketing maturity model, SWOT analysis.

13827 Using Business Intelligence Capabilities to Improve the Quality of Decision-Making: A Case Study of Mellat Bank

Authors: Jalal Haghighat Monfared, Zahra Akbari

Abstract:

Today, business executives need useful information to make better decisions. Banks have also been using information tools so that they can direct the decision-making process and achieve their desired goals by rapidly extracting information from various sources with the help of business intelligence. This research investigates whether there is a relationship between the quality of decision making and the business intelligence capabilities of Mellat Bank. Each of the factors studied is divided into several components, and these components and their relationships are measured by a questionnaire. The statistical population of this study consists of all managers and experts of Mellat Bank's general departments (190 people) who use business intelligence reports. The sample size of 123 was determined randomly by statistical methods. Relevant statistical inference has been used for data analysis and hypothesis testing. In the first stage, the normality of the data was investigated using the Kolmogorov-Smirnov test, and in the next stage, the construct validity of both variables and their resulting indexes was verified using confirmatory factor analysis. Finally, the research hypotheses were tested using structural equation modeling and Pearson's correlation coefficient. The results confirmed the existence of a positive relationship between decision quality and business intelligence capabilities in Mellat Bank. Among the various capabilities, including data quality, integration with other systems, user access, flexibility, and risk management support, the flexibility of the business intelligence system was the most strongly correlated with the dependent variable of the research. This shows that Mellat Bank needs to pay more attention to choosing business intelligence systems with high flexibility in terms of the ability to produce custom-formatted reports. After flexibility, the quality of the data in the business intelligence systems showed the strongest relationship with the quality of decision making. Therefore, improving data quality, including the source of the data (internal or external), the type of data (quantitative or qualitative), the credibility of the data, and the perceptions of those who use the business intelligence system, improves the quality of decision making at Mellat Bank.

Keywords: Business intelligence, business intelligence capability, decision making, decision quality.

13826 Development and Characterization of Bio-Tribological, Nano-Multilayer Coatings for Medical Tools Application

Authors: L. Major, J. M. Lackner, M. Dyner, B. Major

Abstract:

The development of a new generation of bio-tribological, multilayer coatings opens an avenue for the fabrication of future high-tech functional surfaces. In the presented work, nano-composite Cr/CrN + [Cr/a-C:H implanted with metallic nanocrystals] multilayer coatings have been developed for the surface protection of medical tools. The thin films were fabricated by a hybrid Pulsed Laser Deposition technique. Complex microstructure analysis of the nano-multilayer coatings, subjected to mechanical and biological tests, was performed by means of transmission electron microscopy (TEM). The microstructure characterization revealed a layered arrangement of Cr23C6 nanoparticles in the multilayer structure. The influence of the deposition conditions on the bio-tribological properties of the coatings was studied. The bio-tests, performed on fibroblasts, were used as a screening tool for the analyzed nano-multilayer coatings before they could be deposited on medical tools. The mechanical properties of the coatings were investigated by means of a ball-on-disc test, the microhardness was measured using a Berkovich indenter, and the scratch adhesion test was performed using a Rockwell indenter. From the bio-tribological point of view, the C106_1 material had the optimal properties.

Keywords: Bio-tribological coatings, cell-material interaction, hybrid PLD, tribology.

13825 Design of Reconfigurable Parasitic Antenna for Single RF Chain MIMO Systems

Authors: C. Arunachalaperumal, B. Chandru, J. M. Mathana

Abstract:

In recent years, parasitic antennas have played a major role in MIMO systems because of their gain and spectral efficiency. In this paper, a single RF chain MIMO transmitter is designed using a reconfigurable parasitic antenna. Spatial Modulation (SM) is a recently proposed MIMO scheme that activates only one antenna at a time. SM entirely avoids inter-channel interference (ICI) and inter-antenna synchronization (IAS), and requires only a single RF chain at the transmitter: a single transmit antenna is switched on for data transmission while all the other antennas are kept silent. The purpose of the parasitic elements is to change the radiation pattern of the radio waves emitted from the driven element, directing them in one direction and hence introducing transmit diversity. A diode is connected between the patch and ground; by changing its state (ON or OFF), a parasitic element acts as a reflector or a director and is also capable of steering the azimuth and elevation angles. This is achieved by changing the input impedance of each parasitic element through the single RF chain. The switching of the diodes selects a single parasitic antenna for spatial modulation. The antenna is expected to achieve maximum gain with the desired efficiency.

Keywords: MIMO system, single RF chain, Parasitic Antenna.

13824 CVOIP-FRU: Comprehensive VoIP Forensics Report Utility

Authors: Alejandro Villegas, Cihan Varol

Abstract:

Voice over Internet Protocol (VoIP) products are an emerging technology that can contain forensically important information about criminal activity. Even without the user names and passwords, this forensically important information can still be gathered by investigators. Although there are a few VoIP forensic investigative applications available in the literature, most of them are designed specifically to collect evidence from the Skype product. Therefore, in order to assist law enforcement with collecting forensically important information from a variety of Betamax VoIP tools, the CVOIP-FRU framework is developed. CVOIP-FRU provides a data gathering solution that retrieves usernames, contact lists, and call and SMS logs from Betamax VoIP products. It is a scripting utility that searches for data within the registry, the logs, and the user roaming profiles in Windows and Mac OS X operating systems. Subsequently, it parses the output into readable text and HTML formats. One advantage of CVOIP-FRU over other applications is that, thanks to its intelligent data filtering capabilities and cross-platform scripting back end, it is expandable to include other VoIP solutions as well. Overall, this paper presents the exploratory analysis performed in order to find the key data paths and locations, the development stages of the framework, and the empirical testing and quality assurance of CVOIP-FRU.

Keywords: Betamax, digital forensics, report utility, VoIP, VoIP Buster, VoIPWise.

13823 Determination of Critical Source Areas for Sediment Loss: Sarrath River Basin, Tunisia

Authors: Manel Mosbahi

Abstract:

The risk of water erosion is one of the main environmental concerns in the southern Mediterranean regions. Thus, the quantification of soil loss is an important issue for soil and water conservation managers. The objective of this paper is to examine the applicability of the Soil and Water Assessment Tool (SWAT) model in the Sarrath river catchment, in the north of Tunisia, and to identify the most vulnerable areas in order to help managers implement an effective management program. The spatial analysis of the results shows that 7% of the catchment experiences very high erosion risk and needs suitable conservation measures to be adopted on a priority basis. The spatial distribution of the remaining erosion risk classes was estimated at 3% high, 5.4% tolerable, and 84.6% low. Among the 27 delineated sub-catchments, only four fall under the high and very high soil loss groups and two fall under the moderate soil loss group, whereas the other sub-catchments fall under the low soil loss group.
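
Grouping sub-catchments into such soil-loss classes is a simple thresholding step; the sketch below illustrates it with hypothetical SWAT outputs and hypothetical class breaks (the paper's actual thresholds are not given in the abstract).

```python
# Classify sub-catchment soil loss (t/ha/yr) into erosion-risk classes.
import numpy as np

soil_loss = np.array([2.1, 0.8, 31.0, 5.5, 48.2, 1.2, 7.9, 3.3])   # hypothetical SWAT output
breaks = [5, 10, 25, 50]                                            # assumed class boundaries
labels = np.array(["low", "tolerable", "moderate", "high", "very high"])
classes = labels[np.digitize(soil_loss, breaks)]
for sub, (loss, cls) in enumerate(zip(soil_loss, classes), start=1):
    print(f"sub-catchment {sub}: {loss:5.1f} t/ha/yr -> {cls}")
```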

Keywords: Critical source areas, Erosion risk, SWAT model.

13822 Study of Qualitative and Quantitative Metric for Pixel Factor Mapping and Extended Pixel Mapping Method

Authors: Indradip Banerjee, Souvik Bhattacharyya, Gautam Sanyal

Abstract:

In this paper, an approach is presented to investigate the performance of Pixel Factor Mapping (PFM) and the Extended Pixel Mapping Method (Extended PMM) through qualitative and quantitative analyses. The methods are tested against a number of well-known image similarity metrics and statistical distribution techniques. PFM has been evaluated in the spatial domain as well as the frequency domain, and the Extended PMM has been evaluated in the spatial domain, over a large set of images available on the internet.
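
Two of the most common full-reference similarity metrics used in such quantitative evaluations are PSNR and SSIM. The sketch below computes both for a synthetic cover/stego pair, as an assumed stand-in for the "well-known image similarity metrics" mentioned above.

```python
# Compute PSNR and SSIM between an original image and an LSB-perturbed copy.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

rng = np.random.default_rng(3)
cover = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)     # original image
stego = cover.copy()
stego ^= rng.integers(0, 2, size=cover.shape, dtype=np.uint8)     # flip some least significant bits

psnr = peak_signal_noise_ratio(cover, stego)
ssim = structural_similarity(cover, stego)
print(f"PSNR = {psnr:.2f} dB, SSIM = {ssim:.4f}")
```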

Keywords: Qualitative, quantitative, PFM, Extended PMM.
