Search results for: Statistical tool.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2772

552 Combining the Deep Neural Network with the K-Means for Traffic Accident Prediction

Authors: Celso L. Fernando, Toshio Yoshii, Takahiro Tsubota

Abstract:

Understanding the causes of road accidents and predicting their occurrence is key to preventing deaths and serious injuries from road accident events. Traditional statistical methods such as Poisson and logistic regressions have been used to find the association between traffic and environmental factors and accident occurrence; more recently, the artificial neural network (ANN), a computational technique that learns from historical data to make more accurate predictions, has emerged. Despite its ability to make accurate predictions, the ANN has difficulty dealing with a highly unbalanced distribution of attribute patterns in the training dataset; in such circumstances, the ANN treats the minority group as noise. However, in real-world data, the minority group is often the group of interest; e.g., in road traffic accident data, the accident events are the group of interest. This study proposes combining k-means clustering with the ANN to improve the predictive ability of the neural network model by alleviating the effect of the unbalanced distribution of attribute patterns in the training dataset. The results show that the proposed method improves the ability of the neural network to make predictions on a dataset with a highly unbalanced distribution of attribute patterns; on an evenly distributed dataset, however, the proposed method performs almost like a standard neural network.
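
A minimal sketch of one way such a k-means/ANN combination can be realized for an unbalanced binary problem (accident vs. no-accident): the majority class is compressed to its k-means centroids before training, so the classes the ANN sees are roughly balanced. The exact combination used by the authors is not detailed in the abstract; this illustration uses synthetic data and scikit-learn.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for accident data: about 3% "accident" events.
X, y = make_classification(n_samples=5000, n_features=10, weights=[0.97, 0.03],
                           random_state=0)
X_min, X_maj = X[y == 1], X[y == 0]

# Replace the majority class by as many k-means centroids as minority samples.
km = KMeans(n_clusters=len(X_min), n_init=10, random_state=0).fit(X_maj)
X_bal = np.vstack([km.cluster_centers_, X_min])
y_bal = np.hstack([np.zeros(len(X_min)), np.ones(len(X_min))])

ann = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
ann.fit(X_bal, y_bal)
print("recall on the rare (accident) class:",
      (ann.predict(X_min) == 1).mean().round(3))
```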

Keywords: Accident risk estimation, artificial neural network, deep learning, k-means, road safety.

551 Durability Study of Partially Saturated Fly Ash Blended Cement Concrete

Authors: N. Shafiq, M. F. Nuruddin, S. C. Chin

Abstract:

This paper presents the experimental results of an investigation of various properties related to the durability and long-term performance of mortars made of fly ash (FA) blended cement and ordinary Portland cement (OPC). The properties investigated in the experimental program include: equilibration of specimens at different relative humidities, total porosity, compressive strength, chloride permeability index, and electrical resistivity. Fly ash blended cement mortar specimens exhibited 10% to 15% lower porosity when measured at equilibrium in different relative humidities compared to specimens made of OPC mortar, which resulted in 6% to 8% higher compressive strength of the FA blended cement mortar specimens. The effects of ambient relative humidity during sample equilibration on porosity and strength development were also studied. For specimens equilibrated at higher relative humidity, such as 75%, the total porosity of the different mortar specimens was 35% to 50% less than the porosity of samples equilibrated at 12% relative humidity, consequently leading to higher compressive strengths of these specimens. A valid statistical correlation between the values of compressive strength, porosity and degree of saturation was obtained. Measured values of the chloride permeability index of fly ash blended cement mortar were one-fourth to one-sixth of those measured for OPC mortar specimens, which indicates high resistance against chloride ion penetration in FA blended cement specimens and hence a highly durable mortar.

Keywords: chloride permeability index, equilibrium condition, electrical resistivity, fly ash

550 Residual Dipolar Couplings in NMR Spectroscopy Using Lanthanide Tags

Authors: Elias Akoury

Abstract:

Nuclear Magnetic Resonance (NMR) spectroscopy is an indispensable technique used in the structure determination of small molecules and macromolecules to study their physical properties, elucidate characteristic interactions, and follow dynamic and thermodynamic processes. Quantum mechanics provides the theoretical description of NMR spectroscopy and the treatment of the dynamics of nuclear spin systems. The phenomenon of residual dipolar couplings (RDCs) has become a routine tool for accurate structure determination by providing global orientation information of magnetic dipole-dipole interaction vectors within a common reference frame. This offers access to distance-independent angular information and insights into local relaxation. The measurement of RDCs requires an anisotropic orientation medium so that the molecules partially align along the magnetic field. This can be achieved by introducing liquid crystals or by attaching a paramagnetic center. Although anisotropic paramagnetic tags continue to mark achievements in the biomolecular NMR of large proteins, their application to small organic molecules remains uncommon. Here, we propose a strategy for the synthesis of a lanthanide tag and the measurement of RDCs in organic molecules using paramagnetic lanthanide complexes.

Keywords: Lanthanide Tags, NMR spectroscopy, residual dipolar coupling, quantum mechanics of spin dynamics.

549 Half Model Testing for Canard of a Hybrid Buoyant Aircraft

Authors: A. U. Haque, W. Asrar, A. A. Omar, E. Sulaeman, J. S. Mohamed Ali

Abstract:

Due to interference effects, the intrinsic aerodynamic parameters obtained from individual component testing are always fundamentally different from those obtained from complete-model testing. The considerations and limitations of such testing need to be taken into account in any design work related to the component build-up method. In this paper, a scaled model of the straight rectangular canard of a hybrid buoyant aircraft is tested at 50 m/s in the IIUM-LSWT (Low Speed Wind Tunnel). The model and its attachment to the balance are kept rigid so that the results are free from aeroelastic distortion. Based on the velocity profile of the test section's floor, the height of the model is kept equal to the corresponding boundary layer displacement. Balance measurements provide valuable but limited information on the overall aerodynamic behavior of the model. The zero-lift coefficient is obtained at -2.2°, and the corresponding drag coefficient was found to be less than that at zero angle of attack. As part of the validation of a low-fidelity tool, the predicted lift coefficient curve was checked against the experimental data; except for the zero-lift value, the overall trend under-predicted the lift coefficient. Based on this comparative study, a correction factor of 1.36 is proposed for the lift curve slope obtained from the panel method.
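
A small numeric illustration of applying the proposed correction to a panel-method lift curve; only the factor 1.36 and the zero-lift angle of -2.2° come from the study, while the panel-method slope value below is an assumed placeholder.

```python
import numpy as np

a_panel = 0.065                 # assumed panel-method lift curve slope, per degree
a_corrected = 1.36 * a_panel    # correction factor proposed in the study
alpha_zero_lift = -2.2          # degrees, from the wind tunnel test

alpha = np.arange(-4.0, 11.0, 2.0)                 # angle-of-attack sweep
CL = a_corrected * (alpha - alpha_zero_lift)       # corrected lift coefficient
for a_deg, cl in zip(alpha, CL):
    print(f"alpha = {a_deg:5.1f} deg  ->  CL = {cl:.3f}")
```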

Keywords: Wind tunnel testing, boundary layer displacement, lift curve slope, canard, aerodynamics.

548 Data-driven Multiscale Tsallis Complexity: Application to EEG Analysis

Authors: Young-Seok Choi

Abstract:

This work proposes a data-driven, multiscale quantitative measure to reveal the underlying complexity of the electroencephalogram (EEG), applied to a rodent model of hypoxic-ischemic brain injury and recovery. Motivated by the fact that real EEG recordings are nonlinear and non-stationary over different frequencies or scales, an approach more suitable than the conventional single-scale tools is needed for analyzing EEG data. Here, we present a new framework of complexity measures that considers changing dynamics over multiple oscillatory scales. The proposed multiscale complexity is obtained by calculating entropies of the probability distributions of the intrinsic mode functions extracted by the empirical mode decomposition (EMD) of the EEG. To quantify EEG recordings of a rat model of hypoxic-ischemic brain injury following cardiac arrest, the multiscale version of the Tsallis entropy is examined. To validate the proposed complexity measure, actual EEG recordings from rats (n=9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Experimental results demonstrate that the use of the multiscale Tsallis entropy leads to better discrimination of the injury levels and improved correlation with the neurological deficit evaluation 72 hours after cardiac arrest, thus suggesting an effective prognostic metric.
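
A minimal sketch of the proposed pipeline: decompose a trace into intrinsic mode functions with EMD and compute a Tsallis entropy for the amplitude distribution of each IMF. The EMD implementation is assumed to be the third-party PyEMD package ("EMD-signal" on PyPI), and the surrogate signal and q value are illustrative, not the study's data.

```python
import numpy as np
from PyEMD import EMD   # assumed dependency: pip install EMD-signal

def tsallis_entropy(x, q=2.0, bins=64):
    """Tsallis entropy S_q = (1 - sum(p^q)) / (q - 1) of a signal's histogram."""
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

t = np.linspace(0, 4, 2000)
eeg = (np.sin(2 * np.pi * 8 * t) + 0.5 * np.sin(2 * np.pi * 25 * t)
       + 0.3 * np.random.randn(t.size))              # surrogate EEG trace

imfs = EMD()(eeg)                                     # oscillatory scales
for k, imf in enumerate(imfs, 1):
    print(f"IMF {k}: Tsallis entropy (q=2) = {tsallis_entropy(imf):.3f}")
```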

Keywords: Electroencephalogram (EEG), multiscale complexity, empirical mode decomposition, Tsallis entropy.

547 Study of the Optical Properties of Glutathione-Capped Gold Nanoparticles Using a Linker (MHDA) by Fourier Transform Infrared Spectroscopy and Surface Enhanced Raman Scattering

Authors: A. Deręgowska, J. Depciuch, R. Wojnarowska, J. Polit, D. Broda, H. Nechai, M. Gonchar, and E. Sheregii

Abstract:

16-Mercaptohexadecanoic acid (MHDA) and the tripeptide glutathione conjugated with gold nanoparticles (Au-NPs) are characterized by Fourier Transform Infrared (FTIR) spectroscopy combined with Surface-Enhanced Raman Scattering (SERS) spectroscopy. The Surface Plasmon Resonance (SPR) technique based on FTIR spectroscopy has become an important tool in biophysics and is promising for the study of organic compounds. The FTIR spectrum of MHDA shows a line at 2500 cm⁻¹ attributed to the thiol group, which is modified by the presence of Au-NPs, suggesting the formation of a bond between the thiol group and gold. Peaks originating from characteristic chemical groups can also be observed. A Raman spectrum of the same sample is also promising. Our preliminary experiments confirm that the SERS effect takes place for MHDA connected with Au-NPs and enables us to detect a small number (fewer than 10⁶ cm⁻²) of MHDA molecules. The combination of the FTIR and SERS spectroscopy methods enables the study of the optical properties of Au-NPs and immobilized biomolecules in the context of bio-nano-sensors.

Keywords: Glutathione, gold nanoparticles, Fourier transform infrared spectroscopy, MHDA, surface-enhanced Raman scattering.

546 Earthquake Vulnerability and Repair Cost Estimation of Masonry Buildings in the Old City Center of Annaba, Algeria

Authors: Allaeddine Athmani, Abdelhacine Gouasmia, Tiago Ferreira, Romeu Vicente

Abstract:

Seismic risk mitigation for the old building stock is truly essential in Algerian urban areas, particularly those located in seismic-prone regions, such as Annaba city, where the old buildings present high levels of degradation with no seismic strengthening or rehabilitation. In this sense, the present paper approaches the seismic vulnerability assessment of old masonry building stocks through the adaptation of a simplified methodology developed for a European context similar to that of Annaba city, Algeria. This method is therefore used for the first-level seismic vulnerability assessment of the masonry building stock of the old city center of Annaba. The methodology is based on a vulnerability index that is suitable for the evaluation of damage and for the creation of large-scale loss scenarios. Over 380 buildings were evaluated in accordance with the referred methodology, and the results obtained were then integrated into a Geographical Information System (GIS) tool. Such results can be used by the Annaba city council for supporting management decisions, based on a global view of the site under analysis, leading to more accurate and faster decisions for risk mitigation strategies and rehabilitation plans.
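
A schematic illustration of how a vulnerability-index method aggregates scored building parameters into a single index per building; the parameter count, class scores and weights below are hypothetical, not the values of the methodology adapted for Annaba.

```python
MAX_SCORE = 50                         # assumed score of the worst vulnerability class
WEIGHTS = [1.0, 0.5, 1.5, 0.75, 1.0]   # assumed relative importance of 5 parameters

def vulnerability_index(scores, weights=WEIGHTS, max_score=MAX_SCORE):
    """Weighted sum of the surveyed parameter scores, normalized to 0-100."""
    raw = sum(s * w for s, w in zip(scores, weights))
    worst = sum(max_score * w for w in weights)
    return 100.0 * raw / worst

building_scores = [20, 50, 5, 20, 35]  # scores assigned during a field survey
iv = vulnerability_index(building_scores)
print(f"vulnerability index I_v = {iv:.1f} / 100")
# Repeating this per building and attaching I_v to each footprint is the kind
# of attribute the GIS damage-scenario layer is built from.
```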

Keywords: Damage scenarios, masonry buildings, old city center, seismic vulnerability, vulnerability index.

545 Identification of the Electronic City Application Obstacles in Iran

Authors: E. Asgharizadeh, M. Ajalli Geshlajoughi, S. R. Safavi Mirmahalleh

Abstract:

The amazing development of information technology, the expansion of communications and the internet, the city managers' need for new ideas to run the city, and higher citizen participation encourage us to complete the electronic city as soon as possible. The foundations of this electronic city are in information technology. People's participation in metropolitan management is a crucial topic, and information technology does not impede it; it can improve the populace's participation and the interactions between the citizens and the city managers. Citizens can offer their ideas, beliefs and votes on topical matters through digital mass media based on the internet and computer networks, and receive appropriate replies and services. They can participate in urban projects by becoming aware of the city's views. The most significant challenges are as follows: information and communication management, changing citizens' views, and legal and administrative documents. The obstacles to the electronic city have been identified in this research. The required data were gathered through questionnaires, in order to identify the barriers, from a statistical community comprising specialists and practitioners of the ministry of information technology and communication and the municipal information technology organization. The conclusions demonstrate that the prioritized barriers to electronic city application in Iran are as follows: support problems (non-financial), behavioral, cultural and educational problems, security, legal and licensing problems, hardware, terminological and infrastructural constraints, and software and financial problems.

Keywords: Electronic city, urban management, populace's participation, electronic government, electronic services, electronic organization, electronic infrastructure.

544 Aspects to Motivate Users of a Design Engineering Wiki to Share Their Knowledge

Authors: Regine W. Vroom, Lysanne E. Vossen, Anoek M. Geers

Abstract:

Industrial design engineering is an information- and knowledge-intensive job. Although Wikipedia offers a lot of this information, design engineers are better served by a wiki tailored to their job, offering information in a compact manner and functioning as a design tool. For that reason, WikID has been developed. However, an active user community is essential for the viability of a wiki. The main subject of this paper is a study of the influence of the communication and the contents of WikID on users' willingness to contribute. First, the theory on a website's first impression, general usability guidelines and user motivation in online communities is studied. Using this theory, the aspects of the current site are analyzed for their suitability. The results have been verified with a questionnaire amongst 66 industrial design engineers (or industrial design engineering students). The main conclusion is that design engineers welcome the existence of WikID and its knowledge structure (taxonomy), but this structure has not become clear without guidance. In other words, the knowledge structure is very helpful for inspiring and guiding design engineers through their tailored knowledge domain in WikID, but this taxonomy has to be better communicated on the main page. In addition, the main page needs to be fitted more closely to the target group's preferences.

Keywords: Industrial Design Engineering Knowledge, Semantic Wiki, User Willingness to Contribute Knowledge to a Wiki, Influence of Website Content on User Activation.

543 Geovisualization of Tourist Activity Travel Patterns Using 3D GIS: An Empirical Study of Tamsui, Taiwan

Authors: Meng-Lung Lin, Chien-Min Chu, Chung-Hung Tsai, Chih-Cheng Chen, Chen-Yuan Chen

Abstract:

The study of tourist activities and the mapping of their routes in space and time has become an important issue in tourism management. Here we represent space-time paths for the tourism industry by visualizing individual tourist activities and the paths followed using a 3D Geographic Information System (GIS). Considerable attention has been devoted to the measurement of accessibility to shopping, eating, walking and other services at the tourist destination. It turns out that GIS is a useful tool for studying the spatial behaviors of tourists in the area. GIS is especially advantageous for space-time potential path area measures, in particular for the accurate visualization of possible paths through existing city road networks. This study seeks to apply space-time concepts with a detailed street network map obtained from Google Maps to measure tourist paths both spatially and temporally. These paths are further determined based on data obtained from map questionnaires regarding the trip activities of 40 individuals. The analysis of the data makes it possible to determine the locations of the more popular paths. The results can be visualized using 3D GIS to show the areas and potential activity opportunities accessible to tourists during their travel time.

Keywords: Tourist activity analysis, space-time path, GIS, geovisualization, activity-travel pattern.

542 Using a Fuzzy Logic Decision Support System to Predict the Lifted Weight for Students in a Weightlifting Class

Authors: Ahmed Abdulghani Taha, Mohammad Abdulghani Taha

Abstract:

This study aims to use the body fat percentage (%BF) and body mass index (BMI) as input parameters of a fuzzy logic decision support system that properly predicts the weight each student in a weightlifting class should lift according to his abilities, instead of the traditional manner. The sample included 53 male students (age = 21.38 ± 0.71 yrs, height (Hgt) = 173.17 ± 5.28 cm, body weight (BW) = 70.34 ± 7.87 kg, body mass index (BMI) = 23.42 ± 2.06 kg.m-2, fat mass (FM) = 9.96 ± 3.15 kg and fat percentage (%BF) = 13.98 ± 3.51%) who took the weightlifting class for credit and varied in BW, Hgt, BMI and FM. BMI and %BF were taken as the input parameters of the fuzzy logic system, whereas the output parameter was the lifted weight (LW). There were statistical differences between LW values before and after using fuzzy logic (Diff 3.55 ± 2.21, P > 0.001). The percentages of the LW categories proposed by fuzzy logic were: 3.77% of students to lift 1.0 times their body weight; 50.94% to lift 0.95 times their body weight; 33.96% to lift 0.9 times their body weight; 3.77% to lift 0.85 times their body weight; and 7.55% to lift 0.8 times their body weight. The study concluded that the characteristic changes in body composition experienced by students undergoing weightlifting can be utilized alongside the fuzzy logic decision support system to determine workloads consistent with the abilities of the students.
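
A toy sketch of the kind of fuzzy inference described: BMI and %BF are fuzzified with triangular membership functions, a small rule base maps them to a lifted-weight factor, and a weighted average defuzzifies the result. The membership breakpoints, rules and output levels are all assumptions, not the authors' actual system.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def lifted_weight_factor(bmi, bf):
    bmi_low, bmi_mid, bmi_high = tri(bmi, 16, 19, 22), tri(bmi, 20, 23, 26), tri(bmi, 24, 28, 35)
    bf_low,  bf_mid,  bf_high  = tri(bf, 5, 9, 13),    tri(bf, 11, 14, 17),   tri(bf, 15, 20, 30)
    # rule strength (min of antecedents) -> output level (fraction of body weight)
    rules = [
        (min(bmi_mid,  bf_low),  1.00),
        (min(bmi_mid,  bf_mid),  0.95),
        (min(bmi_low,  bf_mid),  0.90),
        (min(bmi_high, bf_mid),  0.90),
        (min(bmi_high, bf_high), 0.85),
        (min(bmi_low,  bf_high), 0.80),
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.9    # fallback if no rule fires

bw, bmi, bf = 70.3, 23.4, 14.0          # sample student close to the group means
factor = lifted_weight_factor(bmi, bf)
print(f"suggested lifted weight: {factor:.2f} x BW = {factor * bw:.1f} kg")
```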

Keywords: Fuzzy logic, body mass index, body fat percentage, weightlifting.

541 Perception of Secondary School Students on Computer Education in the Federal Capital Territory (FCT-Abuja), Nigeria

Authors: Salako Emmanuel Adekunle

Abstract:

Computer education refers to the knowledge and ability to use computers and related technology efficiently, with a range of skills covering levels from basic use to advanced. Computers continue to make an ever-increasing impact on all aspects of human endeavour, such as education. Given the numerous benefits of computer education, what are the insights of students on computer education? This study investigated the perception of senior secondary school students on computer education in the Federal Capital Territory (FCT), Abuja, Nigeria. A sample of 7500 senior secondary school students, from one hundred (100) private and fifty (50) public schools within the FCT, was involved in the study; they were selected using a simple random sampling technique. A questionnaire [PSSSCEQ] was developed and validated through expert judgement, and a reliability coefficient of 0.84 was obtained. It was used to gather relevant data on computer education. The findings confirmed that students in the FCT had a positive perception of computer education. Some factors that affect students' perception of computer education were identified. The null hypotheses were tested using t-test and ANOVA statistical analyses at the 0.05 level of significance. Based on these findings, some recommendations were made, including: competent teachers should be employed in all secondary schools to help students acquire relevant knowledge in computer education; technological support should be provided to all secondary schools to help the users (students) solve specific problems in computer education; and financial support should be provided to procure computer facilities that will enhance the teaching and learning of computer education.
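
A brief sketch of the kind of hypothesis tests reported (t-test and ANOVA at the 0.05 level), run with scipy on made-up Likert-scale perception scores; the numbers and groupings are placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
private = rng.normal(4.1, 0.6, 300)   # perception scores, private schools (hypothetical)
public = rng.normal(3.9, 0.7, 150)    # perception scores, public schools (hypothetical)

t, p = stats.ttest_ind(private, public, equal_var=False)
print(f"t-test: t = {t:.2f}, p = {p:.4f}, significant at 0.05: {p < 0.05}")

# One-way ANOVA across three hypothetical class levels.
ss1 = rng.normal(4.0, 0.6, 100)
ss2 = rng.normal(4.1, 0.6, 100)
ss3 = rng.normal(3.8, 0.6, 100)
F, p = stats.f_oneway(ss1, ss2, ss3)
print(f"ANOVA: F = {F:.2f}, p = {p:.4f}, significant at 0.05: {p < 0.05}")
```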

Keywords: Computer education, perception, secondary school, students.

540 Finding the Pareto Optimal Front for the Multi-Mode Time, Cost, Quality Trade-off in Project Scheduling

Authors: H. Iranmanesh, M. R. Skandari, M. Allahverdiloo

Abstract:

Project managers are ultimately responsible for the overall characteristics of a project, i.e. they should deliver the project on time, with minimum cost and with maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and they will benefit from any scientific decision support tool. Our work tries to determine a set of optimal solutions (rather than a single optimal solution) from which the project manager can select the preferred choice for running the project. In this paper, the project scheduling problem notated as (1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem is multi-objective, and the purpose is to find the Pareto optimal front of the time, cost and quality of a project (curve:quality,time,cost) whose activities belong to a start-to-finish activity relationship network (cpm) and can be carried out in different possible modes (mu) that are non-continuous, i.e. discrete (disc), each with a different cost, time and quality. The project is constrained by a non-renewable resource, i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic is developed to solve it, based on a version of the genetic algorithm specially adapted to multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.
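
A minimal sketch of the Pareto-dominance step underlying any such multi-objective search: time and cost are minimized, quality is maximized, and the non-dominated set is extracted. The random candidate schedules below are stand-ins for the solutions an algorithm such as FastPGA would evaluate; they are not the sample project's data.

```python
import random

def dominates(a, b):
    """a dominates b if it is no worse in all objectives and strictly better in one."""
    no_worse = (a["time"] <= b["time"] and a["cost"] <= b["cost"]
                and a["quality"] >= b["quality"])
    better = (a["time"] < b["time"] or a["cost"] < b["cost"]
              or a["quality"] > b["quality"])
    return no_worse and better

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

random.seed(1)
candidates = [{"time": random.randint(80, 140),
               "cost": random.randint(900, 1500),
               "quality": round(random.uniform(0.6, 1.0), 2)} for _ in range(50)]

for s in sorted(pareto_front(candidates), key=lambda s: s["time"]):
    print(s)
```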

Keywords: FastPGA, Multi-Execution Activity Mode, Pareto Optimality, Project Scheduling, Time-Cost-Quality Trade-Off.

539 SolarSPELL Case Study: Pedagogical Quality Indicators to Evaluate Digital Library Resources

Authors: Lorena Alemán de la Garza, Marcela Georgina Gómez-Zermeño

Abstract:

This paper presents the SolarSPELL case study, which aims to generate information on the use of indicators that help evaluate the pedagogical quality of digital library resources. SolarSPELL is a solar-powered digital library with WiFi connectivity. It offers a variety of open educational resources selected for their potential for the digital transformation of educational practices and the achievement of the 2030 Agenda for Sustainable Development, adopted by all United Nations Member States. The case study employed a quantitative methodology, and the research instrument was applied to 55 teachers, directors and librarians. The results indicate that it is possible to strengthen the pedagogical quality of open educational resources through actions focused on improving temporal and technological parameters. They also reveal that users believe that SolarSPELL improves the teaching-learning processes and motivates teachers to improve their own development. This study provides valuable information on a tool that supports teaching-learning processes and facilitates connectivity, through renewable energy, that improves teacher training in active methodologies for ecosystem learning.

Keywords: Educational innovation, digital library, pedagogical quality, solar energy, teacher training, sustainable development.

538 Development of a Software Complex for the Digitalization of Enterprise Activities

Authors: G. T. Balakayeva, K. K. Nurlybayeva, M. B. Zhanuzakov

Abstract:

In the proposed work, we have developed software and designed a software architecture for the implementation of enterprise business processes. The proposed software has a multi-level architecture using a domain-specific tool. The developed architecture guarantees the availability, reliability and security of the system and of the implementation of the business processes, which are the basis for effective enterprise management. Automating business processes, automating the algorithmic stages of an enterprise, developing optimal algorithms for managing activities, controlling and monitoring, reducing risks and improving results help organizations achieve strategic goals quickly and efficiently. The software described in this article can connect to the corporate information system via two methods: a desktop client and a web client. By calling the application server, the desktop client program connects to the information system from the company's work PCs over a local network. Outside the organization, the user can interact with the information system via a web browser, which acts as a web client and connects to a web server. The developed software consists of several integrated modules that share resources and interact with each other through an API. The following technology stack was used during development: Node.js, React.js, MongoDB, Nginx, cloud technologies and Python.
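
The described stack is Node.js/React/MongoDB, but the client-server pattern itself is easy to illustrate in Python (also listed in the stack): one application-server module exposes a document-processing endpoint over HTTP, and both the desktop client and the web client consume the same API. The endpoint names, fields and in-memory store are hypothetical, not the system's actual interface.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
documents = {}                       # stand-in for the MongoDB collection

@app.post("/api/documents")
def create_document():
    doc = request.get_json()
    doc_id = len(documents) + 1
    documents[doc_id] = doc          # persist (illustrative in-memory store)
    return jsonify({"id": doc_id, "status": "registered"}), 201

@app.get("/api/documents/<int:doc_id>")
def read_document(doc_id):
    return jsonify(documents.get(doc_id, {}))

if __name__ == "__main__":
    # A desktop client on the LAN would call the same endpoints, e.g.:
    #   requests.post("http://app-server:5000/api/documents", json={...})
    # while the web client reaches them through the web server in the browser.
    app.run(port=5000)
```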

Keywords: Algorithms, document processing, automation, integrated modules, software architecture, software design, information system.

537 Numerical Simulations of Cross-Flow around Four Square Cylinders in an In-Line Rectangular Configuration

Authors: Shams Ul Islam, Chao Ying Zhou, Farooq Ahmad

Abstract:

A two-dimensional numerical simulation of the cross-flow around four cylinders in an in-line rectangular configuration is studied using the lattice Boltzmann method (LBM). Special attention is paid to the effect of the spacing between the cylinders. The Reynolds number is chosen to be Re = 100, and the spacing ratio L/D is set at 0.5, 1.5, 2.5, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0 and 10.0. The results show that, for four cylinders in an in-line rectangular configuration, the flow fields exhibit four different features depending on the spacing (single square cylinder, stable shielding flow, wiggling shielding flow and vortex shedding flow). The effects of the spacing ratio on physical quantities such as the mean drag coefficient, the Strouhal number and the root-mean-square values of the drag and lift coefficients are also presented. There is more than one shedding frequency at small spacing ratios. The mean drag coefficients of the downstream cylinders are less than that of a single cylinder for all spacing ratios. The present results obtained with the LBM are compared with existing experimental data and numerical studies. The comparison shows that the LBM can capture the characteristics of the bluff body flow reasonably well and is a good tool for bluff body flow studies.
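
A small post-processing sketch of the quantities reported: the mean drag coefficient, its RMS fluctuation, and the Strouhal number St = f·D/U taken from the dominant frequency of the lift-coefficient signal. The synthetic time history below merely stands in for an LBM force record.

```python
import numpy as np

D, U = 1.0, 1.0                       # cylinder size and inflow velocity (lattice units)
dt = 0.05
t = np.arange(0, 400, dt)
cl = 0.3 * np.sin(2 * np.pi * 0.16 * t) + 0.02 * np.random.randn(t.size)  # lift coeff.
cd = 1.5 + 0.05 * np.sin(2 * np.pi * 0.32 * t) + 0.01 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(cl - cl.mean()))
freqs = np.fft.rfftfreq(cl.size, d=dt)
f_shed = freqs[np.argmax(spectrum)]   # vortex shedding frequency

print(f"mean Cd = {cd.mean():.3f}, rms Cd' = {cd.std():.3f}, rms Cl' = {cl.std():.3f}")
print(f"Strouhal number St = {f_shed * D / U:.3f}")
```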

Keywords: Four square cylinders, Lattice Boltzmann method, rectangular configuration, spacing ratios, vortex shedding.

536 Characterization of Organic Matter in an Amazonian Spodosol by Fluorescence Spectroscopy

Authors: Amanda M. Tadini, Houssam Hajjoul, Gustavo Nicolodelli, Stéphane Mounier, Célia R. Montes, Débora M. B. P. Milori

Abstract:

Soil organic matter (SOM) plays an important role in maintaining soil productivity and accounts for the promotion of biological diversity. The main components of SOM are the humic substances, which can be fractionated according to their solubility into humic acids (HA), fulvic acids (FA) and humin (HU). The determination of the chemical properties of organic matter, as well as its interaction with metallic species, is an important tool for understanding the structure of the humic fractions. Fluorescence spectroscopy has been studied as a source of information about what is happening at the molecular level in these compounds. In particular, the soils of the Amazon region form an important ecosystem of the planet. The aim of this study is to understand the molecular and structural composition of HA samples from an Amazonian Spodosol using the fluorescence Emission-Excitation Matrix (EEM) and Time Resolved Fluorescence Spectroscopy (TRFS). The results showed that the HA samples contain two fluorescent components: one has a more complex structure and the other a simpler structure, which was also seen in TRFS through the evaluation of each sample's lifetime. Studies of this nature are thus important because they evaluate the molecular and structural characteristics of the humic fractions in a region considered one of the most important in the world, the Amazon.

Keywords: Amazonian soil, characterization, fluorescence, humic acid, lifetime.

535 A Metric-Set and Model Suggestion for Better Software Project Cost Estimation

Authors: Murat Ayyıldız, Oya Kalıpsız, Sırma Yavuz

Abstract:

Software project effort estimation is frequently seen as complex and expensive for individual software engineers. Software production is in a crisis: it suffers from excessive costs and is often out of control. It has been suggested that software production is out of control because we do not measure; you cannot control what you cannot measure. During the last decade, a number of studies on cost estimation have been conducted. Metric-set selection has a vital role in software cost estimation studies, yet its importance has been ignored, especially in neural network based studies. In this study, we have explored the reasons for those disappointing results and implemented different neural network models using an augmented set of new metrics. The results obtained are compared with previous studies that use traditional metrics. To be able to make comparisons, two types of data have been used: the first part of the data is taken from the Constructive Cost Model (COCOMO'81), which is commonly used in previous studies, and the second part was collected according to the new metrics in a leading international company in Turkey. The accuracy of the selected metrics and the data samples is verified using statistical techniques. The model presented here is based on the Multi-Layer Perceptron (MLP). Another difficulty associated with cost estimation studies is the fact that data collection requires time and care. To make more thorough use of the collected samples, the k-fold cross-validation method is also implemented. It is concluded that, as long as an accurate and quantifiable set of metrics is defined and measured correctly, neural networks can be applied in software cost estimation studies with success.
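
A compact sketch of the modeling step described: an MLP effort-estimation model evaluated with k-fold cross-validation. The feature matrix here is random placeholder data, not COCOMO'81 or the company dataset, and the layer sizes are arbitrary.

```python
import numpy as np
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(63, 10))                                        # 10 project metrics
y = 20 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(scale=2, size=63)    # effort (person-months)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8),
                                   max_iter=5000, random_state=0))
cv = KFold(n_splits=5, shuffle=True, random_state=0)
mae = -cross_val_score(model, X, y, cv=cv, scoring="neg_mean_absolute_error")
print("per-fold MAE:", mae.round(2), " mean MAE:", mae.mean().round(2))
```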

Keywords: Software Metrics, Software Cost Estimation, Neural Network.

534 A Quantitative Approach to Strategic Design of Component-Based Business Process Models

Authors: Eakong Atiptamvaree, Twittie Senivongse

Abstract:

A new paradigm for software design and development models software by its business process, translates the model into a process execution language, and has it run by a supporting execution engine. This process-oriented paradigm promotes the modeling of software by less technical users or business analysts, as well as rapid development. Since business process models may be shared by different organizations and sometimes even by different business domains, it is interesting to apply a technique used in traditional software component technology to design reusable business processes. This paper discusses an approach that applies a technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, with the aim that the process components can be reused in different process-based software models. The approach is quantitative because the quality of a process component design is measured from the technical features of the process components. The approach is also strategic because the measured quality is determined against business-oriented component management goals. A software tool has been developed to measure how good a process component design is according to the required managerial goals and in comparison with other designs. We also discuss how we benefit from reusable process components.
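
A schematic sketch of "quantitative and strategic" scoring: technical features of a process component design are measured and then weighted by business-oriented management goals, so that two designs can be compared. The metric names, values and weights are illustrative only, not the paper's measurement model.

```python
def design_score(metrics, goal_weights):
    """Weighted sum of normalized technical metrics (each scaled to 0..1)."""
    return sum(goal_weights[name] * value for name, value in metrics.items())

# Hypothetical management goals expressed as weights over technical features.
goal_weights = {"cohesion": 0.4, "coupling_inverse": 0.3, "granularity_fit": 0.3}

design_a = {"cohesion": 0.80, "coupling_inverse": 0.60, "granularity_fit": 0.70}
design_b = {"cohesion": 0.65, "coupling_inverse": 0.85, "granularity_fit": 0.75}

for name, metrics in [("A", design_a), ("B", design_b)]:
    print(f"process component design {name}: "
          f"score = {design_score(metrics, goal_weights):.2f}")
```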

Keywords: Business process model, process component, component management goals, measurement

533 Arabic Literature as a Tool for Educational Transformation in Nigeria

Authors: Abdulfatah A Raji

Abstract:

This paper starts with the definitions of literature, Arabic literature and transformation, and goes further to highlight the components of educational transformation. The general history of Arabic literature is discussed, with a focus on how it underwent transformations from the pre-Islamic period through the Quranic era and Abbasid literature to the renaissance period, in which the modernization of Arabic literature started in Egypt. The paper also traces the spread of Arabic literature in Nigeria from the pre-colonial era under the Kanuri rulers to the Jihad of Usman Dan Fodio, and the development of the literature that manifested in the Teachers' Colleges and Bayero University in Northern Nigeria, as well as the establishment of primary and post-primary schools by Muslim organizations in many cities and towns of the western part of Nigeria. Literary criticism is also discussed in relation to Arabic literature. Poetry works of eminent poets are cited to show their importance for educational transformation in Nigerian literature, and the lessons from the cited Arabic poetry works are highlighted; they include motivation to behave well and to tolerate others, a better spirit of interaction, and love and co-existence among people of different sexes, religions, etc. All these can help in developing a better educational transformation in Nigeria, which can in turn inform how research for national development is conducted. The paper recommends compulsory Arabic literature at all levels of the nation's educational system, as well as the publication of Arabic books and journals, to encourage peace in this era of conflicts and to further transform Nigeria's educational system for the better.

Keywords: Arabic, literature, peace, development, Nigeria

532 Hybrid Authentication System Using QR Code with OTP

Authors: Salim Istyaq

Abstract:

As we know, the number of Internet users is increasing drastically. People now use different online services provided by banks, colleges/schools, hospitals, online utility and bill payment services, and online shopping sites. To access online services, a text-based authentication system is in use. The text-based authentication scheme has drawbacks in usability and security that bring trouble to users. The core element of computational trust is identity. The aim of the paper is to make the system harder for impostors and more reliable for users by using a graphical authentication approach. In this paper, we use the more powerful tool of encoding the options in a graphical QR format, and an acknowledgment is sent to the user's mobile for final verification. The main methodology depends upon the encryption option and final verification by confirming a set of pass phrases for the legitimate users; the outcome is very powerful, as the result is given only once the process has been completed successfully. All processes are cross-linked serially: the output of the first process is the input of the second, and so on. The system is a combination of recognition-based and pure recall-based techniques. The presented scheme is useful for devices like PDAs, iPods, phones, etc., which are handier and more convenient to use than traditional desktop computer systems.
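
A minimal sketch of the OTP half of such a QR-plus-OTP flow, using the standard RFC 6238/4226 time-based one-time password construction with only the Python standard library. The QR step is assumed to be handled by any QR encoder (for example the third-party "qrcode" package) fed with the otpauth URI below; the secret and account label are placeholders.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Derive the current time-based one-time password from a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time() // period)                  # moving factor
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = "JBSWY3DPEHPK3PXP"                               # example base32 secret
uri = f"otpauth://totp/demo:user?secret={secret}&period=30&digits=6"
print("QR payload:", uri)          # this string is what gets encoded into the QR image
print("current OTP:", totp(secret))
```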

Keywords: Graphical Password, OTP, QR Codes, Recognition based graphical user authentication, usability and security.

531 Application of Generalized Stochastic Petri Nets (GSPN) in Modeling and Evaluating a Resource Sharing Flexible Manufacturing System

Authors: Aryanejad Mir Bahador Goli, Zahra Honarmand Shah Zileh

Abstract:

In most fields of study, a phenomenon cannot be studied directly but is examined indirectly through a model of the phenomenon. By making an accurate model of a system, new information can be obtained from the modeled phenomenon without cost, danger, etc. Many approaches have been developed for describing and analyzing today's complicated systems, but few of them analyze performance within the scope of the system description. Petri nets are among the few approaches that can make such a union. Petri nets are applied to problems related to modeling and designing systems. The theory of Petri nets allows a system to be modeled mathematically by a Petri net, and analyzing the Petri net can then determine key information about the modeled system's structure and dynamics. This information can be used for assessing the performance of the system and suggesting corrections to it. In this paper, besides an introduction to Petri nets, a real case study is presented in order to show the application of generalized stochastic Petri nets in modeling a resource-sharing production system and evaluating the efficiency of its machines and robots. The modeling tool used here is the SHARP software, which calculates specific indicators that help in making decisions.
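
A minimal sketch of the numerical step behind GSPN evaluation: once the tangible reachability graph is built, it defines a continuous-time Markov chain whose stationary distribution yields utilization-type indicators for the machines and robots. The 3-state generator matrix below is purely illustrative, not the case study's model.

```python
import numpy as np

# Generator matrix of the CTMC over tangible markings (rows sum to zero);
# off-diagonal entries are firing rates of the timed transitions.
Q = np.array([[-2.0,  2.0,  0.0],
              [ 1.0, -3.0,  2.0],
              [ 0.0,  4.0, -4.0]])

# Solve pi @ Q = 0 together with the normalization sum(pi) = 1.
A = np.vstack([Q.T, np.ones(Q.shape[0])])
b = np.append(np.zeros(Q.shape[0]), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("stationary probabilities of the markings:", pi.round(4))
# e.g. the utilization of a machine = sum of pi over the markings in which it is busy.
```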

Keywords: Flexible manufacturing system, generalized stochastic Petri nets, Markov chain, performance evaluation.

530 Spatial-Temporal Clustering Characteristics of Dengue in the Northern Region of Sri Lanka, 2010-2013

Authors: Sumiko Anno, Keiji Imaoka, Takeo Tadono, Tamotsu Igarashi, Subramaniam Sivaganesh, Selvam Kannathasan, Vaithehi Kumaran, Sinnathamby Noble Surendran

Abstract:

Dengue outbreaks are affected by biological, ecological, socio-economic and demographic factors that vary over time and space. These factors have been examined separately and still require systematic clarification. The present study aimed to investigate the spatial-temporal clustering relationships between these factors and dengue outbreaks in the northern region of Sri Lanka. Remote sensing (RS) data gathered from several satellites were used to develop an index comprising rainfall, humidity and temperature data. RS data gathered by ALOS/AVNIR-2 were used to detect urbanization, and a digital land cover map was used to extract land cover information. Other data on relevant factors and dengue outbreaks were collected through institutions and existing databases. The analyzed RS data and databases were integrated into geographic information systems, enabling temporal analysis, spatial statistical analysis and space-time clustering analysis. Our results show that an increase in the number of ecological, socio-economic and demographic factors that are above average, or present, in combination contributes to significantly high rates of space-time dengue clusters.

Keywords: ALOS/AVNIR-2, Dengue, Space-time clustering analysis, Sri Lanka.

529 Risk Management and Security Practice in Customs Supply Chain: Application of Cross ABC Method to the Moroccan Customs

Authors: Lamia Hammadi, Abdellah Ait Ouhman, Aomar Ibourk

Abstract:

It is widely assumed that the customs supply chain is a complex system, owing not only to the variety and large number of actors, but also to their complex structural links and the interactions between them; this is why the system is subject to various types of risks. The economic, political and social impacts of those risks are highly detrimental to countries, businesses and the public. For this reason, risk management in the customs supply chain is becoming a crucial issue for ensuring sustainability, security and safety. The main task of a customs risk management approach is determining which goods and means of transport should be examined, to what extent, and where future compliance resources should be directed. The purposes of this article are, firstly, to deal with the concept of the customs supply chain; secondly, to present our risk management approach based on the cross Activity Based Costing (ABC) method as an interactive tool to support decision making in customs risk management; and finally, to analyze a case study of the Moroccan customs, putting theory into practice and thus drawing together the various elements of a structured and efficient risk management approach.

Keywords: Cross ABC Method, Customs Supply Chain, Risk, Risk Management.

528 Malicious Vehicle Detection Using Monitoring Algorithm in Vehicular Adhoc Networks

Authors: S. Padmapriya

Abstract:

Vehicular Adhoc Networks (VANETs), a subset of Mobile Adhoc Networks (MANETs), refer to a set of smart vehicles used for road safety. These vehicles provide communication services among one another or with the Road Side Unit (RSU). Security is one of the most critical issues related to VANETs, as the information transmitted is distributed in an open-access environment. Since each vehicle is not the source of all messages, most of the communication depends on the information received from other vehicles. To protect a VANET from malicious action, each vehicle must be able to evaluate, decide and react locally on the information received from other vehicles. Message verification is therefore more challenging in VANETs because of the security and privacy concerns of the participating vehicles. To overcome security threats, we propose a Monitoring Algorithm that detects malicious nodes based on a pre-selected threshold value. The threshold value is compared with the distrust value that is inherently tagged to each vehicle. The proposed Monitoring Algorithm not only detects malicious vehicles but also isolates them from the network. The proposed technique is simulated using the Network Simulator 2 (NS2) tool. The simulation results illustrate that the proposed Monitoring Algorithm outperforms the existing algorithms in terms of malicious node detection, network delay, packet delivery ratio and throughput, thereby uplifting the overall performance of the network.
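
A simplified sketch of the detection rule described: each vehicle carries a distrust value, and the monitoring step flags and isolates any vehicle whose distrust exceeds the pre-selected threshold. The threshold and distrust values are illustrative; the actual evaluation in the paper was done in NS2.

```python
THRESHOLD = 0.6          # pre-selected distrust threshold (assumed value)

# Distrust values inherently tagged to each vehicle (illustrative).
vehicles = {"V1": 0.12, "V2": 0.75, "V3": 0.40, "V4": 0.91}

def monitor(distrust, threshold=THRESHOLD):
    """Return the sets of isolated (malicious) and retained (trusted) vehicles."""
    malicious = {v for v, d in distrust.items() if d > threshold}
    trusted = set(distrust) - malicious
    return malicious, trusted

malicious, trusted = monitor(vehicles)
print("isolated (malicious) vehicles:", sorted(malicious))
print("vehicles kept in the routing topology:", sorted(trusted))
```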

Keywords: VANET, security, malicious vehicle detection, threshold value, distrust value.

527 Further Development in Predicting Post-Earthquake Fire Ignition Hazard

Authors: Pegah Farshadmanesh, Jamshid Mohammadi, Mehdi Modares

Abstract:

In nearly all earthquakes of the past century that resulted in moderate to significant damage, the occurrence of post-earthquake fire ignition (PEFI) has imposed a serious hazard and caused severe damage, especially in urban areas. In order to reduce the loss of life and property caused by post-earthquake fires, there is a crucial need for predictive models to estimate the PEFI risk. The parameters affecting PEFI risk can be categorized as: 1) factors influencing fire ignition in normal (non-earthquake) conditions, including floor area, building category, ignitability, type of appliance, and prevention devices; and 2) earthquake-related factors contributing to the PEFI risk, including building vulnerability and earthquake characteristics such as intensity, peak ground acceleration, and peak ground velocity. State-of-the-art statistical PEFI risk models are based solely on the limited available earthquake data, and therefore they cannot predict the PEFI risk for areas with insufficient earthquake records, since such records are needed to estimate the PEFI model parameters. In this paper, the correlation between normal-condition ignition risk, peak ground acceleration, and PEFI risk is examined in an effort to offer a means for predicting post-earthquake ignition events. An illustrative example is presented to demonstrate how such a correlation can be employed in a seismic area to predict the PEFI hazard.
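
An illustrative sketch of the correlation being examined: post-earthquake ignition rates versus normal-condition ignition risk and peak ground acceleration (PGA). The synthetic data and the fitted multiplicative form are assumptions used only to show how such a predictive relation could be estimated where records exist; they are not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
normal_rate = rng.uniform(0.5, 3.0, 40)   # non-earthquake ignitions per year (synthetic)
pga = rng.uniform(0.1, 0.8, 40)           # peak ground acceleration, g (synthetic)
pefi = 0.9 * normal_rate * pga ** 1.3 * rng.lognormal(0, 0.2, 40)

print("corr(PEFI, normal-condition risk):", np.corrcoef(pefi, normal_rate)[0, 1].round(2))
print("corr(PEFI, PGA):                  ", np.corrcoef(pefi, pga)[0, 1].round(2))

# Fit log(PEFI) = log(a) + b1*log(normal_rate) + b2*log(PGA) by least squares.
A = np.column_stack([np.ones_like(pga), np.log(normal_rate), np.log(pga)])
coef, *_ = np.linalg.lstsq(A, np.log(pefi), rcond=None)
print("fitted model: PEFI ~ %.2f * rate^%.2f * PGA^%.2f"
      % (np.exp(coef[0]), coef[1], coef[2]))
```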

Keywords: Fire risk, post-earthquake fire ignition (PEFI), risk management, seismicity.

526 Using Field Indices of Rill and Gully for Erosion Estimation and Sediment Analysis (Case Study: Menderjan Watershed in Isfahan Province, Iran)

Authors: Masoud Nasri, Sadat Feiznia, Mohammad Jafari, Hasan Ahmadi

Abstract:

Today, incorrect land use and land use changes, excessive grazing, unsuitable use of agricultural farms, plowing on steep slopes, road construction, building construction, mine excavation, etc. have increased soil erosion and sediment yield. For erosion and sediment estimation one can use statistical and empirical methods, which require a land unit map and maps of the effective factors. However, these empirical methods are usually time consuming and do not give an accurate estimation of erosion. In this study, we applied GIS techniques to estimate the erosion and sediment of the Menderjan watershed, upstream of the Zayandehrud river in central Iran. Erosion facies in each land unit were defined on the basis of the land use, geology and land unit maps using GIS. The UTM coordinates of each erosion type that showed larger erosion amounts, such as rills and gullies, were inserted into GIS using GPS data. The frequency of the erosion indicators in each land unit and land use, and the sediment yield of these indices, were calculated. Also, using trend analysis of sediment yield changes at the watershed outlet (Menderjan hydrometric gauge station), the related parameters and estimation errors were calculated. The results of this study, in the context of implemented watershed management projects, can be used for more rapid and more accurate estimation of erosion than traditional methods. These results can also be used for regional erosion assessment and for remote sensing image processing.

Keywords: Erosion and sedimentation, Gully, Rill, GIS, GPS, Menderjan Watershed

525 Integration of Seismic and Seismological Data Interpretation for Subsurface Structure Identification

Authors: Iftikhar Ahmed Satti, Wan Ismail Wan Yusoff

Abstract:

The structural interpretation of a part of the eastern Potwar (Missa Keswal) has been carried out with available seismological, seismic and well data. The seismological data contain both the source parameters and the fault plane solution (FPS) parameters, and the seismic data contain ten seismic lines that were re-interpreted using well data. The structural interpretation depicts two broad types of fault sets, namely thrust and back-thrust faults. Together, these faults give rise to pop-up structures in the study area and are also responsible for many structural traps and for the seismicity. The seismic interpretation includes time and depth contour maps of the Chorgali Formation, while the seismological interpretation includes focal mechanism solutions (FMS), depth, frequency and magnitude bar graphs, and a renewal of the seismotectonic map. The focal mechanism solutions (FMS) that surround the study area are correlated with the different geological and structural maps of the area to determine the nature of the subsurface faults. The results of the structural interpretation from the seismic and seismological data show good correlation. It is hoped that the present work will help towards a better understanding of the variations in the subsurface structure and can be a useful tool for earthquake prediction, oil field planning and reservoir monitoring.

Keywords: Focal mechanism solution (FMS), Fault plane solution (FPS), Reservoir monitoring, earthquake prediction.

524 Statistical Modeling of Local Area Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes

Authors: Jihad S. Daba, J. P. Dubois

Abstract:

Fading noise degrades the performance of cellular communication, most notably in femto- and pico-cells in 3G and 4G systems. When the wireless channel consists of a small number of scattering paths, the statistics of the fading noise are not analytically tractable and pose a serious challenge to developing closed canonical forms that can be analysed and used in the design of efficient and optimal receivers. In this context, the noise is multiplicative and is referred to as stochastically local fading. In many analytical investigations of multiplicative noise, exponential or Gamma statistics are invoked. More recent advances by the author of this paper utilized Poisson-modulated weighted generalized Laguerre polynomials with controlling parameters and uncorrelated noise assumptions. In this paper, we investigate the statistics of a multidiversity, stochastically local area fading channel in which the channel consists of randomly distributed Rayleigh and Rician scattering centers with a coherent Nakagami-distributed line-of-sight component and an underlying doubly stochastic Poisson process driven by a lognormal intensity. These combined statistics form a unifying triply stochastic filtered marked Poisson point process model.
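
An illustrative Monte Carlo sketch of the model's ingredients: a doubly stochastic (Cox) Poisson number of scatterers driven by a lognormal intensity, diffuse scattered components, and a coherent Nakagami-distributed line-of-sight term. The parameter values are arbitrary; the paper's analytical characterization is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 20000

lam = rng.lognormal(mean=1.0, sigma=0.5, size=n_trials)   # lognormal intensity
N = rng.poisson(lam)                                      # doubly stochastic (Cox) counts

envelope = np.empty(n_trials)
for i, n in enumerate(N):
    # n complex Gaussian (Rayleigh-type) scattered paths
    scatter = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
    # Nakagami-m line-of-sight amplitude: sqrt of a Gamma(m, Omega/m) variate, m=2
    los = np.sqrt(rng.gamma(shape=2.0, scale=0.5))
    envelope[i] = np.abs(los + scatter.sum())

print("mean envelope:", envelope.mean().round(3),
      " envelope variance:", envelope.var().round(3))
# A histogram of `envelope` approximates the local-area fading statistics that
# the paper characterizes analytically.
```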

Keywords: Cellular communication, femto- and pico-cells, stochastically local area fading channel, triply stochastic filtered marked Poisson point process.

523 Instant Location Detection of Objects Moving at High Speed in C-OTDR Monitoring Systems

Authors: Andrey V. Timofeev

Abstract:

A practical, efficient approach is suggested for estimating the instantaneous bounds of high-speed objects in C-OTDR monitoring systems. In the case of super-dynamic objects (trains, cars), it is difficult to obtain an adequate estimate of the instantaneous object localization because of the estimation lag. In other words, reliable estimation of the coordinates of a monitored object requires some time for observation data to be collected by the C-OTDR system, and only when the required sample volume has been collected can the final decision be issued. But this is contrary to the requirements of many real applications; for example, in rail traffic management systems we need to obtain the localization data of dynamic objects in real time. The way to solve this problem is to use a set of statistically independent parameters of the C-OTDR signals to obtain the most reliable solution in real time. Parameters of this type can be called "signaling parameters" (SP). There are several SPs which carry information about the instantaneous localization of dynamic objects for each of the C-OTDR channels. The problem is that some of these parameters are very sensitive to the dynamics of seismoacoustic emission sources but are unstable, while, on the other hand, an SP that is very stable tends, as a rule, to be insensitive. This report describes a method for co-processing the SPs which is designed to obtain the most effective localization estimates of dynamic objects within the C-OTDR monitoring system framework.
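
The abstract does not give the co-processing rule itself; the sketch below shows one standard way to fuse several statistically independent signaling-parameter estimates, weighting each by the inverse of its variance so that sensitive but unstable SPs still contribute without dominating. The numbers are illustrative, not real C-OTDR data.

```python
import numpy as np

# Per-SP instantaneous position estimates (metres along the cable) and their
# empirical variances, for a single time step (illustrative values).
estimates = np.array([1250.0, 1238.0, 1261.0, 1244.0])
variances = np.array([  40.0,  120.0,   90.0,   25.0])

weights = 1.0 / variances                       # inverse-variance weighting
fused = np.sum(weights * estimates) / np.sum(weights)
fused_var = 1.0 / np.sum(weights)

print(f"fused localization: {fused:.1f} m  (std ~ {np.sqrt(fused_var):.1f} m)")
```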

Keywords: C-OTDR-system, co-processing of signaling parameters, high-speed objects localization, multichannel monitoring systems.
