Search results for: methods of Lagrange multipliers.

3710 New Security Approach of Confidential Resources in Hybrid Clouds

Authors: Haythem Yahyaoui, Samir Moalla, Mounir Bouden, Skander Ghorbel

Abstract:

Nowadays, cloud environments are becoming a necessity for companies: this new technology gives them the opportunity to access their data anywhere and at any time. It also provides optimized and secured access to resources and gives more security to the data stored in the platform. However, some companies do not trust cloud providers, because they think that providers can access and modify confidential data such as bank accounts. Many works have been done in this context; they conclude that the encryption methods implemented by providers ensure confidentiality, but they overlook the fact that cloud providers can decrypt the confidential resources. The best solution here is to apply some operations to the data before sending them to the cloud provider, with the objective of making them unreadable. The principal idea is to allow users to protect their data with their own methods. In this paper, we demonstrate our approach and show that it is more efficient, in terms of execution time, than some existing methods. This work aims at enhancing the quality of service of providers and ensuring the trust of the customers.
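
As a minimal illustration of client-side protection (not the authors' specific transformation), confidential records can be encrypted with a key that never leaves the customer, so the provider only ever stores ciphertext. The sketch below uses the Python cryptography package; the record content is hypothetical.

    # Minimal sketch: encrypt before upload so the provider stores only ciphertext.
    # Illustrative only, not the authors' specific method.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()                 # kept by the customer, never sent to the cloud
    cipher = Fernet(key)

    plaintext = b"IBAN: XX00 1234 5678 9012"    # hypothetical confidential record
    token = cipher.encrypt(plaintext)           # this token is what gets uploaded

    # ... later, after downloading the token back from the provider ...
    assert cipher.decrypt(token) == plaintext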

Keywords: Confidentiality, cryptography, security issues, trust issues.

3709 Study on Crater Detection Using FLDA

Authors: Yoshiaki Takeda, Norifumi Aoyama, Takahiro Tanaami, Syouhei Honda, Kenta Tabata, Hiroyuki Kamata

Abstract:

In this paper, we validate crater detection in lunar surface images using FLDA. This proposal assumes application to the SLIM (Smart Lander for Investigating Moon) project, which aims at pinpoint landing on the lunar surface. The point where the lander should land is judged from the positional relations of the craters obtained via camera, so real-time image processing becomes an important element. Moreover, a 400 kg-class lander is assumed in the SLIM project; therefore, it cannot be equipped with high-performance computers for image processing. We have been studying various crater detection methods such as Haar-like features, LBP, and PCA. We think these methods are appropriate for the project; however, their ability to identify unlearned images obtained in actual operation is insufficient. In this paper, we examine crater detection using FLDA and compare it with the conventional methods.
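
A minimal sketch of the FLDA step, assuming flattened grayscale patches and crater/non-crater labels are already available; the random arrays below merely stand in for real lunar imagery and are not the SLIM data.

    # Sketch: Fisher LDA as a crater / non-crater patch classifier.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.random((400, 16 * 16))           # 400 patches of 16x16 pixels, flattened (placeholder)
    y = rng.integers(0, 2, size=400)         # 1 = crater, 0 = background (placeholder labels)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    flda = LinearDiscriminantAnalysis()      # Fisher's linear discriminant
    flda.fit(X_tr, y_tr)
    print("held-out accuracy:", flda.score(X_te, y_te))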

Keywords: Crater Detection, Fisher Linear Discriminant Analysis, Haar-Like Feature, Image Processing.

3708 Evaluation of the Hepatitis C Virus and Classical and Modern Immunoassays Used Nowadays to Diagnose It in Tirana

Authors: Stela Papa, Klementina Puto, Migena Pllaha

Abstract:

HCV is a hepatotropic RNA virus, transmitted primarily via the blood route, which causes progressive diseases such as chronic hepatitis, liver cirrhosis, or hepatocellular carcinoma. HCV is nowadays a global healthcare problem. A variety of immunoassays, including old and new technologies, are being applied to detect HCV in our country. These methods include immunochromatography assays (ICA), fluorescence immunoassay (FIA), enzyme linked fluorescent assay (ELFA), and enzyme linked immunosorbent assay (ELISA) to detect HCV antibodies in blood serum; these are lately being slowly replaced by more sensitive methods such as the rapid automated analyzer chemiluminescence immunoassay (CLIA). The aim of this study is to estimate HCV infection in carriers and in acute and chronic patients and to evaluate the use of new diagnostic methods. This study was carried out from September 2016 to May 2018. During this study period, 2913 patients were analyzed for the presence of HCV by taking samples from their blood serum. The immunoassays performed were ICA, FIA, ELFA, ELISA, and CLIA assays. In conclusion, 82% of the patients included in this study were found to be infected with HCV. Diagnostic methods in clinical laboratories are crucial in the early stages of infection, in the management of chronic hepatitis, and in the treatment of patients during their disease.

Keywords: CLIA, ELISA, hepatitis C virus, immunoassay.

3707 Comparison of FAHP and TOPSIS for Evacuation Capability Assessment of High-rise Buildings

Authors: Peng Mei, Yan-Jun Qi, Yu Cui, Song Lu, He-Ping Zhang

Abstract:

A lot of computer-based methods have been developed to assess the evacuation capability (EC) of high-rise buildings. Because software simulations are time-consuming and not suitable for on-scene applications, we adopted two methods, the fuzzy analytic hierarchy process (FAHP) and the technique for order preference by similarity to an ideal solution (TOPSIS), for EC assessment of a high-rise building in Jinan. The EC scores obtained with the two methods and the evacuation time acquired with Pathfinder 2009 for floors 47-60 of the building were compared with each other. The results show that FAHP performs better than TOPSIS for EC assessment of high-rise buildings, especially in dealing with the effect of occupant type and distance to exit on EC, tackling complex problems with a multi-level structure of criteria, and requiring a smaller amount of computation. However, both FAHP and TOPSIS failed to appropriately handle the situation where the exit width changes while occupants are few.
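
For readers unfamiliar with TOPSIS, the following sketch shows the ranking step on a made-up decision matrix and weights; the criteria, weights, and values of the Jinan case study are not reproduced here.

    # Sketch of the TOPSIS ranking step with placeholder data.
    import numpy as np

    X = np.array([[0.8, 120.0, 3.0],         # each row: one floor / alternative
                  [0.6,  90.0, 4.0],
                  [0.9, 150.0, 2.0]])
    w = np.array([0.5, 0.3, 0.2])            # criterion weights (e.g. from FAHP)
    benefit = np.array([True, False, True])  # False = cost criterion (smaller is better)

    R = X / np.linalg.norm(X, axis=0)        # vector-normalize each criterion
    V = R * w                                # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)      # higher = closer to the ideal solution
    print(closeness)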

Keywords: Evacuation capability assessment, FAHP, high-rise buildings, TOPSIS.

3706 Modelling of Soil Erosion by Non Conventional Methods

Authors: Ganesh D. Kale, Sheela N. Vadsola

Abstract:

Soil erosion is the most serious problem faced at the global and local levels, so the planning of soil conservation measures has become a prominent item on the agenda of water basin managers. To plan soil conservation measures, information on soil erosion is essential. The Universal Soil Loss Equation (USLE), the Revised Universal Soil Loss Equation (RUSLE1 or RUSLE), the Modified Universal Soil Loss Equation (MUSLE), RUSLE 1.06, RUSLE 1.06c, and RUSLE2 are the most widely used conventional erosion estimation methods. The essential drawback of the USLE and RUSLE1 equations is that they are based on average annual values of their parameters, so their applicability at small temporal scales is questionable. Also, these equations do not estimate runoff-generated soil erosion, so their applicability to estimating runoff-generated soil erosion is questionable. The data used in the formulation of the USLE and RUSLE1 equations were plot data, so their applicability at greater spatial scales requires scale correction factors. On the other hand, MUSLE is unsuitable for predicting the sediment yield of small and large events. Although the newer revised forms of the USLE, such as RUSLE 1.06, RUSLE 1.06c, and RUSLE2, are land-use independent and have cleared almost all the drawbacks of the earlier versions like USLE and RUSLE1, they are based on regional data from specific areas, and their applicability to other areas with different climate, soil, and land use is questionable. These conventional equations are applicable to sheet and rill erosion and are unable to predict gully erosion or the spatial pattern of rills. Research has therefore focused on the development of non-conventional (other than conventional) methods of soil erosion estimation; when these non-conventional methods are combined with GIS and RS, they give the spatial distribution of soil erosion. In the present paper, a review of the literature on non-conventional methods of soil erosion estimation supported by GIS and RS is presented.
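
For reference, the conventional USLE/RUSLE form discussed above is a simple product of empirical factors; the sketch below uses illustrative factor values only.

    # Sketch of the conventional USLE/RUSLE form: A = R * K * LS * C * P.
    def usle_soil_loss(R, K, LS, C, P):
        """Average annual soil loss as a product of empirical factors (e.g. t/ha/yr)."""
        return R * K * LS * C * P

    # hypothetical factor values for one grid cell
    A = usle_soil_loss(R=350.0, K=0.32, LS=1.8, C=0.25, P=0.9)
    print(f"estimated annual soil loss: {A:.1f} t/ha/yr")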

Keywords: Conventional methods, GIS, non-conventional methods, remote sensing, soil erosion modeling.

3705 An Empirical Mode Decomposition Based Method for Action Potential Detection in Neural Raw Data

Authors: Sajjad Farashi, Mohammadjavad Abolhassani, Mostafa Taghavi Kani

Abstract:

Information in the nervous system is coded as firing patterns of electrical signals called action potentials or spikes, so an essential step in the analysis of neural mechanisms is the detection of the action potentials embedded in the neural data. Several methods have been proposed in the literature for this purpose. In this paper, a novel method based on empirical mode decomposition (EMD) has been developed. EMD is a decomposition method that extracts oscillations with different frequency ranges from a waveform. The method is adaptive, and no a priori knowledge about the data or parameter tuning is needed. The results for simulated data indicate that the proposed method is comparable with wavelet-based methods for spike detection. For neural signals with a signal-to-noise ratio near 3, the proposed method is capable of detecting more than 95% of action potentials accurately.
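
A minimal sketch of EMD-based preprocessing for spike detection, assuming the PyEMD package is available; the synthetic trace and thresholding rule are illustrative stand-ins, not the authors' exact detector.

    # Sketch: decompose a trace with EMD, keep the fastest modes, threshold them.
    import numpy as np
    from PyEMD import EMD

    fs = 24000                                    # sampling rate (Hz), assumed
    t = np.arange(0, 1.0, 1.0 / fs)
    signal = 0.3 * np.random.randn(t.size)        # background noise
    signal[::2400] += 2.0                         # crude stand-in for spikes

    imfs = EMD().emd(signal)                      # intrinsic mode functions, fastest first
    detail = imfs[:3].sum(axis=0)                 # high-frequency modes where spikes live
    threshold = 4.0 * np.median(np.abs(detail)) / 0.6745   # robust noise estimate
    spike_idx = np.flatnonzero(detail > threshold)
    print("candidate spike samples:", spike_idx[:10])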

Keywords: EMD, neural data processing, spike detection, wavelet decomposition.

3704 Central Finite Volume Methods Applied in Relativistic Magnetohydrodynamics: Applications in Disks and Jets

Authors: Raphael de Oliveira Garcia, Samuel Rocha de Oliveira

Abstract:

We have developed a new computer program in Fortran 90 to obtain numerical solutions of a system of relativistic magnetohydrodynamics partial differential equations with predetermined gravitation (GRMHD), capable of simulating the formation of relativistic jets from the accretion disk of matter up to its ejection. Initially, we carried out a study of one-dimensional finite volume numerical methods, namely the Lax-Friedrichs, Lax-Wendroff, and Nessyahu-Tadmor methods and Godunov-type methods that depend on Riemann problem solvers, applied to the Euler equations, in order to verify their main features and make comparisons among them. We then implemented the central finite volume method of Nessyahu-Tadmor, a numerical scheme whose formulation is free of Riemann problem solvers and of dimensional splitting, even in two or more spatial dimensions, and at this point applied it to the GRMHD equations. Finally, with the Nessyahu-Tadmor method it was possible to obtain stable numerical solutions - without spurious oscillations or excessive dissipation - of a magnetized accretion disk rotating around a central Schwarzschild black hole (BH) and immersed in a magnetosphere, with ejection of matter in the form of a jet over a distance of fourteen times the radius of the BH, a record in terms of astrophysical simulations of this kind. In our simulations, we also obtained jet substructures. A great advantage is that, with our code, we were able to simulate the GRMHD equations on a simple personal computer.
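
As a small illustration of the one-dimensional study stage, the sketch below applies the Lax-Friedrichs scheme to linear advection rather than to the full GRMHD system; grid size and CFL number are arbitrary choices, and the code is not the authors' Fortran 90 program.

    # Lax-Friedrichs finite volume sketch for u_t + a u_x = 0 on a periodic grid.
    import numpy as np

    a, nx, L = 1.0, 200, 1.0
    dx = L / nx
    dt = 0.5 * dx / abs(a)                        # CFL number 0.5
    x = np.linspace(0.0, L, nx, endpoint=False)
    u = np.exp(-200.0 * (x - 0.3) ** 2)           # initial Gaussian pulse

    def flux(u):
        return a * u

    for _ in range(int(0.4 / dt)):                # advance to t ~ 0.4
        up = np.roll(u, -1)                       # u_{i+1} (periodic boundaries)
        um = np.roll(u, 1)                        # u_{i-1}
        u = 0.5 * (up + um) - dt / (2.0 * dx) * (flux(up) - flux(um))

    print("pulse peak after advection:", u.max())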

Keywords: Finite Volume Methods, Central Schemes, Fortran 90, Relativistic Astrophysics, Jet.

3703 Simulation of Sloshing behavior using Moving Grid and Body Force Methods

Authors: Tadashi Watanabe

Abstract:

The flow field and the motion of the free surface in an oscillating container are simulated numerically to assess the numerical approach for studying two-phase flows under oscillating conditions. Two numerical methods are compared: one is to model the oscillating container directly using the moving grid of the ALE method, and the other is to simulate the effect of container motion using an oscillating body force acting on the fluid in a stationary container. The two-phase flow field in the container is simulated using the level set method in both cases. It is found that the results calculated by the body force method coincide with those obtained by the moving grid method, and the sloshing behavior is predicted well by both methods. The theoretical background and limitations of the body force method are discussed, and the effects of oscillation amplitude and frequency are shown.

Keywords: Two-phase flow, simulation, oscillation, moving grid, body force

3702 Development of Elementary Literacy in the Czech Republic

Authors: Iva Košek Bartošová

Abstract:

Great attention is being paid to the development of first reading, and thus of early literacy skills, in the Czech Republic. Yet the inconclusive results of PISA and PIRLS force us to reconsider the teacher's work, his/her roles in the education process, and the methods and forms used in lessons. It is also very important to monitor the family environment and the pupils themselves. The aim of this publication is to focus on methods of practicing reading technique and their results in the process of comprehension. The first part of the contribution presents the goals of reading literacy development and the methods used in reading practice in some EU countries, followed by a comparison of research carried out in 2015 with the help of modern eye tracker technology and research conducted at the Institute of Education and Psychological Counselling of the Czech Republic in 2011/12. These are the results of a diagnostic test of reading in first classes of primary schools taught by the genetic method and by the analytic-synthetic method. The results show that in the first stage of practice there are no statistically significant differences between the researched subjects taught by the different methods of reading practice (with the use of several diagnostic texts focused on reading technique and its comprehension). Different results appear at the end of Grade One and during Grade Two of primary school.

Keywords: Elementary literacy, eye tracker device, diagnostic reading tests, reading teaching method.

3701 Research on Software Security Testing

Authors: Gu Tian-yang, Shi Yin-sheng, Fang You-yuan

Abstract:

Software security testing is an important means of ensuring software security and trustworthiness. This paper first discusses the definition and classification of software security testing and broadly surveys its methods and tools. It then analyzes the advantages and disadvantages of the various methods and their scope of application, and presents a taxonomy of security testing tools. Finally, the paper points out the future focus and development directions of software security testing technology.

Keywords: security testing, security functional testing, security vulnerability testing, testing method, testing tool

3700 The Contribution of Edgeworth, Bootstrap and Monte Carlo Methods in Financial Data

Authors: Edlira Donefski, Tina Donefski, Lorenc Ekonomi

Abstract:

Edgeworth approximation, bootstrap, and Monte Carlo simulations have a considerable impact on achieving certain results related to the different problems under study. In our paper, we treat a financial case related to the effect that the components of the cash flow of one of the most successful businesses in the world (the financial, operational, and investing activities) have on the cash and cash equivalents at the end of the three-month period. To have a better view of this case, we created a vector autoregression model, and after that we generated the impulse responses in terms of asymptotic analysis (Edgeworth approximation), Monte Carlo simulations, and residual bootstrap based on the standard errors of every series created. The generated results show common tendencies for the three methods applied, which consequently verifies the advantage of the three methods in the optimization of the model that contains many variants.
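
A minimal sketch of the VAR and impulse-response step, assuming the statsmodels package; the simulated series below stand in for the cash-flow components and are not the company's data, and the bootstrap bands are only outlined in a comment.

    # Sketch: fit a VAR to cash-flow-like series and compute impulse responses.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(1)
    n = 80                                            # quarterly observations (placeholder)
    data = pd.DataFrame({
        "operating": rng.normal(size=n).cumsum(),
        "investing": rng.normal(size=n).cumsum(),
        "financing": rng.normal(size=n).cumsum(),
    }).diff().dropna()                                # work with changes, not levels

    results = VAR(data).fit(maxlags=4, ic="aic")      # lag order chosen by AIC
    irf = results.irf(8)                              # impulse responses, 8 periods ahead
    print(irf.irfs.shape)                             # (periods + 1, n_vars, n_vars)
    # Residual-bootstrap or Monte Carlo bands could be added by resampling the
    # residuals, refitting the VAR, and recomputing the responses each time.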

Keywords: Autoregression, Bootstrap, Edgeworth Expansion, Monte Carlo Method.

3699 Scatterer Density in Edge and Coherence Enhancing Nonlinear Anisotropic Diffusion for Medical Ultrasound Speckle Reduction

Authors: Ahmed Badawi, J. Michael Johnson, Mohamed Mahfouz

Abstract:

This paper proposes new enhancements to nonlinear anisotropic diffusion methods to greatly reduce speckle and preserve image features in medical ultrasound images. By incorporating a local physical characteristic of the image, in this case scatterer density, in addition to the gradient, into existing tensor-based image diffusion methods, we were able to greatly improve the performance of the existing filtering methods, namely edge enhancing (EE) and coherence enhancing (CE) diffusion. The new enhancement methods were tested using various ultrasound images, including phantom and some clinical images, to determine the amount of speckle reduction, edge enhancement, and coherence enhancement. Scatterer density weighted nonlinear anisotropic diffusion (SDWNAD) for ultrasound images consistently outperformed its traditional tensor-based counterparts that use the gradient alone to weight the diffusivity function. SDWNAD is shown to greatly reduce speckle noise while preserving image features such as edges, orientation coherence, and scatterer density. SDWNAD's superior performance over nonlinear coherent diffusion (NCD), speckle reducing anisotropic diffusion (SRAD), adaptive weighted median filtering (AWMF), wavelet shrinkage (WS), and wavelet shrinkage with contrast enhancement (WSCE) makes it an ideal preprocessing step for automatic segmentation in ultrasound imaging.
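
The following is a scalar Perona-Malik sketch of gradient-weighted diffusion, not the paper's tensor-based EE/CE filters or the scatterer-density weighting, but it illustrates the underlying idea of smoothing less where gradients are strong.

    # Scalar gradient-weighted (Perona-Malik style) diffusion on a toy speckled image.
    import numpy as np

    def anisotropic_diffusion(img, n_iter=30, kappa=1.0, step=0.2):
        u = img.astype(float).copy()
        g = lambda d: np.exp(-(d / kappa) ** 2)   # diffusivity shrinks across strong edges
        for _ in range(n_iter):
            dn = np.roll(u, -1, axis=0) - u       # finite-difference neighbours
            ds = np.roll(u, 1, axis=0) - u
            de = np.roll(u, -1, axis=1) - u
            dw = np.roll(u, 1, axis=1) - u
            u += step * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
        return u

    speckled = np.random.gamma(2.0, 0.5, size=(64, 64))   # toy speckle-like image
    smoothed = anisotropic_diffusion(speckled)
    print("std before/after:", speckled.std(), smoothed.std())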

Keywords: Nonlinear anisotropic diffusion, ultrasound imaging, speckle reduction, scatterer density estimation, edge based enhancement, coherence enhancement.

3698 The Effectiveness of Implementing Interactive Training for Teaching Kazakh Language

Authors: Samal Abzhanova, Saule Mussabekova

Abstract:

Today, a new system of education is being created in Kazakhstan in order to develop education and meet world-class standards. For this purpose, new requirements and responsibilities have been established for instructors. Teaching should not be limited to providing only theoretical knowledge: students should also be encouraged to be competitive and to think creatively and critically. Moreover, students should be able to put these skills into practice. These issues can be resolved through the permanent improvement of teaching methods. Therefore, a specialist who teaches languages should use up-to-date methods and introduce new technologies. The result of the investigation suggests that the interactive teaching method is one of the new technologies in this field. This paper aims to provide information about implementing new technologies in the process of language teaching. The paper discusses the necessity of introducing innovative technologies and the techniques of organizing interactive lessons. At the same time, the structure of the interactive lesson, conditions, principles, discussions, small-group work, and role-playing games are considered. Interactive methods are carried out with the help of several types of activities, such as working in a team (in groups of two or more people), playing situational or role-playing games, working with different sources of information, discussions, presentations, creative work, learning through solving situational tasks, etc.

Keywords: Games, interactive learning, Kazakh language, teaching methods.

3697 Dynamic Construction Site Layout Using Ant Colony Optimization

Authors: Y. Abdelrazig

Abstract:

In the construction industry, site layout is a very important planning problem. The objective of site layout is to position temporary facilities both geographically and at the correct time, such that the construction work can be performed satisfactorily with minimal cost and an improved safety and working environment. During the last decade, evolutionary optimization methods such as genetic algorithms have been used extensively for the construction site layout problem. More recently, ant colony optimization algorithms, which are evolutionary methods based on the foraging behavior of ants, have been successfully applied to benchmark combinatorial optimization problems. This paper proposes a formulation of the site layout problem as a sequencing problem that is suitable for solution using an ant colony optimization algorithm. A simple case study of a highway project is used to illustrate the application of the model.
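
A bare-bones ant colony sketch for a sequencing formulation is given below; the cost matrix is a random placeholder, and the construction-specific costs, constraints, and dynamics of the proposed model are omitted.

    # Bare-bones ACO for a sequencing problem: ants build a facility ordering biased
    # by pheromone, and cheap orderings reinforce the trails they used.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 8                                          # number of temporary facilities
    cost = rng.uniform(1, 10, size=(n, n))         # cost of placing j right after i (placeholder)

    tau = np.ones((n, n))                          # pheromone trails
    alpha, rho, n_ants, n_iters = 1.0, 0.1, 20, 50
    best_seq, best_cost = None, np.inf

    for _ in range(n_iters):
        for _ in range(n_ants):
            seq = [rng.integers(n)]
            while len(seq) < n:
                last = seq[-1]
                remaining = [j for j in range(n) if j not in seq]
                weights = np.array([tau[last, j] ** alpha / cost[last, j] for j in remaining])
                seq.append(rng.choice(remaining, p=weights / weights.sum()))
            c = sum(cost[seq[i], seq[i + 1]] for i in range(n - 1))
            if c < best_cost:
                best_seq, best_cost = seq, c
        tau *= (1.0 - rho)                         # evaporation
        for i in range(n - 1):
            tau[best_seq[i], best_seq[i + 1]] += 1.0 / best_cost   # reinforce best ordering

    print(best_seq, round(best_cost, 2))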

Keywords: Construction site layout, optimization, ant colony.

3696 Influence of Differences of Heat Insulation Methods on Thermal Comfort of Apartment Buildings

Authors: Hikaru Sato, Hiroatsu Fukuda, Yupeng Wang

Abstract:

The aim of this study is to analyze the influence of different heat insulation methods on the indoor thermal environment and comfort of apartment buildings. This study analyzes the indoor thermal environment and comfort of apartment building units using the calculation software THERB and compares three different heat insulation methods: outside insulation on outside walls, inside insulation on outside walls, and interior insulation. In terms of the indoor thermal environment, outside insulation is the best at stabilizing room temperature. In winter, the room temperature with outside insulation is higher than with the others after heating and is kept 3-5 degrees higher throughout the night. However, the surface temperature with outside insulation did not increase dramatically when heating was used, being 3 to 5 °C lower than the temperature with the other insulation methods. The PMV with interior insulation falls near the comfort range when heating and cooling are used.

Keywords: Apartment Building, Indoor Thermal Environment, Insulation, PMV

3695 How to Build and Evaluate a Solution Method: An Illustration for the Vehicle Routing Problem

Authors: Nicolas Zufferey

Abstract:

The vehicle routing problem (VRP) is a famous combinatorial optimization problem. Because of its well-known difficulty, metaheuristics are the most appropriate methods to tackle large and realistic instances. The goal of this paper is to highlight the key ideas for designing VRP metaheuristics according to the following criteria: efficiency, speed, robustness, and ability to take advantage of the problem structure. Such elements can obviously be used to build solution methods for other combinatorial optimization problems, at least in the deterministic field.
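
As a small illustration of the kind of starting solution a VRP metaheuristic would then improve, the sketch below builds routes with a capacity-constrained nearest-neighbour rule on made-up customers; it is not one of the metaheuristics discussed in the paper.

    # Toy construction heuristic for the capacitated VRP: serve the nearest feasible
    # customer and open a new route when capacity runs out.
    import math, random

    random.seed(0)
    depot = (0.0, 0.0)
    customers = {i: (random.uniform(-10, 10), random.uniform(-10, 10)) for i in range(1, 11)}
    demand = {i: random.randint(1, 5) for i in customers}
    capacity = 12

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    routes, unserved = [], set(customers)
    while unserved:
        load, pos, route = 0, depot, []
        while True:
            feasible = [c for c in unserved if load + demand[c] <= capacity]
            if not feasible:
                break
            nxt = min(feasible, key=lambda c: dist(pos, customers[c]))
            route.append(nxt)
            load += demand[nxt]
            pos = customers[nxt]
            unserved.remove(nxt)
        routes.append(route)

    print(routes)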

Keywords: Vehicle routing problem, Metaheuristics, Combinatorial optimization.

3694 A Comparative Study on ANN, ANFIS and SVM Methods for Computing Resonant Frequency of A-Shaped Compact Microstrip Antennas

Authors: Ahmet Kayabasi, Ali Akdagli

Abstract:

In this study, three robust prediction methods, namely the artificial neural network (ANN), the adaptive neuro-fuzzy inference system (ANFIS), and the support vector machine (SVM), were used for computing the resonant frequency of A-shaped compact microstrip antennas (ACMAs) operating in the UHF band. First, the resonant frequencies of 144 ACMAs with various dimensions and electrical parameters were simulated with the help of IE3D™, which is based on the method of moments (MoM). The ANN, ANFIS, and SVM models for computing the resonant frequency were then built from the simulation data. 124 simulated ACMAs were utilized for training, and the remaining 20 ACMAs were used for testing the ANN, ANFIS, and SVM models. The performances of the ANN, ANFIS, and SVM models were compared for the training and test processes. The average percentage errors (APE) of the computed resonant frequencies in training the ANN, ANFIS, and SVM were 0.457%, 0.399%, and 0.600%, respectively. The constructed models were then tested, and APE values of 0.601% for ANN, 0.744% for ANFIS, and 0.623% for SVM were achieved. The results obtained here show that the ANN, ANFIS, and SVM methods can be successfully applied to compute the resonant frequency of ACMAs, since they are useful and versatile methods that yield accurate results.
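
A minimal sketch of the model-comparison step with scikit-learn stand-ins for the ANN and SVM regressors (ANFIS has no standard scikit-learn equivalent); the synthetic data below are placeholders, not the 144 simulated ACMAs.

    # Compare an MLP (ANN stand-in) and an SVR on the average percentage error metric.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.svm import SVR
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(144, 5))                    # geometric/electrical parameters (placeholder)
    y = 1.0 + 2.5 * X[:, 0] - 0.8 * X[:, 1] + 0.05 * rng.normal(size=144)   # "resonant frequency"

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=124, random_state=0)

    def ape(y_true, y_pred):                          # average percentage error
        return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

    for name, model in [("ANN", MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)),
                        ("SVM", SVR(kernel="rbf", C=10.0))]:
        model.fit(X_tr, y_tr)
        print(name, "test APE: %.3f%%" % ape(y_te, model.predict(X_te)))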

Keywords: A-shaped compact microstrip antenna, Artificial Neural Network (ANN), adaptive Neuro-Fuzzy Inference System (ANFIS), Support Vector Machine (SVM).

3693 Measurement and Estimation of Evaporation from Water Surfaces: Application to Dams in Arid and Semi Arid Areas in Algeria

Authors: Malika Fekih, Mohamed Saighi

Abstract:

Many methods exist for either measuring or estimating evaporation from free water surfaces. Evaporation pans provide one of the simplest, least expensive, and most widely used methods of estimating evaporative losses. In this study, the rate of evaporation from a water surface was calculated by modeling, with application to dams in wet, arid, and semi-arid areas in Algeria. We calculate the evaporation rate from the pan using the energy budget equation, which offers the advantage of ease of use, but our results do not agree completely with the measurements carried out by the national agency on dams located in areas of different climates. For that reason, we developed a mathematical model to simulate evaporation. This simulation uses an energy budget at the level of a measurement vat and Computational Fluid Dynamics (Fluent). The evaporation rates calculated by the two methods are then compared with each other and with the in situ measurements.

Keywords: Evaporation, Energy budget, Surface water temperature, CFD, Dams

3692 Fuzzy Time Series Forecasting Using Percentage Change as the Universe of Discourse

Authors: Meredith Stevenson, John E. Porter

Abstract:

Since the pioneering work of Zadeh, fuzzy set theory has been applied to a myriad of areas. Song and Chissom introduced the concept of fuzzy time series and applied some methods to the enrollments of the University of Alabama. In recent years, a number of techniques have been proposed for forecasting based on fuzzy set theory methods. These methods have used either enrollment numbers or differences of enrollments as the universe of discourse. We propose using the year-to-year percentage change as the universe of discourse. In this communication, the approach of Jilani, Burney, and Ardil is modified by using the year-to-year percentage change as the universe of discourse. We use enrollment figures for the University of Alabama to illustrate our proposed method. The proposed method results in better forecasting accuracy than existing models.
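
A minimal sketch of the proposed universe of discourse, using made-up enrollment figures rather than the Alabama series: compute year-to-year percentage changes, partition them into intervals, and assign fuzzy labels.

    # Build a percentage-change universe of discourse and fuzzify each observation.
    import numpy as np

    enroll = np.array([15000, 15400, 15900, 15700, 16300, 16900, 17100, 16800])  # placeholders
    pct_change = 100.0 * np.diff(enroll) / enroll[:-1]      # universe of discourse

    lo, hi = pct_change.min() - 0.5, pct_change.max() + 0.5  # pad the universe slightly
    n_intervals = 5
    edges = np.linspace(lo, hi, n_intervals + 1)
    fuzzy_label = np.digitize(pct_change, edges[1:-1])       # interval index A_0 .. A_4

    for year, (pc, lab) in enumerate(zip(pct_change, fuzzy_label), start=1):
        print(f"year {year}: {pc:+.2f}%  ->  A{lab}")
    # Fuzzy logical relationships A_i -> A_j would then be built from consecutive
    # labels and used to forecast the next percentage change.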

Keywords: Fuzzy forecasting, fuzzy time series, fuzzified enrollments, time-invariant model

3691 An Approach to Correlate the Statistical-Based Lorenz Method, as a Way of Measuring Heterogeneity, with Kozeny-Carman Equation

Authors: H. Khanfari, M. Johari Fard

Abstract:

Dealing with carbonate reservoirs can be mind-boggling for reservoir engineers due to the various diagenetic processes that cause a variety of properties throughout the reservoir. A good estimation of reservoir heterogeneity, which is defined as the variation in rock properties with location in a reservoir or formation, can help in modeling the reservoir better and thus offer a better understanding of the behavior of that reservoir. Most reservoirs are heterogeneous formations whose mineralogy, organic content, natural fractures, and other properties vary from place to place. Over the years, reservoir engineers have tried to establish methods to describe this heterogeneity, because heterogeneity is important in modeling reservoir flow and in well testing. Geological methods are used to describe the variations in rock properties on the basis of the similarities of the environments in which different beds were deposited. To illustrate the heterogeneity of a reservoir vertically, two methods are generally used in petroleum work: the Dykstra-Parsons permeability variation (V) and the Lorenz coefficient (L), which are reviewed briefly in this paper. The Lorenz concept is based on statistics and has been used in petroleum from that point of view. In this paper, we correlate the statistics-based Lorenz method with a petroleum concept, i.e., the Kozeny-Carman equation, and derive the straight-line Lorenz plot for a homogeneous system. Finally, we apply the two methods to a heterogeneous field in southern Iran and discuss each separately, with numbers and figures. As expected, these methods show great departure from homogeneity. Therefore, for future investment, the reservoir needs to be treated carefully.
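
A minimal sketch of the statistics-based Lorenz coefficient from layered core data (illustrative values only, not the Iranian field data): order the layers by k/phi, accumulate flow versus storage capacity, and measure the departure of the curve from the homogeneous 45-degree line.

    # Lorenz coefficient from layer permeability/porosity data.
    import numpy as np

    k   = np.array([250.0, 90.0, 40.0, 12.0, 5.0])    # layer permeability (mD), illustrative
    phi = np.array([0.22, 0.20, 0.18, 0.15, 0.12])    # layer porosity, illustrative
    h   = np.ones_like(k)                             # equal layer thickness assumed

    order = np.argsort(-(k / phi))                    # most conductive layers first
    flow    = np.cumsum((k * h)[order]) / np.sum(k * h)       # cumulative flow capacity
    storage = np.cumsum((phi * h)[order]) / np.sum(phi * h)   # cumulative storage capacity
    flow    = np.insert(flow, 0, 0.0)
    storage = np.insert(storage, 0, 0.0)

    area = np.sum(0.5 * (flow[1:] + flow[:-1]) * np.diff(storage))  # area under the Lorenz curve
    L = (area - 0.5) / 0.5       # 0 for a homogeneous system, approaching 1 when heterogeneous
    print(f"Lorenz coefficient: {L:.2f}")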

Keywords: Carbonate reservoirs, heterogeneity, homogeneous system, Dykstra-Parsons permeability variations (V), Lorenz coefficient (L).

3690 Research of Database Curriculum Construction under the Environment of Massive Open Online Courses

Authors: Wang Zhanquan, Yang Zeping, Gu Chunhua, Zhu Fazhi, Guo Weibin

Abstract:

Recently, Massive Open Online Courses (MOOCs) have become the new trend in education. There are many problems in the teaching process of the Database Principles curriculum in the MOOC environment, such as teaching ideas and theories that are out of touch with reality and the question of how to carry out technical teaching and interactive practice; thus, methods for the database course in the MOOC environment are proposed. Problem solving in this research follows three processes: proposing problems, solving problems, and inductive analysis. The present research includes the design of teaching contents, teaching methods in the classroom, a flipped classroom teaching mode in the MOOC environment, the learning flow method, and large practice homework. Database design ability is systematically improved by the proposed methods.

Keywords: Problem solving-driven, MOOCs, teaching art, learning flow.

3689 Texture Feature Extraction of Infrared River Ice Images using Second-Order Spatial Statistics

Authors: Bharathi P. T, P. Subashini

Abstract:

Ice cover has a significant impact on rivers, as it affects the ice melting capacity, which results in flooding, restricts navigation, and modifies the ecosystem and microclimate. River ice is made up of different ice types with varying thickness, so surveillance of river ice plays an important role. River ice types are captured using an infrared imaging camera, which captures images even at night. In this paper, the river ice infrared texture images are analysed using first-order and second-order statistical methods. The second-order statistical methods considered are the spatial gray level dependence method, the gray level run length method, and the gray level difference method. The performance of the feature extraction methods is evaluated using a Probabilistic Neural Network classifier, and it is found that the first-order statistical method and the second-order statistical methods alone yield low accuracy. The features extracted from the first-order and second-order statistical methods are therefore combined, and it is observed that these combined features (first-order statistical method + gray level run length method) provide higher accuracy than the features from the first-order or second-order statistical methods alone.
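
A minimal sketch of one second-order feature set (the spatial gray level dependence, or co-occurrence, method), assuming a recent scikit-image version; the random 8-bit image stands in for an infrared river-ice frame.

    # Gray-level co-occurrence features on a placeholder 8-bit image.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(0)
    ice_patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

    glcm = graycomatrix(ice_patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        print(prop, graycoprops(glcm, prop).mean())
    # First-order statistics (mean, skewness, kurtosis) and run-length features
    # would be concatenated with these values before feeding the PNN classifier.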

Keywords: Gray Level Difference Method, Gray Level Run Length Method, Kurtosis, Probabilistic Neural Network, Skewness, Spatial Gray Level Dependence Method.

3688 Bootstrap and MLS Methods-based Individual Bioequivalence Assessment

Authors: Kongsheng Zhang, Li Ge

Abstract:

Assessing bioequivalence is a one-sided hypothesis testing process. Bootstrap and modified large-sample (MLS) methods are considered for studying individual bioequivalence (IBE); the type I error and power of the hypothesis tests are simulated and compared with FDA (2001). The results show that the modified large-sample method is equivalent to the method of FDA (2001).
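
A minimal bootstrap sketch, a percentile confidence interval for a mean difference, illustrating the resampling machinery; the actual IBE criterion and the MLS algebra are not reproduced here, and the data are hypothetical.

    # Percentile bootstrap confidence interval for a test-minus-reference mean difference.
    import numpy as np

    rng = np.random.default_rng(0)
    test = rng.normal(100.0, 12.0, size=30)        # hypothetical test-formulation responses
    ref  = rng.normal(98.0, 10.0, size=30)         # hypothetical reference responses

    boot = np.array([
        rng.choice(test, test.size).mean() - rng.choice(ref, ref.size).mean()
        for _ in range(5000)
    ])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"95% bootstrap CI for the mean difference: [{lo:.2f}, {hi:.2f}]")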

Keywords: Individual bioequivalence, bootstrap, Bayesian bootstrap, modified large-sample.

3687 Transforming Construction: Integrating Off-Site Techniques and Advanced Technologies

Authors: Layla Mujahed, Gang Feng, Jianghua Wang

Abstract:

An increasing number of construction projects are adopting off-site construction techniques over traditional methods to address longstanding challenges. This research paper explores the integration of design for manufacture and assembly (DfMA), modern methods of construction (MMC), and building information modeling (BIM) within the construction industry. This study employs a mixed-methods approach, using case studies and a review of the existing literature, to examine the role and combined application of each methodology in building projects of varying scales and durations. The study focuses on application mechanisms, stakeholder engagement, knowledge sharing, feedback, and performance metrics to explore the benefits, challenges, and transformative potential of integrating these methodologies. The findings indicate that the synergy among DfMA, MMC, and BIM significantly improves project efficiency, cost reduction, and overall quality. Standardization, increased collaboration among stakeholders, and the adoption of advanced technologies are also highlighted as necessary considerations to fully realize the benefits of this integration. The paper concludes with practical recommendations for industry practitioners seeking to efficiently implement these integrated approaches.

Keywords: BIM, building information modeling, case study, DfMA, design for manufacture and assembly, MMC, modern methods of construction, prefabrication.

3686 Cloud Computing Support for Diagnosing Researches

Authors: A. Amirov, O. Gerget, V. Kochegurov

Abstract:

One of the main biomedical problems lies in detecting dependencies in semi-structured data. The solution includes a biomedical portal and algorithms (integral health rating criteria, multidimensional data visualization methods). The biomedical portal allows diagnostic and research data to be processed in parallel using Microsoft System Center 2012 and Windows HPC Server cloud technologies. The service does not allow the user to see the internal calculations; instead, it provides a practical interface. When data are sent for processing, the user may track the status of the task and will obtain the results as soon as the computation is completed. The service includes its own algorithms and allows diagnosing and predicting medical cases. The approved methods are based on complex-system entropy methods, algorithms for determining the energy patterns of development and trajectory models of biological systems, and a logical-probabilistic approach with image blurring.

Keywords: Biomedical portal, cloud computing, diagnostic and prognostic research, mathematical data analysis.

3685 Evaluation of Ensemble Classifiers for Intrusion Detection

Authors: M. Govindarajan

Abstract:

One of the major developments in machine learning in the past decade is the ensemble method, which finds a highly accurate classifier by combining many moderately accurate component classifiers. In this research work, new ensemble classification methods are proposed: a homogeneous ensemble classifier using bagging and a heterogeneous ensemble classifier using arcing, and their performances are analyzed in terms of accuracy. A classifier ensemble is designed using the Radial Basis Function (RBF) and the Support Vector Machine (SVM) as base classifiers. The feasibility and the benefits of the proposed approaches are demonstrated by means of standard intrusion detection datasets. The main originality of the proposed approach is based on three main parts: a preprocessing phase, a classification phase, and a combining phase. A wide range of comparative experiments is conducted on standard intrusion detection datasets. The performance of the proposed homogeneous and heterogeneous ensemble classifiers is compared to the performance of other standard homogeneous and heterogeneous ensemble methods; the standard homogeneous ensemble methods include error-correcting output codes (ECOC) and Dagging, and the heterogeneous ensemble methods include majority voting and stacking. The proposed ensemble methods provide a significant improvement in accuracy compared to the individual classifiers: the proposed bagged RBF and SVM perform significantly better than ECOC and Dagging, and the proposed hybrid RBF-SVM performs significantly better than voting and stacking. Heterogeneous models also exhibit better results than homogeneous models on standard intrusion detection datasets.
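
A minimal sketch of the two ensemble ideas with scikit-learn stand-ins: a homogeneous bagged SVM and a heterogeneous soft-voting ensemble. The synthetic data replace the intrusion detection benchmark, and an RBF-kernel SVM stands in for the RBF network base classifier.

    # Homogeneous (bagging) and heterogeneous (voting) ensembles on synthetic data.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier, VotingClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=600, n_features=20, random_state=0)

    bagged_svm = BaggingClassifier(SVC(kernel="rbf", gamma="scale"), n_estimators=10, random_state=0)
    hybrid = VotingClassifier([("rbf", SVC(kernel="rbf", probability=True)),
                               ("lin", SVC(kernel="linear", probability=True))], voting="soft")

    for name, clf in [("bagged RBF-kernel SVM", bagged_svm), ("hybrid RBF+linear SVM", hybrid)]:
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name}: {acc:.3f}")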

Keywords: Data mining, ensemble, radial basis function, support vector machine, accuracy.

3684 Application of Artificial Neural Network in Assessing Fill Slope Stability

Authors: An-Jui. Li, Kelvin Lim, Chien-Kuo Chiu, Benson Hsiung

Abstract:

This paper details the utilization of artificial intelligence (AI) in the field of slope stability whereby quick and convenient solutions can be obtained using the developed tool. The AI tool used in this study is the artificial neural network (ANN), while the slope stability analysis methods are the finite element limit analysis methods. The developed tool allows for the prompt prediction of the safety factors of fill slopes and their corresponding probability of failure (depending on the degree of variation of the soil parameters), which can give the practicing engineer a reasonable basis in their decision making. In fact, the successful use of the Extreme Learning Machine (ELM) algorithm shows that slope stability analysis is no longer confined to the conventional methods of modeling, which at times may be tedious and repetitive during the preliminary design stage where the focus is more on cost saving options rather than detailed design. Therefore, similar ANN-based tools can be further developed to assist engineers in this aspect.
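
A bare-bones Extreme Learning Machine sketch: a random hidden layer with output weights solved in closed form by least squares. The inputs and target below are synthetic stand-ins for soil parameters and safety factors, not the finite element limit analysis results used in the paper.

    # Minimal ELM regressor: random input weights, least-squares output weights.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(500, 4))     # e.g. cohesion, friction angle, unit weight, slope angle
    y = 1.2 + 0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.6 * X[:, 3] + 0.02 * rng.normal(size=500)

    n_hidden = 50
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights in closed form

    y_hat = H @ beta
    print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))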

Keywords: Landslide, limit analysis, ANN, soil properties.

3683 Virtual Reality Learning Environment in Embryology Education

Authors: Salsabeel F. M. Alfalah, Jannat F. Falah, Nadia Muhaidat, Amjad Hudaib, Diana Koshebye, Sawsan AlHourani

Abstract:

Educational technology is changing the way students engage and interact with learning materials, and this has improved the learning process across various subjects. Virtual Reality (VR) applications are considered one of the evolving methods that have contributed to enhancing medical education. This paper utilizes VR to provide a solution that improves the delivery of the subject of embryology to medical students and facilitates the teaching process by providing a useful aid to lecturers, while demonstrating the effectiveness of this new technology in this particular area. After evaluating the current teaching methods and identifying students' needs, a VR system was designed that demonstrates, in an interactive fashion, the development of the human embryo from fertilization to week ten of intrauterine development. This system aims to overcome some of the problems faced by students with the current educational methods and to increase the efficacy of the learning process.

Keywords: Virtual reality, student assessment, medical education, 3D, embryology.

3682 Ranking Fuzzy Numbers Based on Lexicographical Ordering

Authors: B. Farhadinia

Abstract:

Although many methods for ranking fuzzy numbers have been discussed broadly so far, most of them contain shortcomings such as the requirement of complicated calculations, inconsistency with human intuition, and lack of discrimination. The motivation of this study is to develop a model for ranking fuzzy numbers based on lexicographical ordering, which provides decision-makers with a simple and efficient algorithm to generate an ordering founded on precedence. The main emphasis here is put on ease of use and reliability. The effectiveness of the proposed method is finally demonstrated through a comprehensive comparison of different ranking methods with the present one.

Keywords: Ranking fuzzy numbers, Lexicographical ordering.

3681 Alternative Methods to Rank the Impact of Object Oriented Metrics in Fault Prediction Modeling using Neural Networks

Authors: Kamaldeep Kaur, Arvinder Kaur, Ruchika Malhotra

Abstract:

The aim of this paper is to rank the impact of object-oriented (OO) metrics in fault prediction modeling using artificial neural networks (ANNs). Past studies on the empirical validation of object-oriented metrics as fault predictors using ANNs have focused on the predictive quality of neural networks versus standard statistical techniques. In this empirical study, we turn our attention to the capability of ANNs in ranking the impact of these explanatory metrics on fault proneness. In the ANN data analysis approach, there is no clear method of ranking the impact of individual metrics. Five ANN-based techniques that rank object-oriented metrics in predicting the fault proneness of classes are studied: i) the overall connection weights method, ii) Garson's method, iii) the partial derivatives method, iv) the input perturbation method, and v) the classical stepwise method. We develop and evaluate different prediction models based on the rankings of the metrics produced by the individual techniques. The models based on the overall connection weights and partial derivatives methods have been found to be the most accurate.
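
A minimal sketch of the overall connection weights idea on a one-hidden-layer network, with synthetic data standing in for the OO metric suite; the other four ranking techniques are not shown.

    # Rank input metrics by summing input-to-hidden weights times hidden-to-output weights.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 6))                   # six OO metrics (placeholder)
    y = (X[:, 0] + 0.5 * X[:, 2] + 0.2 * rng.normal(size=300) > 0).astype(int)  # fault-prone or not

    net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=3000, random_state=0).fit(X, y)
    w_in, w_out = net.coefs_                        # shapes (6, 8) and (8, 1)

    importance = (w_in @ w_out).ravel()             # overall connection weight per metric
    ranking = np.argsort(-np.abs(importance))
    for rank, idx in enumerate(ranking, start=1):
        print(f"rank {rank}: metric_{idx}  score {importance[idx]:+.3f}")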

Keywords: Artificial Neural Networks (ANNs), Backpropagation, Fault Prediction Modeling.
