Search results for: parametric survival models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8161

6931 Nonlinear Free Vibrations of Functionally Graded Cylindrical Shells

Authors: Alexandra Andrade Brandão Soares, Paulo Batista Gonçalves

Abstract:

Using a modal expansion that satisfies the boundary and continuity conditions and captures the modal couplings characteristic of cylindrical shells in the nonlinear regime, the equations of motion are discretized using the Galerkin method. The resulting algebraic equations are solved by the Newton-Raphson method, thus obtaining the nonlinear frequency-amplitude relation. Finally, a parametric analysis is conducted to study the influence of the geometry of the shell, the material gradient of the functionally graded material and the vibration modes on the degree and type of nonlinearity of the cylindrical shell, which is the main contribution of this research work.
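As a rough sketch of the numerical step described above — solving the Galerkin-reduced nonlinear algebraic system by Newton-Raphson to obtain the frequency-amplitude relation — the following Python fragment implements a generic Newton-Raphson iteration with a finite-difference Jacobian; the two-mode Duffing-like residual function and all coefficients are illustrative assumptions, not the shell equations of the paper.

```python
import numpy as np

def newton_raphson(residual, x0, tol=1e-10, max_iter=50, h=1e-7):
    """Solve residual(x) = 0 by Newton-Raphson with a finite-difference Jacobian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            return x
        n = x.size
        J = np.empty((n, n))          # J[i, j] = d r_i / d x_j
        for j in range(n):
            dx = np.zeros(n)
            dx[j] = h
            J[:, j] = (residual(x + dx) - r) / h
        x = x - np.linalg.solve(J, r)
    raise RuntimeError("Newton-Raphson did not converge")

# Illustrative frequency-amplitude relation for two coupled modes:
# unknowns x = (a1, a2) are modal amplitudes at a prescribed frequency omega.
def residual(x, omega=1.2):
    a1, a2 = x
    return np.array([
        (1.0 - omega**2) * a1 + 0.75 * a1**3 + 0.10 * a1 * a2**2 - 0.05,
        (4.0 - omega**2) * a2 + 0.50 * a2**3 + 0.10 * a2 * a1**2,
    ])

amplitudes = newton_raphson(lambda x: residual(x, omega=1.2), x0=[0.1, 0.0])
print(amplitudes)
```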

Keywords: cylindrical shells, dynamics, functionally graded material, nonlinear vibrations

Procedia PDF Downloads 47
6930 Novel Microstrip MIMO Antenna for 3G/4G Applications

Authors: Sandro Samir Nasief, Hussein Hamed Ghouz, Mohamed Fathy

Abstract:

A compact ultra-wide band microstrip MIMO antenna is introduced. The antenna consists of two elements, each of size 24 × 24 mm², while the total MIMO size is 58 × 24 mm² after adding the spacing between the MIMO elements and a decoupling circuit. The first element covers 3.29 to 6.9 GHz using a digital ground, and the second covers 8.76 to 13.27 GHz using a defective ground. This type of antenna is used for 3G and 4G applications. The antenna structure and a parametric study (reflection coefficients, gain, coupling and decoupling) are presented.

Keywords: micro-strip antenna, MIMO, digital ground, defective ground, decouple circuit, bandwidth

Procedia PDF Downloads 348
6929 Using Simulation Modeling Approach to Predict USMLE Steps 1 and 2 Performances

Authors: Chau-Kuang Chen, John Hughes, Jr., A. Dexter Samuels

Abstract:

Prediction models for the United States Medical Licensing Examination (USMLE) Step 1 and Step 2 performances were constructed using a Monte Carlo simulation modeling approach via linear regression. The purpose of this study was to build robust simulation models that accurately identify the most important predictors and yield valid range estimates of the Step 1 and Step 2 scores. The simulation modeling approach proved an effective way of predicting student performance on licensure examinations. Sensitivity analysis (a/k/a what-if analysis) in the simulation models was also used to predict how the Step 1 and Step 2 scores are affected by changes in the National Board of Medical Examiners (NBME) Basic Science Subject Board scores. In addition, the study results indicated that the Medical College Admission Test (MCAT) Verbal Reasoning score and the Step 1 score were significant predictors of Step 2 performance. Hence, institutions could screen qualified applicants for interviews and document the effectiveness of the basic science education program based on the simulation results.
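A hedged sketch of wrapping Monte Carlo simulation around a fitted linear regression to obtain range estimates and a what-if (sensitivity) analysis of the kind described; the coefficients, residual standard deviation and predictor names (nbme, mcat_vr) are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative fitted coefficients of a linear regression for the Step 1 score
# (intercept, NBME basic-science subject score, MCAT Verbal Reasoning score)
beta = np.array([60.0, 1.8, 1.1])
sigma_resid = 8.0          # residual standard deviation of the fitted model

def simulate_step1(nbme, mcat_vr, n_draws=10_000):
    """Monte Carlo draws of the predicted Step 1 score for given predictor values."""
    mean = beta[0] + beta[1] * nbme + beta[2] * mcat_vr
    return mean + rng.normal(0.0, sigma_resid, size=n_draws)

# Range estimate (90% interval) for a hypothetical applicant
draws = simulate_step1(nbme=75, mcat_vr=10)
lo, hi = np.percentile(draws, [5, 95])
print(f"Step 1: median {np.median(draws):.1f}, 90% range [{lo:.1f}, {hi:.1f}]")

# What-if (sensitivity) analysis: shift the NBME subject score by +/- 5 points
for delta in (-5, 0, +5):
    d = simulate_step1(nbme=75 + delta, mcat_vr=10)
    print(f"NBME {75 + delta}: mean predicted Step 1 = {d.mean():.1f}")
```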

Keywords: prediction model, sensitivity analysis, simulation method, USMLE

Procedia PDF Downloads 329
6928 Mathematical Modeling of the Fouling Phenomenon in Ultrafiltration of Latex Effluent

Authors: Amira Abdelrasoul, Huu Doan, Ali Lohi

Abstract:

An efficient and well-planned ultrafiltration process is becoming a necessity for monetary returns in industrial settings. The aim of the present study was to develop a mathematical model for accurate prediction of ultrafiltration membrane fouling by latex effluent, applied to homogeneous and heterogeneous membranes with uniform and non-uniform pore sizes, respectively. Models were also developed for accurate prediction of power consumption that can handle large-scale applications. The model incorporated the fouling attachments as well as chemical and physical factors in membrane fouling for accurate prediction and scale-up application. Polycarbonate and polysulfone flat membranes, with a pore size of 0.05 µm and a molecular weight cut-off of 60,000, respectively, were used under a constant feed flow rate and a cross-flow mode in ultrafiltration of the simulated paint effluent. Furthermore, hydrophilic Ultrafilic and hydrophobic PVDF membranes with a MWCO of 100,000 were used to test the reliability of the models. Monodisperse particles of 50 nm and 100 nm in diameter, and a latex effluent with a wide range of particle size distributions, were utilized to validate the models. The aggregation and the sphericity of the particles were found to have a significant effect on membrane fouling.

Keywords: membrane fouling, mathematical modeling, power consumption, attachments, ultrafiltration

Procedia PDF Downloads 462
6927 Knitting Stitches’ Manipulation for Catenary Textile Structures

Authors: Virginia Melnyk

Abstract:

This paper explores the design of a catenary structure using knitted textiles. Grasshopper and Kangaroo parametric software are used to simulate and pre-design an overall form, which is then translated into a pattern that can be made with hand-manipulated stitches on a knitting machine. The textile takes advantage of the structure of knitted materials and their ability to stretch. Using different types of stitches to control the amount of stretch that can occur in portions of the textile generates the overall formal design. The textile is then hardened in an upside-down hanging position and flipped right-side-up, becoming a structural catenary form. The resulting design is used as a small cat house for a cat to sit inside and climb on top of.

Keywords: architectural materials, catenary structures, knitting fabrication, textile design

Procedia PDF Downloads 173
6926 Stochastic Richelieu River Flood Modeling and Comparison of Flood Propagation Models: WMS (1D) and SRH (2D)

Authors: Maryam Safrai, Tewfik Mahdi

Abstract:

This article presents the stochastic modeling of the Richelieu River flood in Quebec, Canada, which occurred in the spring of 2011. With the aid of the one-dimensional Watershed Modeling System (WMS, v.10.1) and HEC-RAS (v.4.1) as the flood simulator, the delineation of the probabilistic flooded areas was considered. Based on the Monte Carlo method, WMS (v.10.1) delineated the probabilistic flooded areas with corresponding occurrence percentages. Furthermore, the results of this one-dimensional model were compared with the results of a two-dimensional model (SRH-2D) to evaluate the efficiency and precision of each applied model. Based on this comparison, the computational process in the two-dimensional model is longer and more complicated than in the brief one-dimensional one. Although two-dimensional models are more accurate than the one-dimensional method, according to existing modellers, the delineation of probabilistic flooded areas based on the Monte Carlo method is achievable with a one-dimensional modeller. The software applied in this case study fully met the research objectives. As a result, flood risk maps of the Richelieu River produced with the two applied models (1D, 2D) could elucidate the flood risk factors in hydrological, hydraulic, and managerial terms.

Keywords: flood modeling, HEC-RAS, model comparison, Monte Carlo simulation, probabilistic flooded area, SRH-2D, WMS

Procedia PDF Downloads 129
6925 Analysis of Autoantibodies to the S-100 Protein, NMDA, and Dopamine Receptors in Children with Type 1 Diabetes Mellitus

Authors: Yuri V. Bykov, V. A. Baturin

Abstract:

Aim of the study: The aim of the study was to perform a comparative analysis of the levels of autoantibodies (AAB) to the S-100 protein and to the dopamine and NMDA receptors in children with type 1 diabetes mellitus (DM) in therapeutic remission. Materials and methods: Blood serum obtained from 42 children aged 4 to 17 years (20 boys and 22 girls) was analyzed. Twenty-one of these children had a diagnosis of type 1 DM and were in therapeutic remission (study group); the mean duration of disease in these children was 9.6±0.36 years. Children without DM were included in a group of "apparently healthy children" (21 children, comparison group). AAB to the S-100 protein and to the dopamine and NMDA receptors were measured by ELISA. The normal range of IgG AAB was specified as up to 10 µg/mL. To compare the central parameters of the groups, parametric and non-parametric methods (Student's t-test or the Mann-Whitney U test) were used, with the level of significance for inter-group comparisons set at p<0.05. Results: The mean levels of AAB to the S-100B protein were significantly higher (p=0.0045) in children with DM (16.84±1.54 µg/mL) than in the "apparently healthy children" (2.09±0.05 µg/mL); these elevated levels may indicate damage to brain tissue in children with type 1 DM. The mean levels of AAB to NMDA receptors were also higher in patients with type 1 DM than in the comparison group, at 13.16±2.07 µg/mL versus 1.304±0.05 µg/mL (p=0.0021), which may indicate a change in the activity of the glutamatergic system and, in turn, the presence of excitotoxicity. The mean levels of AAB to dopamine receptors were likewise higher (p=0.0082) in the study group than in the comparison group (40.47±2.31 µg/mL versus 3.91±0.09 µg/mL), suggesting altered activity of the dopaminergic system in children with DM; this can also be viewed as indirect evidence of altered activity of the brain's glutamatergic system. A difference was also detected between the mean values of the measured AABs depending on the duration of the disease: mean AAB values were significantly higher in patients whose disease had lasted more than five years. Conclusions: The elevated mean levels of AAB to the S-100B protein may indicate damage to brain tissue in the setting of excitotoxicity in children with type 1 DM. The elevated levels of AAB to NMDA and dopamine receptors may indicate activation of the glutamatergic and dopaminergic systems. The observed abnormalities indicate the presence of central nervous system damage in children with type 1 DM, with a tendency towards elevation of the levels of the studied AABs with disease progression.
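The two-group comparisons described above (Student's t-test or Mann-Whitney U, significance at p<0.05) can be reproduced in outline with SciPy, as in the sketch below; the arrays are synthetic placeholders, not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic AAB levels (µg/mL) for the DM group and the comparison group
aab_dm = rng.normal(16.8, 7.0, size=21)
aab_healthy = rng.normal(2.1, 0.3, size=21)

# Parametric comparison (Welch's t-test, no equal-variance assumption)
t_stat, p_t = stats.ttest_ind(aab_dm, aab_healthy, equal_var=False)

# Non-parametric alternative when normality is doubtful
u_stat, p_u = stats.mannwhitneyu(aab_dm, aab_healthy, alternative="two-sided")

alpha = 0.05
print(f"t-test: p = {p_t:.4f} -> {'significant' if p_t < alpha else 'not significant'}")
print(f"Mann-Whitney U: p = {p_u:.4f} -> {'significant' if p_u < alpha else 'not significant'}")
```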

Keywords: autoantibodies, brain damage, children, diabetes mellitus

Procedia PDF Downloads 82
6924 Speed Power Control of Doubly Fed Induction Generator

Authors: Ali Mausmi, Ahmed Abbou, Rachid El Akhrif

Abstract:

This paper aims to reduce the chattering phenomenon arising from sliding mode control applied to a wind energy conversion system based on the doubly fed induction generator (DFIG). Our goal is to offset the effect of parametric uncertainties and come as close as possible to the dynamic response dictated by the control law in the ideal case, and therefore to force the active and reactive power generated by the DFIG to accurately follow the reference values provided to it. The simulation results using Matlab/Simulink demonstrate the efficiency and performance of the proposed technique while maintaining the simplicity of control by first-order sliding mode.
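As a toy illustration only — not the DFIG controller of the paper — the sketch below shows a first-order sliding mode law (equivalent control plus switching term) and how replacing the sign function with a boundary-layer saturation reduces chattering; the scalar plant, gains and disturbance are assumptions.

```python
import numpy as np

# Scalar uncertain plant: dx/dt = a*x + b*u + d(t), with nominal a_nom, b_nom
a_true, b_true = -1.3, 2.2          # "real" plant (unknown to the controller)
a_nom, b_nom = -1.0, 2.0            # nominal model used by the control law
K = 5.0                             # switching gain (must dominate the uncertainty)
phi = 0.05                          # boundary-layer width (phi -> 0 gives pure sign())

def smc(x, x_ref, dx_ref):
    """First-order sliding mode control: equivalent control + saturated switching term."""
    s = x - x_ref                                   # sliding variable
    u_eq = (dx_ref - a_nom * x) / b_nom             # equivalent control (nominal model)
    u_sw = -K * np.clip(s / phi, -1.0, 1.0)         # smoothed sign(s) to limit chattering
    return u_eq + u_sw

dt, T = 1e-3, 2.0
x = 0.0
for ti in np.arange(0.0, T, dt):
    x_ref, dx_ref = np.sin(ti), np.cos(ti)          # reference trajectory
    u = smc(x, x_ref, dx_ref)
    d = 0.3 * np.sin(20 * ti)                       # bounded disturbance
    x += dt * (a_true * x + b_true * u + d)         # Euler step of the true plant

print(f"final state {x:.3f} vs reference {np.sin(T):.3f}")
```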

Keywords: control of speed, correction of the equivalent command, induction generator, sliding mode

Procedia PDF Downloads 366
6923 Anticoccidial Effects of the Herbal Mixture in Broilers after Eimeria spp. Infection

Authors: Yang-Ho Jang, Soon-Ok Jee, Hae-Chul Park, Jeong-Woo Kang, Byung-Jae So, Sung-Shik Shin, Kyu-Sung Ahn, Kwang-Jick Lee

Abstract:

Introduction: Antibiotics have been used as feed additives for growth promotion and performance in food-producing animals. However, the possibility of selecting for antimicrobial resistance and concerns about residues in animal products led to a ban on the use of antibiotics in farm animals in Korea in 2011. This policy will soon be extended to anticoccidial drugs, although for the time being these are still allowed in the diet for the treatment and control of enteric necrosis in poultry. Therefore, substantial focus has been given to finding alternatives to antimicrobial agents. Several phytogenic materials have been reported to have positive effects on coccidiosis. This study evaluated the anticoccidial effect of an oregano oil-based herbal mixture against Eimeria spp. in poultry. Materials and Methods: One-day-old broiler chickens, divided into six groups of 30 chickens each, were used in this study. The herbal mixtures were provided freely in the drinking water as follows: groups 1 and 2 received no herbal mixture and served as uninfected and Eimeria spp.-infected controls, respectively; group 3 received 0.2 ml/L of oregano oil; group 4 received 0.2 ml/L of oregano oil and Sanguisorbae radix; group 5 received 0.2 ml/L of Sanguisorbae radix; and the last group was fed a diclazuril diet as a positive control. Chickens were infected with sporulated Eimeria spp. at 14 days of age. Following infection, survival rate, bloody diarrhea, OPG (oocysts per gram) and feed conversion ratios were determined. The experimental period lasted 4 weeks. Results: The herbal mixture feeding groups (groups 3, 4 and 5) showed lower feed conversion ratios compared with the negative control. The oregano oil group and the positive control group recorded the highest survival rates. Bloody diarrhea was scored from 0 to 5; the herbal mixture feeding groups scored 2, 3 and 1, respectively, whereas group 2 (infected, untreated) scored 4. OPG results in the herbal mixture feeding groups were 3 to 4 times higher than in the diclazuril diet feeding group. Conclusions: These results show that the oregano oil and Sanguisorbae radix mixture may have an anticoccidial effect and may also affect chick performance.

Keywords: anticoccidial effects, oregano oil based herb mixture, herbal mixture, antibiotics

Procedia PDF Downloads 547
6922 Designing the Maturity Model of Smart Digital Transformation through the Foundation Data Method

Authors: Mohammad Reza Fazeli

Abstract:

Nowadays, the fourth industrial revolution, known as the digital transformation of industries, is seen as one of the most important subjects in the history of structural revolutions, having led to the high-tech and tactical dominance of organizations. Despite these benefits, the undefined and non-transparent nature of the after-effects of investing in digital transformation has deterred many organizations from attempting this area of the industry. One of the important frameworks for understanding digital transformation in organizations is the digital transformation maturity model. This model includes two main parts: digital transformation maturity dimensions and digital transformation maturity stages. Mediating factors between digital maturity and organizational performance at the individual level (e.g., motivations, attitudes) and at the organizational level (e.g., organizational culture) should be considered. For successful technology adoption processes, organizational development and human resources must go hand in hand and be supported by a sound communication strategy. Maturity models are developed to help organizations by providing broad guidance and a roadmap for improvement. However, a systematic review and analysis of the literature showed that none of the 18 maturity models in the field of digital transformation fully meets all the criteria of appropriateness, completeness, clarity, and objectivity. A maturity assessment framework potentially helps systematize assessment processes that create opportunities for change in processes and organizations enabled by digital initiatives, as well as long-term improvements at the project portfolio level. Cultural characteristics reflecting digital culture are not systematically integrated, and specific digital maturity models for the service sector are less clearly presented. It is also evident that research on the maturity of digital transformation as a holistic concept is scarce and needs more attention in future research.

Keywords: digital transformation, organizational performance, maturity models, maturity assessment

Procedia PDF Downloads 78
6921 Series Network-Structured Inverse Models of Data Envelopment Analysis: Pitfalls and Solutions

Authors: Zohreh Moghaddas, Morteza Yazdani, Farhad Hosseinzadeh

Abstract:

Nowadays, data envelopment analysis (DEA) models featuring network structures have gained widespread usage for evaluating the performance of production systems and activities (decision-making units, DMUs) across diverse fields. By examining the relationships between the internal stages of the network, these models offer valuable insights to managers and decision-makers regarding the performance of each stage and its impact on the overall network. To further empower system decision-makers, the inverse data envelopment analysis (IDEA) model has been introduced. This model allows crucial parameters to be estimated while keeping the efficiency score unchanged or improved, enabling analysis of the sensitivity of system inputs or outputs according to managers' preferences. This empowers managers to apply their preferences and policies to resources, such as inputs and outputs, and to analyze aspects like production, resource allocation processes, and resource efficiency enhancement within the system. The results obtained can be instrumental in making informed decisions in the future. The main result of this study is an analysis of the infeasibility and incorrect estimation that may arise in the theory and application of inverse data envelopment analysis models with network structures. To address these pitfalls, novel protocols are proposed to circumvent the shortcomings effectively. Subsequently, several theoretical and applied problems are examined and resolved through insightful case studies.
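For readers unfamiliar with the underlying optimization, the sketch below sets up the standard input-oriented CCR envelopment linear program for a single DMU with SciPy; it illustrates only the single-stage building block, not the network or inverse formulations developed in the paper, and the input/output data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Inputs X (n_dmu x n_inputs) and outputs Y (n_dmu x n_outputs) -- illustrative data
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 4.0], [5.0, 2.0]])
Y = np.array([[1.0], [1.0], [2.0], [1.5]])

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o: min theta s.t. envelopment constraints."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables z = [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]
    A_ub, b_ub = [], []
    for i in range(m):                      # sum_j lambda_j x_ij - theta * x_io <= 0
        A_ub.append(np.r_[-X[o, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):                      # -sum_j lambda_j y_rj <= -y_ro
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(X.shape[0]):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```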

Keywords: inverse models of data envelopment analysis, series network, estimation of inputs and outputs, efficiency, resource allocation, sensitivity analysis, infeasibility

Procedia PDF Downloads 33
6920 Interaction between Space Syntax and Agent-Based Approaches for Vehicle Volume Modelling

Authors: Chuan Yang, Jing Bie, Panagiotis Psimoulis, Zhong Wang

Abstract:

Modelling and understanding vehicle volume distribution over the urban network are essential for urban design and transport planning. The space syntax approach has been widely applied as the main conceptual and methodological framework for contemporary vehicle volume models, with the help of the statistical method of multiple regression analysis (MRA). However, the MRA model with space syntax variables shows a limitation in predicting vehicle volume, as it cannot account for the crossed effect of urban configurational characteristics and socio-economic factors. The aim of this paper is to construct models that capture the combined impact of the street network structure and socio-economic factors. We present a multilevel linear (ML) and an agent-based (AB) vehicle volume model at the urban scale, built on the space syntax theoretical framework. The ML model allows random effects of urban configurational characteristics in different urban contexts, and the AB model was developed by incorporating transformed space syntax components of the MRA models into the agents’ spatial behaviour. The three models were implemented in the same urban environment. The ML model exhibits superiority over the original MRA model in identifying the relative impacts of the configurational characteristics and macro-scale socio-economic factors that shape vehicle movement distribution over the city. Compared with the ML model, the suggested AB model demonstrated the ability to estimate vehicle volume in the urban network considering the combined effects of configurational characteristics and land-use patterns at the street segment level.
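A hedged sketch of a multilevel (random-intercept and random-slope) model of the kind described, using statsmodels on synthetic data; the grouping factor (district) and covariate names (integration, land_use) are assumptions standing in for the urban contexts and space syntax variables of the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Synthetic street-segment data grouped by urban context ("district")
n = 400
df = pd.DataFrame({
    "district": rng.integers(0, 8, size=n),        # grouping factor (urban context)
    "integration": rng.normal(0.0, 1.0, size=n),   # space syntax configurational measure
    "land_use": rng.normal(0.0, 1.0, size=n),      # socio-economic / land-use covariate
})
district_effect = rng.normal(0.0, 0.5, size=8)
df["vehicle_volume"] = (2.0 + 0.8 * df["integration"] + 0.4 * df["land_use"]
                        + district_effect[df["district"]] + rng.normal(0.0, 0.3, size=n))

# Multilevel linear model: fixed effects for the covariates,
# random intercept and random slope for integration, grouped by district
model = smf.mixedlm("vehicle_volume ~ integration + land_use",
                    data=df, groups=df["district"], re_formula="~integration")
result = model.fit()
print(result.summary())
```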

Keywords: space syntax, vehicle volume modeling, multilevel model, agent-based model

Procedia PDF Downloads 127
6919 A Machine Learning Approach for Intelligent Transportation System Management on Urban Roads

Authors: Ashish Dhamaniya, Vineet Jain, Rajesh Chouhan

Abstract:

Traffic management is one of the biggest issues on urban roads in almost all metropolitan cities in India. Speed is one of the critical traffic parameters for effective Intelligent Transportation System (ITS) implementation, as it determines the arrival rate of vehicles at intersections, which are the main points of congestion. The study aimed to leverage Machine Learning (ML) models to produce precise predictions of speed on urban roadway links. The research objective was to assess how categorized traffic volume and road width, serving as variables, influence speed prediction. Four tree-based regression models, namely Decision Tree (DT), Random Forest (RF), Extra Tree (ET), and Extreme Gradient Boost (XGB), are employed for this purpose. The models' performances were validated using test data, and the results demonstrate that Random Forest surpasses the other machine learning techniques and a conventional utility theory-based model in speed prediction. The study is useful for managing urban roadway network performance under mixed traffic conditions and for effective implementation of ITS.
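A minimal sketch of the tree-based regression workflow described, with Random Forest shown (the other models are drop-in replacements), using scikit-learn on synthetic link data; the feature construction is an assumption based on the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, r2_score

rng = np.random.default_rng(7)

# Synthetic link-level data: categorized traffic volume (veh/h) and road width (m)
n = 1000
volume = rng.uniform(200, 3000, size=n)
width = rng.choice([7.0, 10.5, 14.0], size=n)
speed = 60 - 0.01 * volume + 1.5 * width + rng.normal(0, 3, size=n)   # stream speed (km/h)

X = np.column_stack([volume, width])
X_train, X_test, y_train, y_test = train_test_split(X, speed, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

print(f"MAE = {mean_absolute_error(y_test, pred):.2f} km/h, R2 = {r2_score(y_test, pred):.3f}")
print("feature importances (volume, width):", model.feature_importances_)
```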

Keywords: stream speed, urban roads, machine learning, traffic flow

Procedia PDF Downloads 54
6918 The Model Establishment and Analysis of TRACE/FRAPTRAN for Chinshan Nuclear Power Plant Spent Fuel Pool

Authors: J. R. Wang, H. T. Lin, Y. S. Tseng, W. Y. Li, H. C. Chen, S. W. Chen, C. Shih

Abstract:

TRACE is developed by the U.S. NRC for nuclear power plant (NPP) safety analysis. We focus on the establishment and application of TRACE/FRAPTRAN/SNAP models for the Chinshan NPP (BWR/4) spent fuel pool in this research. The geometry of the spent fuel pool is 12.17 m × 7.87 m × 11.61 m. In this study, there are three TRACE/SNAP models: a one-channel, a two-channel, and a multi-channel TRACE/SNAP model. Additionally, a cooling system failure of the spent fuel pool was simulated and analyzed using these models. According to the analysis results, the peak cladding temperature response was more accurate in the multi-channel TRACE/SNAP model. The results showed that uncovery of the fuel occurred 2.7 days after the cooling system failed. In order to estimate the detailed fuel rod performance, the FRAPTRAN code was used in this research. According to the FRAPTRAN results, the highest cladding temperature was located at node 21 of the fuel rod (the highest node being node 23), and the cladding burst occurred roughly after 3.7 days.

Keywords: TRACE, FRAPTRAN, BWR, spent fuel pool

Procedia PDF Downloads 346
6917 Analytical Description of Disordered Structures in Continuum Models of Pattern Formation

Authors: Gyula I. Tóth, Shaho Abdalla

Abstract:

Even though numerical simulations indeed have a significant precursory/supportive role in exploring the disordered phase displaying no long-range order in pattern formation models, studying the stability properties of this phase and determining the order of the ordered-disordered phase transition in these models necessitate an analytical description of the disordered phase. First, we will present the results of a comprehensive statistical analysis of a large number (1,000-10,000) of numerical simulations in the Swift-Hohenberg model, where the bulk disordered (or amorphous) phase is stable. We will show that the average free energy density (over configurations) converges, while the variance of the energy density vanishes with increasing system size in numerical simulations, which suggests that the disordered phase is a thermodynamic phase (i.e., its properties are independent of the configuration in the macroscopic limit). Furthermore, the structural analysis of this phase in Fourier space suggests that the phase can be modeled by a colored isotropic Gaussian noise, where any instant of the noise describes a possible configuration. Based on these results, we developed a general mathematical framework for finding a pool of solutions to partial differential equations in the sense of a continuous probability measure, which we will present briefly. Applying the general idea to the Swift-Hohenberg model, we show that the amorphous phase can be found and its properties can be determined analytically. As the general mathematical framework is not restricted to continuum theories, we hope that the proposed methodology will open a new chapter in studying disordered phases.
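For concreteness, the sketch below integrates a cubic Swift-Hohenberg equation, ∂u/∂t = ru − (1+∇²)²u − u³, with a semi-implicit pseudo-spectral scheme and evaluates the average free-energy density of the relaxed configuration — the type of per-configuration quantity averaged in the statistical analysis described; the nonlinearity and parameter values are illustrative and may differ from those used in the paper.

```python
import numpy as np

# Semi-implicit pseudo-spectral integration of du/dt = r*u - (1 + laplacian)^2 u - u^3
# on a periodic square domain (parameters illustrative).
N, L, r = 128, 32 * np.pi, 0.2
dt, steps = 0.5, 2000
rng = np.random.default_rng(3)

k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
kx, ky = np.meshgrid(k, k, indexing="ij")
k2 = kx**2 + ky**2
lin = r - (1.0 - k2) ** 2              # linear operator in Fourier space

u = 0.1 * rng.standard_normal((N, N))  # random initial configuration
u_hat = np.fft.fft2(u)
for _ in range(steps):
    nonlin_hat = np.fft.fft2(-np.real(np.fft.ifft2(u_hat)) ** 3)
    u_hat = (u_hat + dt * nonlin_hat) / (1.0 - dt * lin)   # semi-implicit Euler step

u = np.real(np.fft.ifft2(u_hat))

# Free-energy density f = <-r u^2/2 + ((1 + laplacian)u)^2/2 + u^4/4> over the domain
Lu = np.real(np.fft.ifft2((1.0 - k2) * np.fft.fft2(u)))
f = np.mean(-0.5 * r * u**2 + 0.5 * Lu**2 + 0.25 * u**4)
print(f"average free-energy density after relaxation: {f:.6f}")
```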

Keywords: fundamental theory, mathematical physics, continuum models, analytical description

Procedia PDF Downloads 121
6916 Numerical Investigation of Embankments for Protecting Rock Fall

Authors: Gökhan Altay, Cafer Kayadelen

Abstract:

Rock fall is the movement of large rock blocks down dip slopes due to physical effects. It generally occurs where loose tuffs lie under a basalt flow, or where a stringcourse is formed by limestone layers resting on clay. Through corrosion of some parts, large cracks form in the layers and continue to grow under the effect of freezing and thawing; in this way, the detached rocks fall from these dip slopes. Earthquakes, which can induce many rock movements, are another cause of rock fall events. Turkey, like many parts of the world, has a large number of earthquake-prone regions, which increases the likelihood of rock fall events. A great number of rock fall events take place in Turkey, as elsewhere in the world, every year. Rock fall events occurring in urban areas cause serious damage to houses, roads and workplaces; they also hinder transportation and can even kill people. In Turkey, rock fall events happen mostly in spring and winter because of the frequent freezing and thawing of water in rock cracks. In mountainous and inclined areas, rock fall poses a risk to engineering construction and the environment, and some countries invest significant money in these risky areas. For instance, in Switzerland, approximately 6.7 million dollars is spent annually on systems to prevent rock fall events along a distance of 4 km. Turkey has many urban areas and engineering structures exposed to rock fall risk. Embankments are preferable for rock fall protection because of their low maintenance and repair costs; they are also able to absorb much more energy compared with other protection systems. The current design method for embankments depends only on field test results, so studies of this design method are inadequate. In this paper, a field test is modeled in three dimensions and analyses are carried out with the ANSYS program. The numerical model is validated with the help of a field test from the literature, and after validation, additional parametric studies are performed. Changes in the deformation of the embankments are investigated for changes in the geometry, velocity and impact height of the falling rocks.

Keywords: ANSYS, embankment, impact height, numerical analysis, rock fall

Procedia PDF Downloads 504
6915 Numerical Investigation of the Jacketing Method of Reinforced Concrete Column

Authors: S. Boukais, A. Nekmouche, N. Khelil, A. Kezmane

Abstract:

The first aim of this study is to develop a finite element model that can correctly predict the behavior of the reinforced concrete column. The second aim is to use the finite element model to investigate and evaluate the effect of the strengthening method by jacketing of the reinforced concrete column, considering different interface contacts between the old and the new concrete. Four models were evaluated: one assuming perfect contact and the other three using friction coefficients of 0.1, 0.3 and 0.5. The simulation was carried out using the Abaqus software. The obtained results show that the jacketing reinforcement led to a significant increase in the global performance of the simulated reinforced concrete column.

Keywords: strengthening, jacketing, reinforced concrete column, Abaqus, simulation

Procedia PDF Downloads 135
6914 Recurrence of Papillary Thyroid Cancer with an Interval of 40 Years. Report of an Autopsy Case

Authors: Satoshi Furukawa, Satomu Morita, Katsuji Nishi, Masahito Hitosugi

Abstract:

A 75-year-old woman had undergone thyroidectomy forty years previously. At autopsy, enlarged masses were seen just above and below the left clavicle. The diagnosis of papillary thyroid cancer (PTC) with lung metastasis was confirmed by histological examination. The prognosis of PTC is excellent; the 10-year survival rate ranges between 85 and 99%. Lung metastases may be found in 10% of patients with PTC. We report an unusual case of recurrence of PTC with metastasis to the lung.

Keywords: papillary thyroid cancer, lung metastasis, autopsy, histopathological findings

Procedia PDF Downloads 329
6913 Seismic Hazard Assessment of Offshore Platforms

Authors: F. D. Konstandakopoulou, G. A. Papagiannopoulos, N. G. Pnevmatikos, G. D. Hatzigeorgiou

Abstract:

This paper examines the effects of pile-soil-structure interaction on the dynamic response of offshore platforms under the action of near-fault earthquakes. Two offshore platform models are investigated: one with completely fixed supports and one with piles clamped into deformable layered soil. The soil deformability in the second model is simulated using non-linear springs. These platform models are subjected to near-fault seismic ground motions. The role of the fault mechanism in the platforms’ response is additionally investigated, while the study also examines the effects of different angles of incidence of the seismic records on the maximum response of each platform.

Keywords: hazard analysis, offshore platforms, earthquakes, safety

Procedia PDF Downloads 135
6912 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations

Authors: Ramon Santana

Abstract:

The use of biometric identifiers in the field of information security, access control to resources, and authentication in ATMs and banking, among others, is of great concern because of the safety of biometric data. Eight vulnerabilities have been detected in the general architecture of a biometric system, six of which allow the minutiae template to be obtained in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models to protect minutiae templates have been proposed; however, vulnerabilities in the cryptographic security of these models still allow biometric data to be obtained in plain text. In order to increase the cryptographic security and the ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and facilitate the reversibility of the data using two levels of security. The first level is the data transformation level, which generates data invariant to rotation and translation; this transformation is furthermore irreversible. The second level is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The model is aimed at mitigating the known vulnerabilities of the proposed models, basing its security on the impossibility of polynomial reconstruction.

Keywords: fingerprint, template protection, bio-cryptography, minutiae protection

Procedia PDF Downloads 162
6911 Comparative Study between Mesenchymal Stem Cells and Regulatory T-Cells in Macrophage Polarization for Organ Transplant Tolerance: In Vitro Study

Authors: Vijaya Madhuri Devraj, Swarnalatha Guditi, Kiran Kumar Bokara, Gangadhar Taduri

Abstract:

Cell-based strategies may open therapeutic approaches that promote tolerance through the manipulation of macrophages, increasing long-term transplant survival rates and minimizing the side effects of current immunosuppressive regimens. The aim of the present study was therefore to test and compare the therapeutic potential of mesenchymal stem cells (MSCs) and regulatory T cells (Tregs) for macrophage polarization, in order to develop an alternative cell-based treatment option in kidney transplantation. In the current protocol, macrophages from kidney transplant recipients with graft dysfunction were co-cultured with MSCs and Treg cells, with and without cell-cell contact, on transwell plates. To quantitatively assess macrophage polarization in response to MSC and Treg treatment over time, M1 and M2 cell surface markers were used. Additionally, multiple soluble analytes were analyzed in the cell supernatant using bead-based immunoassays, and gene expression analysis was performed to confirm the findings. MSCs induced the formation of M2 macrophages more strongly than Tregs when M0 macrophages were cultured in transwells without cell contact; from this, we deduced that soluble factors present in the MSC-conditioned media are involved in skewing macrophages towards the M2 type. Similarly, in co-culture with cell-cell contact, MSCs produced more M2-type macrophages than Tregs. An important finding of this study is that the MSC-Treg combination showed significantly effective and consistent results in both the contact and non-contact set-ups. Hence, MSCs are preferable to Tregs for secretome-based therapy, and a combination of both is suggested for either form of therapy, to achieve effective transplantation outcomes. Our findings underline a key role of Tregs and MSCs in promoting macrophage polarization towards the anti-inflammatory type. The study is of great importance for prolonging allograft and patient survival without rejection through cell-based therapy that induces self-tolerance while controlling infection.

Keywords: graft rejection, graft tolerance, macrophage polarization, mesenchymal stem cells, regulatory T cells, transplant immunology

Procedia PDF Downloads 106
6910 Segregation Patterns of Trees and Grass Based on a Modified Age-Structured Continuous-Space Forest Model

Authors: Jian Yang, Atsushi Yagi

Abstract:

The tree-grass coexistence system is of great importance for forest ecology. Mathematical models have been proposed to study the dynamics of tree-grass coexistence and the stability of these systems. However, few of the models concentrate on the spatial dynamics of tree-grass coexistence. In this study, we modified an age-structured continuous-space population model for forests, obtaining an age-structured continuous-space population model for tree-grass competition. In the model, for thermal competitions, adult trees can out-compete grass, and grass can out-compete seedlings. We studied the model mathematically to verify that tree-grass coexistence solutions exist. Numerical experiments demonstrated that the fraction of area that trees or grass occupy can affect whether the coexistence is stable or not. We also varied the mortality of adult trees while the other parameters and the fractions of area occupied by trees and grass were fixed; the results show that the mortality of adult trees is also a factor affecting the stability of tree-grass coexistence in this model.

Keywords: population-structured models, stabilities of ecosystems, thermal competitions, tree-grass coexistence systems

Procedia PDF Downloads 143
6909 Comparison of Applicability of Time Series Forecasting Models VAR, ARCH and ARMA in Management Science: Study Based on Empirical Analysis of Time Series Techniques

Authors: Muhammad Tariq, Hammad Tahir, Fawwad Mahmood Butt

Abstract:

Purpose: This study attempts to identify the best forecasting methodologies among time series models. The time series forecasting models VAR, ARCH and ARMA are considered for the analysis. Methodology: Benchmarks, or parameters, such as adjusted R-squared, F-statistics, the Durbin-Watson statistic, and the location of the roots have been critically and empirically analyzed. The empirical analysis uses time series data on the Consumer Price Index and closing stock prices. Findings: The results show that the VAR model performed better than the other models; both the reliability and the significance of the VAR model are highly appreciable. In contrast, the ARCH model showed very poor forecasting results. The results of the ARMA model, however, appeared ambiguous: the AR roots indicated that the model is stationary, whereas the MA roots indicated that the model is invertible. Therefore, forecasting would remain doubtful if made on the basis of the ARMA model. It is concluded that the VAR model provides the best forecasting results. Practical Implications: This paper provides empirical evidence for the application of time series forecasting models and therefore provides a basis for selecting the best time series forecasting model.
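A short sketch of fitting the three model classes compared in the study, using statsmodels and the arch package on synthetic stand-ins for the CPI and closing-price series; the data, lag orders and model orders are placeholders, not those of the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(5)
n = 300

# Synthetic stand-ins for the Consumer Price Index and a closing stock price
cpi = 100 + np.cumsum(rng.normal(0.2, 0.5, n))
price = 50 + np.cumsum(rng.normal(0.1, 1.0, n))
data = pd.DataFrame({"cpi": cpi, "price": price})

# VAR(2) on the differenced (stationary) series, with a 5-step-ahead forecast
diff = data.diff().dropna()
var_res = VAR(diff).fit(2)
print(var_res.forecast(diff.values[-2:], steps=5))

# ARMA(1,1) on the differenced price series, i.e. ARIMA(1,1,1) on the level
arma_res = ARIMA(data["price"], order=(1, 1, 1)).fit()
print("ARIMA(1,1,1) AIC:", arma_res.aic)

# ARCH(1) on the price returns, for conditional-volatility forecasting
returns = 100 * data["price"].pct_change().dropna()
arch_res = arch_model(returns, vol="ARCH", p=1).fit(disp="off")
print(arch_res.forecast(horizon=5).variance.iloc[-1])
```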

Keywords: forecasting, time series, auto regression, ARCH, ARMA

Procedia PDF Downloads 332
6908 Use of SUDOKU Design to Assess the Implications of the Block Size and Testing Order on Efficiency and Precision of Dulce De Leche Preference Estimation

Authors: Jéssica Ferreira Rodrigues, Júlio Silvio De Sousa Bueno Filho, Vanessa Rios De Souza, Ana Carla Marques Pinheiro

Abstract:

This study aimed to evaluate the implications of block size and testing order for the efficiency and precision of preference estimation for Dulce de leche samples. Efficiency was defined as the inverse of the average variance of pairwise comparisons among treatments. Precision was defined as the inverse of the variance of the estimates of treatment means (or effects). The experiment was originally designed to test 16 treatments as a series of 8 Sudoku 16x16 designs, 4 randomized independently and 4 others in the reverse order, to yield balance in testing order. Linear mixed models were fitted to the whole experiment, with 112 testers and all their grades, as well as to partially balanced subgroups, namely: a) the experiment with the four initial EU; b) the experiment with EU 5 to 8; c) the experiment with EU 9 to 12; and d) the experiment with EU 13 to 16. To record responses, we used a nine-point hedonic scale; a mixed linear model analysis was assumed, with random tester and treatment effects and a fixed test order effect. The analysis with a cumulative random-effects probit link model was very similar, with essentially no difference in the conclusions, so for simplicity we present the results under the Gaussian assumption. The R (CRAN) library lme4 and its function lmer (fit linear mixed-effects models) were used for the mixed models, and the libraries Bayesthresh (default Gaussian threshold function) and ordinal, with the function clmm (cumulative link mixed model), were used to check the Bayesian analysis of the threshold models and the cumulative link probit models. It was noted that the number of samples tested in the same session can influence the acceptance level, underestimating the acceptance; however, providing a large number of samples can help to improve sample discrimination.

Keywords: acceptance, block size, mixed linear model, testing order

Procedia PDF Downloads 312
6907 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes Under the Presence of Different External Factors

Authors: Sudhir Kumar Singh, Debashish Chakravarty

Abstract:

Slope stability analysis is an important aspect of geotechnical engineering. It is also important from a safety and economic point of view, as any slope failure leads to the loss of valuable lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely the degree of saturation, rainfall intensity, and seismic coefficients. A supervised machine learning approach has been utilized to make accurate and reliable predictions regarding the stability of slopes based on the value of the Factor of Safety. Numerous cases have been studied to analyze the stability of slopes using the popular Finite Element Method, and the data thus obtained have been used as training data for the supervised machine learning models. The input data have been trained on different supervised machine learning models, namely Random Forest, Decision Tree, Support Vector Machine, and XGBoost. Distinct test data not present in the training data have been used to measure the performance and accuracy of the different models. Although all models performed well on the test dataset, Random Forest stands out due to its high accuracy of greater than 95%, providing a valuable tool at our disposal that is neither computationally expensive nor time-consuming and is in good accordance with the numerical analysis results.

Keywords: finite element method, geotechnical engineering, machine learning, slope stability

Procedia PDF Downloads 92
6906 Predicting the Impact of Scope Changes on Project Cost and Schedule Using Machine Learning Techniques

Authors: Soheila Sadeghi

Abstract:

In the dynamic landscape of project management, scope changes are an inevitable reality that can significantly impact project performance. These changes, whether initiated by stakeholders, external factors, or internal project dynamics, can lead to cost overruns and schedule delays. Accurately predicting the consequences of these changes is crucial for effective project control and informed decision-making. This study aims to develop predictive models to estimate the impact of scope changes on project cost and schedule using machine learning techniques. The research utilizes a comprehensive dataset containing detailed information on project tasks, including the Work Breakdown Structure (WBS), task type, productivity rate, estimated cost, actual cost, duration, task dependencies, scope change magnitude, and scope change timing. Multiple machine learning models are developed and evaluated to predict the impact of scope changes on project cost and schedule. These models include Linear Regression, Decision Tree, Ridge Regression, Random Forest, Gradient Boosting, and XGBoost. The dataset is split into training and testing sets, and the models are trained using the preprocessed data. Cross-validation techniques are employed to assess the robustness and generalization ability of the models. The performance of the models is evaluated using metrics such as Mean Squared Error (MSE) and R-squared. Residual plots are generated to assess the goodness of fit and identify any patterns or outliers. Hyperparameter tuning is performed to optimize the XGBoost model and improve its predictive accuracy. The feature importance analysis reveals the relative significance of different project attributes in predicting the impact on cost and schedule. Key factors such as productivity rate, scope change magnitude, task dependencies, estimated cost, actual cost, duration, and specific WBS elements are identified as influential predictors. The study highlights the importance of considering both cost and schedule implications when managing scope changes. The developed predictive models provide project managers with a data-driven tool to proactively assess the potential impact of scope changes on project cost and schedule. By leveraging these insights, project managers can make informed decisions, optimize resource allocation, and develop effective mitigation strategies. The findings of this research contribute to improved project planning, risk management, and overall project success.

Keywords: cost impact, machine learning, predictive modeling, schedule impact, scope changes

Procedia PDF Downloads 23
6905 Churn Prediction for Savings Bank Customers: A Machine Learning Approach

Authors: Prashant Verma

Abstract:

Commercial banks are facing immense pressure, including financial disintermediation, interest rate volatility and digital modes of finance. Retaining an existing customer is 5 to 25 times less expensive than acquiring a new one. This paper explores customer churn prediction based on various statistical and machine learning models and uses under-sampling to improve the predictive power of these models. The results show that, of the various machine learning models, Random Forest, which predicts churn with 78% accuracy, was found to be the most powerful model for this scenario. Customer vintage, customer’s age, average balance, occupation code, population code, average withdrawal amount, and the average number of transactions were found to be the variables with high predictive power for the churn prediction model. The model can be deployed by commercial banks in order to reduce customer churn, so that they may retain the funds kept by savings bank (SB) customers. The article suggests a customized campaign to be initiated by commercial banks to avoid SB customer churn. Hence, by providing better customer satisfaction and experience, commercial banks can limit customer churn and maintain their deposits.
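A hedged sketch of the under-sampling plus Random Forest workflow described, run on a synthetic imbalanced dataset; the feature names (vintage, average balance, withdrawals, transactions) follow the abstract, but the data-generating process is invented.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(11)
n = 20_000

# Synthetic, heavily imbalanced SB-customer data (only a few percent churners)
df = pd.DataFrame({
    "vintage_months": rng.integers(1, 240, n),
    "age": rng.integers(18, 85, n),
    "avg_balance": rng.lognormal(8, 1, n),
    "avg_withdrawal": rng.lognormal(6, 1, n),
    "n_transactions": rng.poisson(12, n),
})
logit = -1.5 - 0.015 * df["vintage_months"] + 0.00002 * df["avg_withdrawal"]
df["churn"] = rng.random(n) < 1 / (1 + np.exp(-logit))

train, test = train_test_split(df, test_size=0.3, stratify=df["churn"], random_state=0)

# Random under-sampling of the majority class, applied to the training set only
churners = train[train["churn"]]
stayers = train[~train["churn"]].sample(n=len(churners), random_state=0)
balanced = pd.concat([churners, stayers])

features = [c for c in df.columns if c != "churn"]
clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(balanced[features], balanced["churn"])

print(classification_report(test["churn"], clf.predict(test[features])))
```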

Keywords: savings bank, customer churn, customer retention, random forests, machine learning, under-sampling

Procedia PDF Downloads 130
6904 Auto Rickshaw Impacts with Pedestrians: A Computational Analysis of Post-Collision Kinematics and Injury Mechanics

Authors: A. J. Al-Graitti, G. A. Khalid, P. Berthelson, A. Mason-Jones, R. Prabhu, M. D. Jones

Abstract:

Motor vehicle related pedestrian road traffic collisions are a major road safety challenge, since they are a leading cause of death and serious injury worldwide, contributing to a third of the global disease burden. The auto rickshaw, which is a common form of urban transport in many developing countries, plays a major transport role, both as a vehicle for hire and for private use. The most common auto rickshaws are quite unlike a ‘typical’ four-wheel motor vehicle, being typically characterised by three wheels, a non-tilting sheet-metal body or open frame construction, a canvas roof and side curtains, a small drivers’ cabin, handlebar controls and a passenger space at the rear. Given the propensity, in developing countries, for auto rickshaws to be used in mixed cityscapes, where pedestrians and vehicles share the roadway, the potential for auto rickshaw impacts with pedestrians is relatively high. Whilst auto rickshaws are used in some Western countries, their limited number and spatial separation from pedestrian walkways, as a result of city planning, have not resulted in significant accident statistics. Thus, auto rickshaws have not been subject to the vehicle-impact-related pedestrian crash kinematic analyses and injury mechanics assessments typically associated with motor vehicle development in Western Europe, North America and Japan. This study presents a parametric analysis of auto rickshaw related pedestrian impacts by computational simulation, using a finite element model of an auto rickshaw and an LS-DYNA 50th percentile male Hybrid III Anthropometric Test Device (dummy). Parametric variables include auto rickshaw impact velocity, auto rickshaw impact region (front, centre or offset) and relative pedestrian impact position (front, side and rear). The output data of each impact simulation were correlated against reported injury metrics — the Head Injury Criterion (front, side and rear), the Neck Injury Criterion (front, side and rear), and the Abbreviated Injury Scale with its reported risk level — adding greater understanding to the issue of auto rickshaw related pedestrian injury risk. The parametric analyses suggest that pedestrians are subject to a relatively high risk of injury during impacts with an auto rickshaw at velocities of 20 km/h or greater, which in some of the impact simulations may even risk fatalities. The present study provides valuable evidence for informing a series of recommendations and guidelines for making the auto rickshaw safer during collisions with pedestrians. Whilst it is acknowledged that the present research findings are based in the field of safety engineering and may over-represent injury risk compared to ‘real world’ accidents, many of the simulated interactions produced injury response values significantly greater than current threshold curves and thus justify their inclusion in the study. To reduce the injury risk level and increase the safety of the auto rickshaw, the velocity of the auto rickshaw should be reduced and/or engineering solutions should be considered, such as retrofitting injury mitigation technologies to those auto rickshaw contact regions which present the greatest risk of producing pedestrian injury.
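The Head Injury Criterion referenced above has a standard definition — the maximum over time windows (here ≤ 15 ms) of (t₂−t₁)·[mean acceleration in g over the window]^2.5 — and the sketch below computes it for a synthetic half-sine head acceleration pulse, not for the LS-DYNA output of the study.

```python
import numpy as np

def hic(time_s, accel_g, max_window_s=0.015):
    """Head Injury Criterion: max over windows (t2 - t1) <= max_window_s of
       (t2 - t1) * [mean acceleration over the window]**2.5, accel in g, time in s."""
    # Cumulative trapezoidal integral of acceleration, for fast window averages
    cum = np.concatenate([[0.0], np.cumsum(0.5 * (accel_g[1:] + accel_g[:-1])
                                           * np.diff(time_s))])
    best = 0.0
    n = len(time_s)
    for i in range(n - 1):
        for j in range(i + 1, n):
            window = time_s[j] - time_s[i]
            if window > max_window_s:
                break
            avg = (cum[j] - cum[i]) / window
            best = max(best, window * avg ** 2.5)
    return best

# Synthetic half-sine head acceleration pulse: 80 g peak, 10 ms duration
t = np.linspace(0.0, 0.05, 501)                        # 50 ms sampled at 10 kHz
a = np.where(t < 0.010, 80.0 * np.sin(np.pi * t / 0.010), 0.0)

print(f"HIC15 = {hic(t, a):.0f}")   # roughly 240 for this pulse; 1000 is a common threshold
```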

Keywords: auto rickshaw, finite element analysis, injury risk level, LS-DYNA, pedestrian impact

Procedia PDF Downloads 185
6903 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction

Authors: Reza Safdari, Goli Arji, Robab Abdolkhani, Maryam Zahmatkeshan

Abstract:

Background and purpose: Cardiovascular diseases are among the most common diseases in all societies. The most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly, and medical data mining has a great potential for transforming these data into information. Using data mining techniques to generate predictive models that identify those at risk, in order to reduce the effects of the disease, is very helpful. The present study aimed to collect data related to risk factors of myocardial infarction from patients’ medical records and to develop prediction models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. The data were related to patients admitted to the Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011. Data were collected using a four-section data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one algorithm-based model for predicting association rules were applied to the data. Accuracy, precision, sensitivity, specificity, as well as positive and negative predictive values were determined, and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and the A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential in facilitating the management of a patient with a specific disease. Therefore, health interventions or lifestyle changes can be conducted based on these models to improve the health conditions of the individuals at risk.
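As a brief reminder of how the reported measures (accuracy, precision, sensitivity, specificity, positive and negative predictive values) follow from a 2×2 confusion matrix, a small sketch is given below; the counts are hypothetical.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Classification metrics for a binary (disease / no disease) confusion matrix."""
    return {
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
        "precision":   tp / (tp + fp),          # same as PPV
        "sensitivity": tp / (tp + fn),          # recall, true positive rate
        "specificity": tn / (tn + fp),          # true negative rate
        "PPV":         tp / (tp + fp),          # positive predictive value
        "NPV":         tn / (tn + fn),          # negative predictive value
    }

# Hypothetical results of a myocardial-infarction classifier on 350 records
metrics = diagnostic_metrics(tp=110, fp=20, fn=25, tn=195)
for name, value in metrics.items():
    print(f"{name:>11}: {value:.3f}")
```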

Keywords: decision trees, neural network, myocardial infarction, data mining

Procedia PDF Downloads 417
6902 Signs-Only Compressed Row Storage Format for Exact Diagonalization Study of Quantum Fermionic Models

Authors: Michael Danilov, Sergei Iskakov, Vladimir Mazurenko

Abstract:

The present paper describes a high-performance parallel realization of an exact diagonalization solver for quantum-electron models on a shared-memory computing system. The proposed algorithm contains a storage format for efficiently computing the eigenvalues and eigenvectors of a quantum-electron Hamiltonian matrix. The results of test calculations carried out for a 15-site Hubbard model demonstrate a reduction in the required memory and good multiprocessor scalability, while maintaining performance of the same order as compressed row storage.
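For context, the sketch below builds a plain compressed-row-storage representation and a matrix-vector product in Python, storing only the signs of the non-zeros on the assumption (as for a Hubbard hopping matrix) that they share a common magnitude t; the small matrix is illustrative, and the actual format and low-level implementation of the paper may differ.

```python
import numpy as np

# Dense example matrix whose off-diagonal non-zeros are all +t or -t,
# as in a Hubbard-model hopping Hamiltonian (t is the hopping amplitude).
t = 1.0
A = np.array([[ 0.0,  t,   0.0, -t ],
              [  t,   0.0, -t,   0.0],
              [ 0.0, -t,   0.0,  t ],
              [ -t,   0.0,  t,   0.0]])

# Build a signs-only CSR representation: row pointers, column indices,
# and a sign array (+1/-1) instead of full floating-point values.
row_ptr, col_idx, signs = [0], [], []
for row in A:
    nz = np.nonzero(row)[0]
    col_idx.extend(nz)
    signs.extend(np.sign(row[nz]).astype(np.int8))
    row_ptr.append(len(col_idx))

def matvec(x):
    """y = A @ x using the signs-only CSR arrays (each value recovered as sign * t)."""
    y = np.zeros_like(x, dtype=float)
    for i in range(len(row_ptr) - 1):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += signs[k] * t * x[col_idx[k]]
    return y

x = np.arange(1.0, 5.0)
print(matvec(x), "vs dense:", A @ x)
```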

Keywords: sparse matrix, compressed format, Hubbard model, Anderson model

Procedia PDF Downloads 386